Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
IGJ risk signalling system (IRiS dashboards)
- Publication category: Impactful algorithms
- Impact assessment: No
- Status: In use
General information
Theme
Begin date
Contact information
Responsible use
Goal and impact
Purpose
The IGJ needs to make targeted choices in supervising organisations. The algorithm helps weigh data from different sources more objectively and systematically when making these choices. This makes supervision more effective and efficient.
Impact
These choices include, for example, which supervisory activity to carry out, which topics to put on the agenda during a supervisory activity, and at which organisation the activity takes place.
Considerations
The large number of organisations we monitor, and the quantity and diversity of the data, require an algorithm to spot salient features in the data more quickly.
Human intervention
The algorithm supports the inspector in factoring the data into choices, such as which supervisory activity to carry out, which topics to put on the agenda during a supervisory activity, or at which organisation the activity will take place. The final decision on these choices always lies with the inspector.
Risk management
The algorithm's usability is regularly discussed with inspectors, and the algorithm is adjusted where necessary.
Legal basis
The IGJ oversees the various healthcare sectors.
Links to legal bases
Elaboration on impact assessments
Because this is an algorithm with a potential privacy risk to the rights and freedoms of data subjects, a DPIA was carried out.
Impact assessment
Operations
Data
Miscellaneous, mainly data from reports, previous inspections, patient experiences, care content indicators and business information
Technical design
A computational rule is applied to variables to flag outliers with a signal. The signal depends, among other things, on the sector, the data used, and the variables in the data.
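The register does not disclose the actual computational rule. As a purely illustrative sketch, a rule of this kind could flag an organisation when one of its indicator values deviates strongly from the values of comparable organisations — for instance with a z-score threshold. The function name, threshold, and example data below are assumptions, not the IRiS implementation.

```python
# Hypothetical sketch of an outlier-signalling rule: flag values whose
# z-score against the group exceeds a threshold. The real IRiS rule,
# its variables, and its thresholds are not published.
from statistics import mean, stdev

def flag_outliers(values, threshold=2.0):
    """Return indices of values whose absolute z-score exceeds threshold."""
    if len(values) < 2:
        return []  # not enough data to compute a spread
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # all values identical: nothing stands out
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > threshold]

# Example: one (fictional) indicator across organisations in a sector.
scores = [12, 11, 13, 12, 30, 12, 11]
print(flag_outliers(scores))  # → [4]: the value 30 is flagged
```

In practice the sector dependence mentioned above would mean comparing each organisation only against peers in the same sector, with sector-specific variables and thresholds.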
External provider
Similar algorithm descriptions
- This algorithm from Stichting Inlichtingenbureau (IB) makes visible whether a person who is in a penitentiary institution is receiving social assistance benefits at the same time. If so, a signal is sent to the municipality.
  Last change on 5th of August 2025, at 9:52 (CET) | Publication Standard 1.0
  - Publication category: Impactful algorithms
  - Impact assessment: DPIA
  - Status: In use
- The model helps detect and analyse irregularities following the allocation of a Wmo/Jeugdwet provision. The model signals whether further investigation is needed into the spending of funds.
  Last change on 5th of July 2024, at 9:31 (CET) | Publication Standard 1.0
  - Publication category: Impactful algorithms
  - Impact assessment: IAMA
  - Status: In use
- The algorithm in the software recognises and anonymises personal data and other sensitive information in documents. Governments regularly publish information related to the drafting and implementation of their policies (e.g. based on the Woo). This tool is used to render sensitive data unrecognisable in the process.
  Last change on 9th of January 2025, at 9:23 (CET) | Publication Standard 1.0
  - Publication category: Other algorithms
  - Impact assessment: DPIA
  - Status: In use
- This algorithm helps Customs to select clients for controls based on risk. Among other things, it uses declaration data from companies and assesses whether or not there are risks in bringing fireworks consignments into the European Union.
  Last change on 2nd of April 2025, at 12:44 (CET) | Publication Standard 1.0
  - Publication category: Impactful algorithms
  - Impact assessment: Field not filled in.
  - Status: In use
- The algorithm combines signals with data to assess whether further research is needed.
  Last change on 18th of December 2025, at 14:57 (CET) | Publication Standard 1.0
  - Publication category: Impactful algorithms
  - Impact assessment: GEB, EIA
  - Status: In use