Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Performance monitor

This algorithm produces a risk-based ranking of educational institutions (schools, study programmes or school boards) so that targeted desk research can be conducted as part of the annual performance and risk analysis.

Last change on 22nd of May 2024, at 8:41 (CET) | Publication Standard 1.0
Publication category
Impactful algorithms
Impact assessment
Field not filled in.
Status
In use

General information

Theme

Education and Science

Begin date

07-2019

Contact information

https://contactformulier.onderwijsinspectie.nl/contact

Link to publication website

https://www.onderwijsinspectie.nl/onderwerpen/werkwijze-van-de-inspectie/jaarlijkse-prestatieanalyse

Responsible use

Goal and impact

The Education Inspectorate supervises the quality of education (the 'safeguarding function'). In this safeguarding role, the inspectorate ensures compliance with the education acts and the regulations based on them: the things that boards and schools/programmes are required to do.

As part of this safeguarding function, every year we map out which educational institutions show risks. Because the Netherlands has a great many schools, study programmes and boards, we do this check partly automatically, using an algorithm that calculates a risk score for each educational institution.

Based on the risk ranking produced by the algorithm, manual desk analyses are performed. In a desk analysis, analysts and/or inspectors look at the data used by the algorithm, but also at other sources, such as a school guide or specific reports (for example from a concerned parent who contacted the inspectorate).

If, based on the manual analysis, inspectors think the risks are high and serious, the school or board will be contacted. Inspectors may then also conduct an on-site investigation at the school, programme, or board.

Considerations

The advantage of using an algorithm that calculates risk is that not all institutions need to be examined manually every year. Manual analysis costs a lot of time and money. By examining institutions with a high risk score manually, analysts and inspectors can be deployed where they are needed most. A disadvantage could be that the algorithm only flags risks in areas for which the inspectorate has data.

To prevent the inspectorate from relying on data alone, analysts and inspectors always have the option of performing additional desk analyses, for example for educational institutions they have concerns about, even if those institutions do not have a high risk score.

Human intervention

The outcome of the algorithm is a risk score. Educational institutions with a high risk score receive a manual desk analysis. In addition, inspectors or analysts can also conduct manual desk analyses for institutions they have concerns about, even if those institutions have a low risk score.
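
The selection rule described above amounts to: a desk analysis follows when the risk score is high or when an inspector or analyst flags a concern, regardless of the score. A minimal sketch, with an assumed threshold and invented field names (not the inspectorate's actual implementation):

```python
# Sketch of the selection rule; the threshold and fields are assumptions.
from dataclasses import dataclass

HIGH_RISK_THRESHOLD = 0.6  # assumed value for illustration


@dataclass
class Institution:
    name: str
    risk_score: float
    flagged_by_inspector: bool  # manual concern, independent of the score


def needs_desk_analysis(inst: Institution) -> bool:
    """A manual desk analysis follows if the risk score is high
    or an inspector/analyst has flagged the institution."""
    return inst.risk_score >= HIGH_RISK_THRESHOLD or inst.flagged_by_inspector


# A low-scoring institution can still be selected via a manual flag.
print(needs_desk_analysis(Institution("School X", 0.2, flagged_by_inspector=True)))  # True
```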

If the desk analysis shows that the risks are high and serious, the institution is contacted and other supervisory activities may follow. Sometimes this means an on-site inspection visit. The desk analysis and any on-site investigation are carried out entirely by people.

Risk management

Perfect risk estimation is impossible. To keep the risk estimation as good as possible, the quality of the Performance Monitor's risk estimates is assessed annually.

  1. To this end, we examine whether the schools, programmes and institutions with high risk scores are indeed more likely, on average, to receive an unsatisfactory assessment in the following year (a minimal sketch of such a check follows this list). If the predictive power turns out to be low, we try to make adjustments that improve it. If the risk estimation has insufficient predictive power, certain institutions may be examined unnecessarily: the time of analysts and inspectors is then not used where it is needed most (inefficient), and institutions that do have problems may be overlooked.
  2. More specifically: if certain groups of schools are over-represented in the high-risk groups (out of proportion to their actual risks), those institutions may be burdened unnecessarily. This, too, means inefficient use of capacity and an unfair burden on certain schools or boards.
  3. Because institutions with higher risk scores are investigated more often, a wrong risk estimation can also persist (tunnel vision). Therefore, since September 2023, we have also carried out on-site examinations at randomly selected educational institutions. This allows us to evaluate the quality of the risk estimation even better, and random selection helps prevent tunnel vision.
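
The annual check of predictive power in point 1 comes down to comparing how often high-risk and other institutions received an unsatisfactory assessment in the following year. A minimal sketch of such a comparison; the column names, threshold and data are invented for illustration and are not the inspectorate's actual evaluation code:

```python
# Sketch of the predictive-power check; all names and values are assumed.
import pandas as pd


def evaluate_predictive_power(df: pd.DataFrame, high_risk_threshold: float) -> pd.DataFrame:
    """Compare how often high-risk and other institutions received an
    unsatisfactory assessment in the following year."""
    df = df.assign(high_risk=df["risk_score"] >= high_risk_threshold)
    return (
        df.groupby("high_risk")["unsatisfactory_next_year"]
        .agg(institutions="count", unsatisfactory_rate="mean")
        .reset_index()
    )


# Made-up data: two institutions above the threshold, one of which
# was assessed as unsatisfactory the following year.
data = pd.DataFrame({
    "institution_id": ["A", "B", "C", "D"],
    "risk_score": [0.9, 0.7, 0.2, 0.1],
    "unsatisfactory_next_year": [True, False, False, False],
})
print(evaluate_predictive_power(data, high_risk_threshold=0.6))
```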

Legal basis

Education Supervision Act (WOT), Public Access Act (WOB), Secondary Education Act (WVO), Primary Education Act (WPO), Expertise Centres Act (WEC), Education and Vocational Education Act (WEB), Higher Education and Scientific Research Act (WHW) and General Administrative Law Act (AWB)

Links to legal bases

  • WOT: https://wetten.overheid.nl/BWBR0013800/2023-08-01
  • WOB: https://wetten.overheid.nl/BWBR0005252/2018-07-28
  • WVO: https://wetten.overheid.nl/BWBR0044212/2024-01-01
  • WPO: https://wetten.overheid.nl/BWBR0003420/2024-01-01
  • WEC: https://wetten.overheid.nl/BWBR0003549/2023-08-01
  • WEB: https://wetten.overheid.nl/BWBR0007625/2023-08-01
  • WHW: https://wetten.overheid.nl/BWBR0005682/2024-01-01
  • AWB: https://wetten.overheid.nl/BWBR0005537/2024-05-01

Elaboration on impact assessments

An IAMA (Impact Assessment for Human Rights and Algorithms) was carried out for this algorithm.

The pre-DPIA scan shows that a DPIA is not necessary, as the files containing person-level data contain only pseudonymised data, so the information cannot be traced back to an individual. The inspectorate does not have a key to reverse the pseudonymisation. In addition, these data are used in the algorithm only in a form aggregated to the level of school, board or institution.

Operations

Data

The main sources of information are pupil numbers, staff characteristics, financial characteristics of the institutions, learning outcomes and signals (from parents and pupils who contact us).

Information on institutions comes from files from the Register of Educational Participants (ROD, formerly BRON), files from Statistics Netherlands (CBS), files from DUO, data supplied by schools (e.g. the Social Safety Monitor data) and data from the Inspectorate of Education's own registers.

Data are always aggregated to school, board or institution level.
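
As an illustration of what this aggregation means in practice, the sketch below turns pseudonymised pupil-level records into school-level figures before any further use. The column names and indicators are assumptions for illustration, not the actual source files:

```python
# Illustrative sketch only: pseudonymised pupil-level records are used
# solely in aggregated form. Column names and indicators are assumed.
import pandas as pd

pupils = pd.DataFrame({
    "pupil_pseudo_id": ["p1", "p2", "p3", "p4"],  # pseudonymised, not traceable
    "school_id": ["S1", "S1", "S2", "S2"],
    "final_score": [7.1, 6.4, 5.2, 5.8],
})

# Aggregate to school level before any use in the risk model.
school_level = (
    pupils.groupby("school_id")
    .agg(pupil_count=("pupil_pseudo_id", "count"),
         mean_final_score=("final_score", "mean"))
    .reset_index()
)
print(school_level)
```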

Links to data sources

Data on pupil numbers, staff characteristics and financial characteristics have largely been made publicly available by DUO: https://duo.nl/open_onderwijsdata/

Technical design

The Performance Monitor is a rule-based algorithm. The rules are established with the help of inspectors and analysts and on the basis of quantitative research. The various data are aggregated to school/board/institution level, and from these aggregated data the various indicators are calculated.

A risk score is then calculated from the combination of indicator scores over one to three years, the thresholds for what is considered risky, the weights per year and the weights per indicator.

The output of the model is a risk score per school/board/institution.
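
To make the description above concrete, the sketch below shows one way a rule-based score could combine indicator values over several years with thresholds, year weights and indicator weights. All indicators, thresholds and weights are invented for illustration and are not the inspectorate's actual rules:

```python
# Illustrative sketch of a rule-based risk score as described above.
# Indicators, thresholds, year weights and indicator weights are assumed.
from typing import Dict

# Thresholds below which an indicator value counts as a risk (assumed).
THRESHOLDS: Dict[str, float] = {"learning_outcomes": 5.5, "financial_buffer": 0.05}
# Weight per indicator and per year; the most recent year counts heaviest (assumed).
INDICATOR_WEIGHTS: Dict[str, float] = {"learning_outcomes": 0.7, "financial_buffer": 0.3}
YEAR_WEIGHTS: Dict[int, float] = {2023: 0.5, 2022: 0.3, 2021: 0.2}


def risk_score(indicator_values: Dict[int, Dict[str, float]]) -> float:
    """Combine one to three years of indicator values into a single risk score.

    `indicator_values` maps year -> indicator name -> value aggregated
    to school/board/institution level.
    """
    score = 0.0
    for year, values in indicator_values.items():
        for name, value in values.items():
            if value < THRESHOLDS[name]:  # indicator signals a risk this year
                score += YEAR_WEIGHTS.get(year, 0.0) * INDICATOR_WEIGHTS[name]
    return score


# Example: a school with weak learning outcomes in the two most recent years.
example = {
    2023: {"learning_outcomes": 5.1, "financial_buffer": 0.10},
    2022: {"learning_outcomes": 5.3, "financial_buffer": 0.08},
    2021: {"learning_outcomes": 6.0, "financial_buffer": 0.09},
}
print(risk_score(example))  # higher score = higher priority for desk analysis
```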

External provider

Internally developed

Similar algorithm descriptions

  • This algorithm helps Customs to select goods for inspection based on risk. It uses declaration data from companies and considers whether or not there are risks of inaccuracies in the declarations.

    Last change on 28th of February 2024, at 12:29 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • The algorithm is used to make an automated risk assessment for all applications for the fixed costs allowance (Tegemoetkoming Vaste Lasten), prior to automated or manual granting and payment of the advance.

    Last change on 30th of May 2024, at 12:57 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use
  • The risk model (the algorithm) helps choose which NOW applications to investigate further. It gives clues as to whether the information given in the NOW application is correct. With these indications from the risk model, SZW reviews the application.

    Last change on 10th of June 2024, at 13:42 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • The risk prioritisation scores companies based on compliance behaviour, environmental variables and the most recent inspections. The higher a company is on the list, the sooner it is eligible for inspections, which can have an impact on potential incidents affecting people and the environment.

    Last change on 3rd of April 2024, at 13:50 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • This algorithm helps Customs to select goods for control based on risk. It uses declaration data from companies and considers whether or not there are increased risks of importing and exporting chemicals intended for use as drug precursors.

    Last change on 17th of December 2024, at 10:55 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use