Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Re-examining benefit recipients

Residents who do not have sufficient income or assets of their own usually receive benefits. The municipality of Rotterdam investigates whether the benefits that residents receive are lawful and still match their current situation. The algorithm that supported the selection for these re-examinations has not been used since the beginning of 2022.

Last change on 18th of July 2024, at 12:37 (CET) | Publication Standard 1.0
Publication category
High-Risk AI-system
Impact assessment
Field not filled in.
Status
Out of use

General information

Theme

Social Security

Begin date

01-2017

End date

02-2022

Contact information

14 010

Link to publication website

https://algoritmeregister.rotterdam.nl/p/Onzealgoritmes/53761720551735604

Link to source registration

https://algoritmeregister.rotterdam.nl/p/OnzealgoritmesMobiel/53761720551735604

Responsible use

Goal and impact

The model predicted the likelihood of lawfulness or unlawfulness by looking at data held in the municipal administration about citizens whose benefits had been investigated in previous years. The conclusion of such an investigation could be that the benefit was paid out lawfully, or that the previously paid benefit did not match the actual situation (unlawfulness). Based on this historical data, a prediction can be made about the likelihood of lawfulness or unlawfulness for current benefit recipients. This prediction is therefore based on the data about current benefit recipients held in our administration.

The outcome of the model is a 'risk assessment number' between 0 (high likelihood of lawfulness) and 1 (high likelihood of unlawfulness). The benefit recipients with the highest risk assessment numbers are invited for an investigation interview, unless, for example, an investigation into lawfulness has already taken place recently. How many benefit recipients are invited on the basis of the risk assessment model is determined annually. The model was redeveloped every year, because new investigation data becomes available every year; the findings from recent investigations into lawfulness were therefore always included in the development of a new model.

The municipality of Rotterdam has not used this algorithm since the beginning of 2022. Our exploration of an improved 'risk model' has shown that it is currently not possible to develop an algorithm that fits within our policy.
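To make the selection step described above concrete, a minimal sketch in Python follows. The actual model family, feature set, and invitation rules are not specified in this description; the logistic regression, the column names, and the "recently_investigated" exclusion below are purely illustrative assumptions.

    # Illustrative sketch only: the real model family, features and selection
    # rules are not documented in this register entry. The classifier, column
    # names and invitation count are assumptions for demonstration purposes.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    def train_risk_model(historical: pd.DataFrame,
                         feature_cols: list[str]) -> LogisticRegression:
        """Fit a model on past investigations (label 1 = unlawful, 0 = lawful)."""
        model = LogisticRegression(max_iter=1000)
        model.fit(historical[feature_cols], historical["unlawful"])
        return model

    def select_for_interview(model: LogisticRegression, current: pd.DataFrame,
                             feature_cols: list[str], n_invites: int) -> pd.DataFrame:
        """Score current recipients between 0 (likely lawful) and 1 (likely
        unlawful) and invite the highest-scoring ones, skipping cases that
        were investigated recently."""
        scored = current.copy()
        scored["risk_score"] = model.predict_proba(scored[feature_cols])[:, 1]
        eligible = scored[~scored["recently_investigated"]]
        return eligible.nlargest(n_invites, "risk_score")

In this sketch, redeveloping the model each year would simply mean calling train_risk_model again on the most recent investigation outcomes, mirroring the annual redevelopment described above.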

Considerations

The model did not make a statement about lawfulness or unlawfulness: the investigation required for that conclusion was always carried out by an employee. That investigation is always carried out in the same way, whether the benefit recipient was invited on the basis of the risk assessment model or on the basis of one of the other selection methods. The algorithm does not process information that can directly lead to discrimination, such as nationality, age or health status. To prevent the risk of discrimination or bias, the municipality investigated whether the model uses information that can be related to nationality or origin, such as language skills. That investigation showed that this risk cannot currently be ruled out, and the development of an improved model has therefore been stopped.

Human intervention

The risk assessment model does not determine whether someone is receiving benefits lawfully or unlawfully. The model is only part of determining which benefit recipients are invited for an investigation interview. An expert income consultant from the Work & Income department of the municipality of Rotterdam carries out the investigation. After carrying it out, the income consultant determines whether the job seeker's benefits have been provided lawfully or unlawfully and whether they still match the benefit recipient's current situation.

Risk management

To determine the risk of an algorithm, the municipality of Rotterdam uses a fixed classification model. The risk assessment model has been designated as a high-risk algorithm, so the municipality takes additional safeguards. These include measures that ensure that the development and use of the model are properly supervised: specific attention is paid to ethical risks, and an external supervisory committee monitors the design and use of the model. This committee is expected to be in place at the beginning of 2022. If an algorithm falls in the high-risk category, a human check is part of the work process in which that algorithm is used. The risk of bias is limited because the risk assessment model is only used to determine who receives an invitation for an investigation interview. The algorithm does not determine whether someone is receiving benefits lawfully or unlawfully; that is done by income consultants from the W&I department, who conduct an extensive investigation into the situation together with the benefit recipient. After all, providing benefits is tailored work and is done at the individual level.

Legal basis

Participation Act

Operations

Data

Benefit administration data (Socrates) and information on reintegration of job seekers (RWM/Raak)

Technical design

The municipality of Rotterdam no longer uses this algorithm. The exploration of an improved 'risk model' has shown that it is currently not possible to develop an algorithm that fits within our policy.


When the algorithm was still in use, the following description applied: The Department of Work & Income (W&I) regularly checks whether residents are entitled to benefits and whether these still match their situation. Benefit recipients may be invited for an interview for this purpose.


To determine who receives an invitation, the municipality uses various selection methods, including a risk algorithm. This algorithm helps the municipality to estimate which benefit recipients are most likely to receive benefits that no longer match their situation. This has two advantages for society:

1) This helps the municipality to prevent overpayment.

2) In addition, benefit recipients who are less likely to have particular circumstances are less often invited for an interview.


The risk assessment algorithm works predictively. This algorithm uses data that the W&I department needs to perform its tasks and to be able to pay out the benefit.


This involves data such as the amount of the benefit and the family situation, but also the skills and qualities of the job seeker that improve their chances on the labour market. This comes from two data sources:

1) Socrates: the benefits administration system. This determines and records the amount of benefits provided. This system contains data on, for example, the living situation of the benefit recipient.

2) RWM/Raak: the system in which data on the reintegration of job seekers is recorded.


The municipality only uses data from its own administration. There is no link with other data files.
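As an illustration of how these two sources might be combined, the sketch below joins assumed extracts from Socrates and RWM/Raak into a single feature table. The column names and the join key are hypothetical; the actual data model of these systems is not published in this register entry.

    # Hypothetical sketch: combines extracts from the two internal sources
    # named above. The column names and the "case_id" join key are assumptions;
    # the real data model of Socrates and RWM/Raak is not published here.
    import pandas as pd

    def build_feature_table(socrates: pd.DataFrame,
                            rwm_raak: pd.DataFrame) -> pd.DataFrame:
        """Combine benefit administration data with reintegration data.

        Only the municipality's own administration is used; no external data
        files are linked, mirroring the description above.
        """
        benefits = socrates[["case_id", "benefit_amount", "household_situation"]]
        reintegration = rwm_raak[["case_id", "skills_score", "work_experience_years"]]
        return benefits.merge(reintegration, on="case_id", how="left")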


The risk assessment algorithm is only used to invite benefit recipients for an interview. The algorithm does not make a statement about whether the benefit is justified. An employee of the department assesses whether there is a right to a benefit. This depends on the individual's data and differs for everyone.

Similar algorithm descriptions

  • A resident who has been given a prison sentence and is trying to evade it is called a fugitive convict. These residents are not entitled to assistance. Municipalities must check this and, if necessary, stop the assistance benefit after investigation.

    Last change on 11th of October 2024, at 8:14 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Some residents receive income support, such as welfare benefits. They are then only allowed to have a maximum amount of money or assets. It is the municipality's job to check this, so the municipality receives information about the assets of this group. The municipality uses filtering to decide who to investigate further.

    Last change on 27th of June 2024, at 12:56 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • A benefit recipient who is studying and can claim study financing is not entitled to assistance. Study financing must be used before claiming assistance. St.Inlichtingenbureau (IB) examines whether there are benefit recipients who have been granted study financing in addition to assistance.

    Last change on 11th of October 2024, at 8:27 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Someone with social assistance benefit and (also) income from employment or another benefit will receive less benefit under the Participation Act. Excess benefit payments can also be reclaimed. To get a good overview of this income, municipalities can request information from the IB.

    Last change on 17th of December 2024, at 12:26 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • The algorithm selects potentially high-risk addresses at which several people who are not first-degree relatives of each other are registered as residents in the Basic Registration of Persons (BRP).

    Last change on 27th of May 2024, at 12:04 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use