Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
Reporting assistant intake Wmo/Jeugdwet
- Publication category
- Other algorithms
- Impact assessment
- DPIA
- Status
- In use
General information
Theme
Begin date
Contact information
Link to publication website
Responsible use
Goal and impact
The aim of deploying the AI assistant is to reduce the administrative burden on neighbourhood team members. The tool is intended to save 50-75% of the time needed to turn an interview into a draft report. The AI assistant does not make independent decisions; a neighbourhood team member always checks the output, and conclusions are always formulated by the neighbourhood team member themselves. Every report, including those not prepared with the AI assistant, is submitted to residents for approval. Participation is voluntary for residents and their consent is explicitly requested prior to the interview. Whether the tool is used has no bearing on (access to) help and support.
Considerations
To carefully consider the deployment of the AI assistant, a DPIA was conducted, an internal values dialogue was held, and the external Digitalisation and Ethics Sounding Board Group was consulted. The resulting insights and advice were adopted. Prior to the decision to start this pilot, other steps were taken to reduce the administrative burden on district team staff, but these did not yield sufficient results.
Human intervention
The (interim) results of the AI assistant can be read in real time by the neighbourhood team member and the resident. Once the report has been generated, the neighbourhood team member can modify, delete or add to any generated text.
Risk management
The DPIA, the values dialogue and the advice of the Digitalisation and Ethics Sounding Board Group identified data protection and ethics risks. Measures taken to manage these risks include the following: district team employees have been trained to work with the AI assistant, with specific attention to explaining clearly to residents what the assistant does, and guidelines have been drawn up for its use. In addition, agreements have been made with the supplier about the processing of personal data, the appropriate level of security has been applied to the system, and this has been tested independently. Only the data necessary for the process are kept, and the AI assistant's performance is continuously monitored.
Legal basis
The processing of personal data is necessary for the performance of a task carried out in the public interest (implementation of the Wmo and the Youth Act) assigned to the controller.
Links to legal bases
- artikel 5.1.1 Wet maatschappelijke ondersteuning: https://wetten.overheid.nl/BWBR0035362/2025-01-01/0
- artikel 7.4.0 en 7.4.1 Jeugdwet: https://wetten.overheid.nl/BWBR0034925/2025-01-01/0
- artikel 6, eerste lid onder e Algemene Verordening Gegevensbescherming : https://eur-lex.europa.eu/legal-content/NL/TXT/?uri=legissum:310401_2
Elaboration on impact assessments
The AI assistant is not intended as an AI system for assessing (the degree of) access to essential government benefits and services. Its purpose is to help residents better and faster and to reduce the administrative burden. The application therefore falls under one of the exceptions in the AI Act (Article 6(3) AI Act), under which AI systems that fall within one of the listed application areas (essential private and public services and benefits, Article 6(2) in conjunction with Annex III AI Act) are nevertheless not considered high-risk AI, and conducting an IAMA is not mandatory. An internal ethical values dialogue on public values has been conducted (CODIO). In addition, the municipality received an opinion from the external sounding board group on digitisation and ethics in April 2025 (residents).
Impact assessment
Operations
Data
The algorithm was developed on the basis of a description of the conversation protocol. To refine the prompting model, simulated conversations between neighbourhood team members were used. The algorithm only processes data obtained from an audio recording of the conversation between the resident and the neighbourhood team member. Personal data that could potentially be processed include name and address, BSN, e-mail address, phone number, date of birth, family situation and health data. Directly traceable personal data are removed as each 30 to 60 seconds of audio is converted to text. Data such as family situation, health data and other special categories of personal data that are mentioned in the interview and may be relevant to providing the necessary help and support to the person concerned are processed on the supplier's platform and stored for a maximum of 28 days.
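As an illustration only, the data flow described above could look roughly like the sketch below. The function names, the speech-to-text and redaction helpers, and the retention handling are assumptions for illustration and are not the supplier's actual API:

from datetime import datetime, timedelta

RETENTION = timedelta(days=28)  # maximum storage period stated above

def transcribe_and_redact(audio_chunks, speech_to_text, redact_direct_identifiers):
    # audio_chunks: iterable of 30-60 second audio segments from the recorded interview
    parts = []
    for chunk in audio_chunks:
        text = speech_to_text(chunk)                   # convert audio segment to text
        parts.append(redact_direct_identifiers(text))  # remove name, address, BSN, etc.
    return {
        "transcript": "\n".join(parts),
        "delete_after": datetime.utcnow() + RETENTION,  # stored for at most 28 days
    }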
Technical design
The AI conversation record assistant is a tool that converts the spoken word during a question clarification conversation into a written record. The software converts audio to a written-out, de-identified transcript. This transcript is presented to various locally hosted copies of LLMs (Large Language Models) to write the interview report. To ensure that these reports are written in the desired form, the software contains specific rules about the prompts offered to the LLMs. The software thus forms an intermediate layer between the interview transcript and the LLMs, which ensures that the output of the LLMs matches the district teams' method of reporting better than if a transcript were offered directly to an LLM.
The software uses web services and locally hosted copies of LLMs. The following models are used: the 'Azure Language' speech modelling and speech-to-text services (Microsoft), GPT-4o and GPT-4o mini from OpenAI, Claude Sonnet from Anthropic and Llama (open source). LLM providers (e.g. OpenAI) do not connect to the servers of the supplier's platform: the providers supply the models, but hosting is done independently on the supplier's servers within the EEA. As a result, the data are not accessible or viewable by the LLM providers.
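As an illustration of the intermediate layer described above, the sketch below wraps the transcript in fixed prompt rules before sending it to a locally hosted model copy. The endpoint URLs, model names, prompt text and response format are assumptions for illustration, not the supplier's actual configuration:

import requests

REPORT_PROMPT = (
    "You are drafting an intake report for a Wmo/Jeugdwet clarification interview. "
    "Follow the district team reporting format and use neutral wording.\n\n"
    "Transcript:\n{transcript}"
)

# Hypothetical endpoints for locally hosted model copies within the EEA
MODEL_ENDPOINTS = {
    "gpt-4o": "https://llm.internal.example/gpt-4o",
    "claude-sonnet": "https://llm.internal.example/claude-sonnet",
    "llama": "https://llm.internal.example/llama",
}

def draft_report(transcript, model="gpt-4o"):
    # Send the prompted transcript to one locally hosted model; providers never see the data
    response = requests.post(
        MODEL_ENDPOINTS[model],
        json={"prompt": REPORT_PROMPT.format(transcript=transcript)},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["text"]  # draft report; a neighbourhood team member always reviews it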
External provider
Similar algorithm descriptions
- PZH-Assist is a chatbot that can be consulted for general use. This chatbot is not trained on internal data and has no knowledge of it. However, it is possible to insert a document into the chatbot ad hoc and ask questions about it. The chatbot is only available to employees of the Province of South Holland.
- Publication category
- Impactful algorithms
- Impact assessment
- DPIA
- Status
- In development
- In criminal investigations, voice comparison is used to find out whether the suspect's and offender's voices are the same. One of the methods that can be used for this purpose is automatic speaker comparison.
- Publication category
- Impactful algorithms
- Impact assessment
- DPIA
- Status
- In use
- If a lawyer/mediator applies for subsidised legal aid for a citizen, this must be substantiated with documents. To speed up the process, a procedure has been devised whereby applications for an addition or declaration are granted automatically. This is checked afterwards by means of a random check.
- Publication category
- Other algorithms
- Impact assessment
- DPIA
- Status
- In use
- The AI system helps in the reporting of council meetings. The system does this by automatically converting spoken text to written text.
- Publication category
- Other algorithms
- Impact assessment
- Field not filled in.
- Status
- In use
- The model helps detect and analyse irregularities following the allocation of a Wmo/Jeugdwet provision. The model signals whether further investigation into the spending of funds is needed.
- Publication category
- Impactful algorithms
- Impact assessment
- IAMA
- Status
- In use