Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

The Summariser

The Summariser allows ACM employees to summarise documents, in particular reports and decisions, using an internal language model (currently GPT-5-mini in the ACM Azure environment). Confidential documents may also be summarised, thanks to privacy and information security agreements with the vendor.

Last change on 24th of February 2026, at 19:34 (CET) | Publication Standard 1.0
Publication category
Other algorithms
Impact assessment
IAMA, DPIA
Status
In use

General information

Theme

  • Economy
  • Organisation and business operations

Begin date

1-2025

Contact information

info@acm.nl

Link to publication website

Bevoegdheden | ACM

Responsible use

Goal and impact

The purpose of the algorithm is to summarise voluminous documents. A summary is thus generated faster. In principle, citizens and companies do not come into contact with the algorithm. However, the product (the summary) may reach a citizen or company via, for example, a decision.

Considerations

The advantage of this algorithm is that it saves time and produces more structured and uniform summaries than employees write by hand.

Human intervention

A generated summary is proofread by a person and adjusted where necessary. The summary then supports the work of an ACM employee.

Risk management

A generated summary is proofread by a person and adjusted where necessary. If the summary has not yet been checked, a disclaimer is included stating that the summary was generated with AI. The Summariser also contains a warning to be careful with personal data and not to enter it into the Summariser. Personal data should only be entered if doing so is proportionate and there is a legal basis for processing. Finally, documents submitted by the ACM are not used to train the model and therefore cannot inadvertently become public.


Elaboration on impact assessments

The DPIA is nearing completion.

Impact assessment

  • Human Rights and Algorithms Impact Assessment (IAMA)
  • Data Protection Impact Assessment (DPIA)

Operations

Data

Documents relevant to the ACM

Technical design

A language model from OpenAI is used. This model was initially trained on large amounts of data and then fine-tuned by OpenAI to follow user instructions. The language model is made available by Microsoft through the Azure OpenAI API. A number of prompts have been created by functional administrators ('super users'). These are sent to the language model along with a source document. The API then sends back a response (the summary), which is displayed in the Summariser. OpenAI's underlying language model is not permitted to store the data sent for summarisation or to use it to train the model further.
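The flow described above can be sketched in a few lines. This is an illustrative sketch, not ACM's actual code: the prompt text, the deployment name, and the function names are assumptions; the real administrator-written prompts and Azure configuration are not published.

```python
# Illustrative sketch of the Summariser's request flow: a fixed prompt
# written by functional administrators is combined with the source
# document into a chat-completions request body for the Azure OpenAI API.
# The prompt wording and deployment name below are assumptions.
import json

SUMMARY_PROMPT = (
    "Summarise the following document for an ACM employee. "
    "Be concise, structured, and faithful to the source."
)

def build_request(document_text: str, deployment: str = "gpt-5-mini") -> dict:
    """Build the JSON body sent to the Azure OpenAI chat-completions
    endpoint: the administrator-written prompt becomes the system
    message, the source document becomes the user message."""
    return {
        "model": deployment,  # the Azure deployment name
        "messages": [
            {"role": "system", "content": SUMMARY_PROMPT},
            {"role": "user", "content": document_text},
        ],
    }

# The response returned by the API would contain the generated summary,
# which the Summariser then displays to the employee.
body = build_request("Full text of a draft decision ...")
print(json.dumps(body, indent=2))
```

In this setup the model sees only what is sent per request; consistent with the paragraph above, nothing is retained for training on the vendor side.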

Similar algorithm descriptions

The algorithm underlines personal data in documents. An employee has to review all pages and check whether the document is properly redacted. Then the software removes all highlighted information and blacks it out. After that, the documents can be published, for example under the Open Government Act (WOO).

    Last change on 5th of February 2025, at 9:15 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DEDA, DPIA
    Status
    In use
The algorithm underlines personal data in documents. An employee has to review all the pages and check whether the document is properly anonymised. Then the software removes all highlighted information and blacks it out. After that, the documents can be published, for example under the Open Government Act.

    Last change on 19th of September 2024, at 9:10 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DEDA, DPIA
    Status
    In use
The algorithm underlines personal data in documents. An employee has to review all the pages and check whether the document is properly anonymised. Then the software removes all highlighted information and blacks it out. After that, the documents can be published, for example under the Open Government Act.

    Last change on 18th of June 2024, at 8:38 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DEDA, DPIA
    Status
    In use
The algorithm underlines personal data in documents. An employee has to review all the pages and check whether the document is properly anonymised. Then the software removes all highlighted information and blacks it out. After that, the documents can be published, for example under the Open Government Act.

    Last change on 17th of June 2024, at 10:40 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
The algorithm underlines personal data in documents. An employee has to review all the pages and check whether the document is properly anonymised. Then the software removes all highlighted information and blacks it out. After that, the documents can be published, for example under the Open Government Act.

    Last change on 8th of July 2024, at 15:45 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use