Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
Anonymising documents for Woo requests
- Publication category: Other algorithms
- Impact assessment: DPIA
- Status: In use
General information
Theme
- Organisation and business operations
- Law
Begin date
Contact information
Link to publication website
Responsible use
Goal and impact
We use the computerised anonymisation programme to disclose as much information as possible (transparency) while at the same time protecting the individuals, companies and organisations that may be named in the documents. The programme helps us anonymise as quickly and easily as possible.
The computer programme's algorithm poses little risk to individuals (residents and employees of organisations using the programme), companies and organisations. The algorithm searches specifically for (personal) data and hides or flags it, regardless of what else is in the documents. The programme makes a proposal for anonymising a piece of text, and an employee with knowledge of the subject assesses that proposal: the programme therefore does not make automated decisions. In addition, an employee can use the programme to redact (black out) text that cannot be disclosed for other reasons, for example a passage containing strategic information that protects the municipality or the organisations we work with. The document always states the reason for hiding text.
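To make this workflow concrete, the sketch below shows how a single redaction proposal and the human decision about it could be recorded. It is purely illustrative: the class, field names and the example ground are assumptions for this sketch, not the vendor's actual data model.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RedactionProposal:
    """One machine-generated suggestion to hide a piece of text.

    Hypothetical structure for illustration only; the field names are
    assumptions, not the product's data model.
    """
    page: int                        # page on which the text appears
    text: str                        # the text span the algorithm flagged
    category: str                    # e.g. "personal data" or "other Woo exception"
    accepted: Optional[bool] = None  # filled in by the reviewing employee
    ground: Optional[str] = None     # stated reason for hiding the text


# The programme only proposes; a subject-matter employee decides and records the reason.
proposal = RedactionProposal(page=3, text="j.jansen@example.nl", category="personal data")
proposal.accepted = True
proposal.ground = "protection of personal privacy (Woo exception)"
print(proposal)
```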
Considerations
Without the computer programme, anonymising documents takes much more time. Using the programme allows us to disclose information more quickly and easily. Automated anonymisation also reduces the chance of human error, which lowers the risk of a data breach and better protects individuals' data.
If a request involves only a small number of documents to be disclosed, we anonymise them ourselves. As a result, we use the algorithm only when necessary.
Human intervention
The algorithm makes proposals that are always checked by hand, so there is human oversight of the algorithm. We can also specify which words must always, or must never, be anonymised.
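A minimal sketch of how such "always anonymise" / "never anonymise" word lists could be applied on top of the algorithm's proposals. The list-based configuration and the function below are assumptions for illustration; the product's actual mechanism is not described in this register entry.

```python
# Hypothetical word lists; in practice these would be maintained per organisation.
ALWAYS_ANONYMISE = {"J. Jansen", "j.jansen@example.nl"}
NEVER_ANONYMISE = {"Gemeente", "Woo"}


def apply_word_lists(document_text: str, proposals: set[str]) -> set[str]:
    """Apply the configured word lists to the algorithm's proposals.

    Drops anything on the 'never' list and adds any 'always' term that
    actually occurs in the document. The result is still only a proposal:
    an employee reviews every item by hand.
    """
    kept = {p for p in proposals if p not in NEVER_ANONYMISE}
    kept |= {term for term in ALWAYS_ANONYMISE if term in document_text}
    return kept


text = "De Gemeente sprak met J. Jansen (j.jansen@example.nl) over de Woo."
print(apply_word_lists(text, {"J. Jansen", "Gemeente", "06-12345678"}))
```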
Risk management
The risks are anonymising too much or too little. To prevent this, a staff member checks the algorithm's proposals.
Legal basis
The Open Government Act (Wet open overheid, Woo), which regulates the disclosure of documents. For what counts as personal data, the General Data Protection Regulation (GDPR, in Dutch: AVG) applies.
Links to legal bases
- Wet open overheid: https://wetten.overheid.nl/BWBR0045754/2024-08-01/0
- Algemene verordening gegevensbescherming (AVG): https://eur-lex.europa.eu/legal-content/NL/TXT/HTML/?uri=CELEX:02016R0679-20160504
Link to Processing Index
Impact assessment
Operations
Data
This depends on the document being anonymised. Examples include personal data such as e-mail addresses, phone numbers, bank account numbers, address details and signatures. Under the Open Government Act (Woo), the data to be hidden can also go beyond personal data; these exception grounds are set out in the Woo.
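For the pattern-like items in this list (e-mail addresses, phone numbers, bank account numbers), detection can be illustrated with simple regular expressions. This is a rough sketch only: the patterns are deliberately simplified and are not the detection logic used by the product.

```python
import re

# Deliberately simplified patterns for illustration; real detection is far more robust.
PATTERNS = {
    "e-mail address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "Dutch phone number": re.compile(r"\b(?:\+31|0)[1-9](?:[ -]?\d){8}\b"),
    "Dutch IBAN (bank account)": re.compile(r"\b[A-Z]{2}\d{2}[A-Z]{4}\d{10}\b"),
}


def find_candidates(text: str) -> list[tuple[str, str]]:
    """Return (category, match) pairs for text that may need anonymising."""
    hits = []
    for label, pattern in PATTERNS.items():
        hits += [(label, m.group()) for m in pattern.finditer(text)]
    return hits


sample = "Mail j.jansen@example.nl of bel 06-12345678, IBAN NL91ABNA0417164300."
print(find_candidates(sample))
```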
Technical design
Deep learning models determine which information is considered privacy-sensitive. The models do this by assessing how the pages look (visual analysis) and by scanning the text.
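The vendor's models themselves are not public. As an illustration of the text-scanning part only (the visual analysis is not covered here), the sketch below runs a publicly available multilingual named-entity-recognition model via the Hugging Face transformers library as a stand-in for the actual system.

```python
from transformers import pipeline

# Publicly available multilingual NER model, used here purely as a stand-in;
# it is not the model used by the anonymisation product.
ner = pipeline(
    "ner",
    model="Davlan/bert-base-multilingual-cased-ner-hrl",
    aggregation_strategy="simple",
)

text = "Op 3 maart sprak wethouder J. Jansen met Bouwbedrijf De Vries in Utrecht."
for entity in ner(text):
    # Each hit is only a proposal; an employee still reviews it before redaction.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 2))
```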
External provider
Similar algorithm descriptions
- The municipality is obliged to publish all official publications. Many of these documents, such as licensing decisions, contain sensitive information. Therefore, the documents have to be anonymised. This is done with the help of an algorithm.
  - Last change: 29 February 2024, 19:01 (CET) | Publication Standard 1.0
  - Publication category: Other algorithms
  - Impact assessment: DPIA
  - Status: In use
- Among other things, the algorithm identifies and anonymises (personal) data and confidential (financial) information in documents before they are published, as required by the Open Government Act.
  - Last change: 11 April 2025, 8:12 (CET) | Publication Standard 1.0
  - Publication category: Impactful algorithms
  - Impact assessment: DPIA
  - Status: In use
- Among other things, the algorithm identifies and anonymises (personal) data and confidential (financial) information in documents before they are published, as required by the Open Government Act.
  - Last change: 12 September 2024, 8:23 (CET) | Publication Standard 1.0
  - Publication category: Other algorithms
  - Impact assessment: Field not filled in.
  - Status: In use
- Among other things, the algorithm recognises and anonymises (personal) data and confidential (financial) data in documents before they are published, e.g. on the basis of the Open Government Act.
  - Last change: 4 April 2024, 9:22 (CET) | Publication Standard 1.0
  - Publication category: Other algorithms
  - Impact assessment: Field not filled in.
  - Status: In use
- Among other things, the algorithm recognises and anonymises (personal) data and confidential (financial) data in documents before they are published, e.g. on the basis of the Open Government Act.
  - Last change: 5 September 2024, 14:30 (CET) | Publication Standard 1.0
  - Publication category: Other algorithms
  - Impact assessment: DPIA
  - Status: In use