Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
Experiment Microsoft Copilot 365
- Publication category: Impactful algorithms
- Impact assessment: DPIA
- Status: In use
General information
Theme
Begin date
End date
Contact information
Responsible use
Goal and impact
The experiment investigates whether Copilot 365 increases staff productivity by reducing administrative burdens (such as documentation, reports and emails), thereby freeing up more time for substantive work and better service to residents.
Considerations
When deploying Copilot 365, there are clear ground rules for responsible use and careful handling of data. Employees always remain responsible for the final choices and content, and assess whether the output generated by Copilot is correct and usable. Privacy and information security are safeguarded through the DPIA, agreements with the supplier and our ground rules.
Human intervention
Always: the employee decides whether Copilot output is used, modified or rejected.
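To make this rule concrete, here is a minimal Python sketch of such a review gate. Everything in it (the `Decision` enum, the `review` function) is illustrative and does not correspond to any real Copilot interface; it only encodes the three outcomes named above.

```python
# Minimal sketch of the human-in-the-loop rule: no Copilot output is used
# without an explicit employee decision. All names here are illustrative
# assumptions, not part of any actual Copilot API.
from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    USE = "use"          # accept the draft as-is
    MODIFY = "modify"    # employee edits the draft before use
    REJECT = "reject"    # discard the draft entirely


@dataclass
class ReviewedOutput:
    draft: str
    decision: Decision
    final_text: str | None  # None when the draft is rejected


def review(draft: str, decision: Decision, edited: str | None = None) -> ReviewedOutput:
    """The employee, not the model, determines what is actually used."""
    if decision is Decision.USE:
        return ReviewedOutput(draft, decision, draft)
    if decision is Decision.MODIFY:
        return ReviewedOutput(draft, decision, edited)
    return ReviewedOutput(draft, decision, None)


print(review("Draft email to resident", Decision.MODIFY, edited="Shortened draft").final_text)
```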
Risk management
In this experiment we looked specifically at the requirements of the European AI Act. In doing so, we built on our existing policies on privacy, information security and procurement, as well as on topics such as awareness, the code of conduct and the use of cloud services. To give employees guidance, ground rules have been drawn up for the responsible use of generative AI: what is and is not permitted. We also work on awareness, communication and training so that colleagues become familiar with AI and know how to use it responsibly.
Elaboration on impact assessments
The risks were assessed by the Data Protection Officer (in Dutch: FG) for relevance and applicability to the use of AI. Based on this assessment, and in light of publications by the VNG and the IBD, it was concluded that the AI project can be continued responsibly, provided the emphasis remains on careful and explainable use.
Operations
Data
Copilot uses all the data within the Microsoft Graph to which an employee already has access (such as emails, documents and Teams messages). Microsoft converts this data into a semantic index, which allows Copilot to make connections between items. Copilot consults this data only when an employee actively asks for it in a prompt. The data does not leave the Microsoft tenant and is not used to train AI models.
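As an illustration of this access model, the following hedged Python sketch mimics a permission-trimmed lookup against a semantic index. The index, its items and the permission sets are toy stand-ins; Microsoft's actual Graph and semantic-index internals are not public and are not reproduced here.

```python
# Illustrative sketch only: a query against a mock semantic index returns
# only items the requesting employee can already open. All data, user names
# and the matching logic are invented for this example.
from dataclasses import dataclass


@dataclass
class IndexedItem:
    source: str              # e.g. "mail", "document", "teams-message"
    text: str
    allowed_users: set[str]  # stand-in for real Microsoft 365 permissions


SEMANTIC_INDEX = [
    IndexedItem("document", "Q3 service report draft", {"a.jansen"}),
    IndexedItem("mail", "Resident complaint about waste collection", {"a.jansen", "b.devries"}),
]


def retrieve_context(user: str, prompt: str) -> list[IndexedItem]:
    """Return candidate context, permission-trimmed to the asking user.

    Real relevance ranking is omitted; a keyword match stands in for the
    semantic similarity the actual index provides.
    """
    words = prompt.lower().split()
    return [
        item for item in SEMANTIC_INDEX
        if user in item.allowed_users
        and any(word in item.text.lower() for word in words)
    ]


# b.devries only sees the shared mail item, not a.jansen's private document.
print(retrieve_context("b.devries", "waste complaint"))
```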
Under the municipality's ground rules for generative AI, employees may use only unstructured work data in their prompts.
Not allowed: personal data and other classified or sensitive information (such as payroll, healthcare or criminal-record data).
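The register describes these ground rules as instructions to employees; it does not say whether any technical filter enforces them. Purely as an illustration, a pre-submission check could look like the following Python sketch. The pattern list is a toy; a real deployment would need a dedicated PII-detection tool.

```python
# Hypothetical pre-submission check for the ground rules above. Nothing in
# the register states that such a filter exists; this is an assumption made
# for illustration only.
import re

# Toy patterns: a possible Dutch citizen service number (BSN, 9 digits) and
# a few sensitivity keywords. Not an exhaustive or reliable classifier.
BLOCKED_PATTERNS = [
    re.compile(r"\b\d{9}\b"),                           # possible BSN
    re.compile(r"salar(y|is)|payroll", re.I),           # payroll data
    re.compile(r"medical|healthcare|strafblad", re.I),  # health / criminal records
]


def prompt_allowed(prompt: str) -> bool:
    """Reject prompts that appear to contain classified or sensitive data."""
    return not any(p.search(prompt) for p in BLOCKED_PATTERNS)


assert prompt_allowed("Summarise the minutes of yesterday's team meeting")
assert not prompt_allowed("Draft a letter about BSN 123456782")
```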
We intend to exclude certain SharePoint channels from indexing to avoid the risk of AI misuse and non-compliance with the ground rules.
Technical design
Copilot uses Large Language Models (LLMs) integrated into Microsoft 365. When an employee enters a prompt, Copilot combines the question with relevant context from the Microsoft Graph and the semantic index. Only information to which the employee has access can be used in the answer.
Prompts and context are processed within the secure Microsoft tenant. Copilot then generates a response based on the prompt, the context and the underlying AI model. The temporary prompt data is not stored and is not used to train the model.
Although Copilot provides support, human control remains necessary: Copilot may make mistakes or misinterpret information.
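The flow described above can be summarised in a short Python sketch, under the same caveat as the earlier Data-section sketch: all names are placeholders, and `call_llm` stands in for the model rather than any real Copilot or Azure OpenAI call.

```python
# Sketch of the processing flow: prompt plus permission-trimmed context go
# to the model, a response comes back, and the intermediate data is not
# retained. Everything here is an illustrative assumption.
def retrieve_context(user: str, prompt: str) -> list[str]:
    """Stand-in for the permission-trimmed lookup sketched under 'Data'."""
    return ["Q3 service report draft"]


def call_llm(prompt: str, context: list[str]) -> str:
    """Placeholder for the underlying LLM; returns a canned answer."""
    return f"Draft answer based on {len(context)} context item(s)."


def copilot_turn(user: str, prompt: str) -> str:
    # Combine the employee's question with permission-trimmed context.
    context = retrieve_context(user, prompt)
    response = call_llm(prompt, context)
    # Prompt and context live only in local variables here: nothing is
    # persisted or fed to a training pipeline, mirroring the no-retention
    # claim above.
    return response


print(copilot_turn("a.jansen", "Summarise the Q3 service report"))
```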
External provider
Microsoft
Similar algorithm descriptions
- Internally, Microsoft Copilot is used. It is integrated into applications such as Outlook, Word and Excel and is made available to employees on demand. Employees use it mainly for summarising or producing texts.
  Last change on 3rd of November 2025, at 13:20 (CET) | Publication Standard 1.0
- Publication category: Other algorithms
- Impact assessment: Field not filled in.
- Status: In use
- Employees can use Microsoft 365 Copilot: Copilot Chat for questions, texts and summaries. Agents perform tasks such as taking notes or starting workflows. Controls regulate data access, privacy and features by user group.
  Last change on 3rd of November 2025, at 7:27 (CET) | Publication Standard 1.0
- Publication category: Impactful algorithms
- Impact assessment: IAMA, DPIA
- Status: In use
- Employees may use Microsoft 365 CoPilot as a smart chatbot. With it, they can have texts created and get detailed answers to questions. Our rules state that employees must not use secret or confidential information. Also, employees must always check carefully whether the information is correct.
  Last change on 16th of September 2025, at 10:32 (CET) | Publication Standard 1.0
- Publication category: Other algorithms
- Impact assessment: Field not filled in.
- Status: In use
- To improve internal efficiency, a test project has been set up to see if Microsoft CoPilot can help employees perform day-to-day tasks.
  Last change on 21st of January 2025, at 10:50 (CET) | Publication Standard 1.0
- Publication category: High-Risk AI-system
- Impact assessment: Field not filled in.
- Status: In development
- To improve internal efficiency, a test project has been set up to see if Microsoft CoPilot can help employees perform day-to-day tasks.
  Last change on 20th of January 2025, at 10:12 (CET) | Publication Standard 1.0
- Publication category: High-Risk AI-system
- Impact assessment: Field not filled in.
- Status: In development