Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
Microsoft 365 Copilot
- Publication category: Impactful algorithms
- Impact assessment: DPIA
- Status: In use
General information
Theme
Begin date
Contact information
Responsible use
Goal and impact
The purpose of deploying Microsoft 365 Copilot is to support employees with standard tasks, such as writing texts, creating summaries, drafting e-mails and preparing for meetings. Copilot helps perform routine tasks faster so that employees have more time for substantive work and collaboration.
Using Copilot leads to higher efficiency and better quality of documents and communication. Residents and businesses notice this indirectly, for example because letters are clearer and response times are shorter. In addition, the application contributes to developing digital skills and increasing knowledge about generative AI within the organisation.
The municipality is also experimenting with AI agents to further speed up work processes. These agents support specific tasks, but are always used with explicit human intervention and assessment.
Considerations
The municipality chooses Microsoft 365 Copilot to support employees in their day-to-day work. It saves time, improves the quality of documents and gives employees more room for substantive work. It also contributes to job satisfaction and digital skills.
At the same time, there are risks. Copilot answers may contain errors and should always be checked. There is a focus on privacy and safe handling of data. Dependence on technology requires clear agreements and conscious use. Therefore, our employees are trained in responsible use of AI.
Alternatives such as ChatGPT and Claude were considered but not found suitable. Copilot is integrated into frequently used programmes (office apps such as Word, Excel, etc.). In addition, the governance capabilities and Enterprise Data Protection (EDP) offered by Microsoft are important reasons for this choice.
Human intervention
Microsoft 365 Copilot provides suggestions, but employees remain fully responsible for the content they use. Every output of Copilot must be checked for accuracy by an employee. Copilot does not make decisions or perform tasks without human intervention; its use is purely supportive and always requires a conscious review by the user. Employees are trained in the responsible deployment of generative AI so that they can properly assess and correct the output.
Risk management
There are technical, legal and ethical risks in using Microsoft 365 Copilot. Examples include incorrect or misleading output, improper data use and the risk of employees relying too heavily on automatically generated information. To mitigate these risks, employees are trained in the responsible use of generative AI. They learn to critically assess output, handle information carefully and follow the applicable ground rules.
During the pilot phase, extensive research was done into which applications could be used responsibly. Data use, privacy, security and the possible impact on decision-making were examined. Riskier forms of deployment are tested in advance by the data protection team, so that it is clear what measures are needed and what is or is not allowed.
Legal basis
The European AI Regulation forms the basis of municipal policy on the responsible development and deployment of AI and algorithms. It aims to ensure that AI systems are safe, operate transparently and respect people's fundamental rights. The regulation distinguishes between different risk levels of AI systems: the higher the risk, the stricter the rules. For generative AI such as Microsoft 365 Copilot, the obligations mainly concern transparency and human oversight.
Besides the AI Regulation, municipal policy rests on existing information security and privacy legislation, such as the General Data Protection Regulation (GDPR). In addition, internal ground rules have been drawn up for the use of Copilot. Together, these rules ensure that personal data is properly protected and that employees handle data and Copilot with care.
Links to legal bases
Link to Processing Index
Impact assessment
Operations
Data
Microsoft 365 Copilot only uses data that an employee can access themselves via Microsoft Graph, such as documents, e-mails, calendar items or chats. It therefore does not work with all data within the organisation, but only with the information the employee has access rights to.
Employees are strongly advised against sharing personal data and business-sensitive information with generative AI. Information from specialist applications should not be shared with Copilot. This helps to mitigate risks and to handle confidential data carefully. Personal data may appear indirectly in documents or communications, but is not structurally stored or shared.
Copilot also processes prompts (user input). These prompts and the responses are temporarily stored to support functionality, but are not used to train AI models. This is an important difference from many other AI tools, which do use user input to train their models.
The municipality ensures that employees handle data in Copilot consciously. The use of Copilot falls under the existing information security and privacy policy and has been supplemented with specific ground rules.
The data used by Copilot remain within the municipal Microsoft environment.
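To illustrate the permission model described above, the sketch below uses the Microsoft Graph Search API, which security-trims search results to content the signed-in user is authorised to access. This is a hypothetical TypeScript example, not Copilot's internal code; the function name searchAccessibleDocuments is invented for illustration.

```typescript
// Hypothetical illustration, not Copilot's internal code: it shows why
// Graph-based retrieval is limited to what the signed-in user may see.
import { Client } from "@microsoft/microsoft-graph-client";

// 'client' must be initialised with the employee's own (delegated) token.
async function searchAccessibleDocuments(client: Client, queryString: string) {
  const response = await client.api("/search/query").post({
    requests: [
      {
        entityTypes: ["driveItem"], // files in OneDrive/SharePoint
        query: { queryString },
      },
    ],
  });
  // Hits are grouped per request; only documents the user already has
  // rights to can appear here, because results are security-trimmed.
  return response.value?.[0]?.hitsContainers?.[0]?.hits ?? [];
}
```

Because such a call runs under the employee's delegated token rather than an application-wide permission, a document the employee cannot open never enters the grounding context.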
Technical design
Microsoft 365 Copilot combines user input with context from Microsoft Graph, such as documents, e-mail and calendar items for which the user is already authorised. The Copilot orchestrator enriches the prompt, checks access rights and forwards it to a large language model (LLM) within the Microsoft cloud. This model generates a draft response that is returned to the relevant Microsoft 365 app. All processing takes place within the secure tenant and the output is not used for model training.
More information can be found on Microsoft's website, including in this resource:
https://learn.microsoft.com/nl-nl/copilot/microsoft-365/microsoft-365-copilot-overview
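As a rough illustration of this flow, the sketch below strings the three steps together: grounding, enrichment and generation. It is not Microsoft's actual orchestrator; every name in it (handlePrompt, fetchAccessibleContext, callLlm, GroundingDocument) is hypothetical.

```typescript
// All names below are invented for illustration; this is not Microsoft's
// orchestrator, only a sketch of the grounding -> enrichment -> generation
// sequence described above.
interface GroundingDocument {
  title: string;
  snippet: string;
}

async function handlePrompt(
  userPrompt: string,
  // Retrieval step: must only return content the user is authorised to see.
  fetchAccessibleContext: (query: string) => Promise<GroundingDocument[]>,
  // Generation step: an LLM call that stays within the tenant boundary.
  callLlm: (enrichedPrompt: string) => Promise<string>,
): Promise<string> {
  // 1. Grounding: collect relevant, permission-trimmed context.
  const context = await fetchAccessibleContext(userPrompt);

  // 2. Enrichment: combine the user's prompt with the retrieved context.
  const enrichedPrompt = [
    "Context:",
    ...context.map((doc) => `- ${doc.title}: ${doc.snippet}`),
    "Task:",
    userPrompt,
  ].join("\n");

  // 3. Generation: the model drafts a response, which is returned to the
  //    Microsoft 365 app for the employee to review before use.
  return callLlm(enrichedPrompt);
}
```

The key property mirrored here is that the retrieval step runs with the user's own access rights, so the model never receives documents the employee could not open.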
External provider
Similar algorithm descriptions
- Employees can use Microsoft 365 Copilot: Copilot Chat for questions, texts and summaries. Agents perform tasks such as taking notes or starting workflows. Controls regulate data access, privacy and features by user group.
- Publication category: Impactful algorithms
- Impact assessment: IAMA, DPIA
- Status: In use
- Internally, Microsoft Copilot is used. It is integrated into applications such as Outlook, Word and Excel and is made available to employees on demand. Employees use it mainly for summarising or producing texts.
- Publication category: Other algorithms
- Impact assessment: Field not filled in.
- Status: In use
- Explore the use of Microsoft Copilot 365, integrated into Office applications (including Word, Excel, Outlook, Teams) and the municipal work context, to investigate whether employees can work more efficiently and reduce administrative burdens.
- Publication category: Impactful algorithms
- Impact assessment: DPIA
- Status: In use
- Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, where employees with a Microsoft account from the municipality can ask questions, generate texts and images, and create translations and summaries.
- Publication category: Impactful algorithms
- Impact assessment: Field not filled in.
- Status: In use
- Employees may use Microsoft 365 CoPilot as a smart chatbot. With it, they can have texts created and get detailed answers to questions. Our rules state that employees must not use secret or confidential information. Also, employees must always check carefully whether the information is correct.
- Publication category: Other algorithms
- Impact assessment: Field not filled in.
- Status: In use