Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Microsoft 365 CoPilot

An internal chatbot provided by Microsoft.

Last change on 11th of November 2025, at 12:45 (CET) | Publication Standard 1.0
Publication category
High-Risk AI-system
Impact assessment
Field not filled in.
Status
In use

General information

Theme

Organisation and business operations

Begin date

04-2025

Contact information

https://www.bergen.nl

Responsible use

Goal and impact

The purpose of Microsoft 365 CoPilot is to support employees in their daily work. For example, Microsoft 365 CoPilot helps with writing texts, creating summaries, drafting e-mails or preparing for meetings. This is done with the help of artificial intelligence (AI).


The impact is that employees can do their work faster and more easily. They need to spend less time on standard tasks and can focus more on content and collaboration. Citizens and companies notice this indirectly, for example because letters are clearer, answers come faster or meetings are better prepared.

Considerations

The municipality chooses to deploy Microsoft 365 CoPilot to support employees in their daily work. For example, Microsoft 365 CoPilot helps with writing texts, making summaries and preparing for meetings. This saves time, ensures better-quality documents and gives employees more room to focus on content. It also contributes to job satisfaction and digital skills.


At the same time, there are risks. Microsoft 365 CoPilot's answers may contain errors and are therefore always checked. Attention is also needed for privacy and the secure handling of data. Dependence on technology requires clear agreements and conscious use.


The municipality prefers Microsoft 365 CoPilot and is committed to raising awareness around generative AI. Microsoft 365 CoPilot is integrated into widely used programmes and is recommended within the organisation.


Where necessary, this choice involves ethical considerations, such as comprehensibility, human control, equal treatment and transparency.


Employees are trained in the responsible use of artificial intelligence, and specifically of generative AI. Ground rules have been drawn up for dealing with this technology. In addition, the municipality is working on strategic and tactical policies for dealing with AI. A DPIA and a human rights impact assessment are carried out when necessary.

The set-up of CoPilot is such that all data remains within the municipality. We do not use CoPilot plus (Microsoft's paid version).

Human intervention

Microsoft 365 CoPilot supports employees in their work, but the outcomes are always used and assessed by humans. For example, when Microsoft 365 CoPilot makes a text proposal, the employee checks whether the content is correct and appropriate to the situation. With summaries and e-mails, too, the employee remains responsible for the final version.


The deployment of Microsoft 365 CoPilot thus requires active human intervention. Employees use the outcomes as a tool, but make the decisions themselves. They can modify the suggestions, add to them or ignore them altogether. This ensures the quality and reliability of the work.


There is no automatic decision-making. Microsoft 365 CoPilot supports but does not replace humans. Employees are trained in the responsible use of generative AI, so they know how to properly assess and adjust its outcomes.

Risk management

When using Microsoft 365 CoPilot, several risks have been identified. These include technical risks, such as errors in generated texts, and legal risks around privacy and data use. There are also ethical risks, such as the risk of unequal treatment, unclear decision-making or loss of human control.


To mitigate these risks, employees are trained in the responsible use of AI, including generative AI such as Microsoft 365 CoPilot. They learn how to check and adjust outcomes and how to handle information carefully. Ground rules have been drawn up for handling generative AI, so that it is clear what is and is not allowed. This is also secured within the internal code of conduct.


The municipality is working on a strategic policy framework for dealing with AI. This includes periodic evaluation and monitoring of the use of Microsoft 365 CoPilot in pilots, so that risks can be identified and addressed in a timely manner. The framework is in line with the standards around information security and privacy, which are considered leading when deploying new technologies.

Legal basis

The European AI Regulation (AI Act) forms the basis of the municipal policy on the responsible development and deployment of AI and algorithms. It aims to ensure that AI systems are safe, work transparently and respect people's fundamental rights. The regulation distinguishes between different risk levels of AI systems: the higher the risk, the stricter the rules. For generative AI such as Microsoft 365 CoPilot, the obligations mainly concern transparency and human oversight.


Besides the AI Regulation, municipal policy rests on existing information security and privacy legislation, such as the General Data Protection Regulation (GDPR). These laws ensure that personal data is properly protected and that employees handle data with care.

Links to legal bases

AI Act: https://eur-lex.europa.eu/legal-content/NL/TXT/PDF/?uri=OJ:L_202401689

Operations

Data

Field not filled in.

External provider

Microsoft

Similar algorithm descriptions

  • Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, with which employees with a municipal Microsoft account can ask questions, generate texts and images, and create translations and summaries.

    Last change on 30th of September 2025, at 12:24 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, with which provincial employees with a provincial Microsoft account can ask questions, generate texts and images, and create translations and summaries.

    Last change on 15th of April 2025, at 9:56 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA, IAMA
    Status
    In use
  • Employees can use Microsoft 365 CoPilot as an AI chatbot to generate texts and get comprehensive answers to questions. Some principles described in our guidelines are that no internal or confidential information should be used and that the information should be properly checked for accuracy.

    Last change on 27th of March 2025, at 13:24 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA, IAMA
    Status
    In use
  • Employees may use Microsoft 365 CoPilot as a smart chatbot. With it, they can have texts created and get detailed answers to questions. Our rules state that employees must not use secret or confidential information, and that they must always check carefully whether the information is correct.

    Last change on 16th of September 2025, at 10:32 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Employees can use Microsoft 365 Copilot: Copilot Chat for questions, texts and summaries. Agents perform tasks such as taking notes or starting workflows. Controls regulate data access, privacy and features by user group.

    Last change on 3rd of November 2025, at 7:27 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    IAMA, DPIA
    Status
    In use