Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.
Microsoft 365 Copilot Chat
- Publication category
- Other algorithms
- Impact assessment
- DPIA
- Status
- In use
General information
Theme
Begin date
Contact information
Responsible use
Goal and impact
Employees can use Copilot Chat for many different tasks. For example, you can ask the tool questions, have it create texts or images, and translate or summarise texts. This helps to make daily work faster and easier. By using Copilot Chat, you will also learn more about how artificial intelligence (AI) works. However, it is important that employees pay close attention to the risks of these AI tools.
The municipality sees the benefits of AI and is following developments closely. We are keen to use this new technology, but it must be done safely and sensibly. That is why the municipality has made clear ground rules. These rules tell you how to use Copilot Chat properly during your work.
Considerations
At Hollands Kroon, we advise employees to use only Microsoft 365 Copilot Chat for their work. Other AI tools are discouraged.
The municipality chose this tool for the following reasons:
- Security: The information you type into Copilot Chat remains within the municipality's secure environment (tenant).
- Privacy: With other (free) AI tools, the municipality does not know what happens to your information or where it is stored.
By using only Copilot Chat, we keep our data safe and private.
Human intervention
You should always check the answers Microsoft 365 Copilot Chat gives carefully before using them in your work. It is important that you keep thinking critically about what the tool produces.
Copilot is a very useful tool for writing texts or gathering information. Nevertheless, human review is always necessary. Only you can determine whether the text is correct, whether the information is relevant and whether the quality is good enough. That is why the ground rules contain clear agreements on this mandatory check.
Risk management
AI policies form the basis for responsible handling of generative AI within the organisation. These frameworks provide guidance for identifying and managing risks associated with the use of AI technologies.
To provide concrete support to employees in the daily use of generative AI, additional ground rules have been drawn up. These ground rules provide clarity on what is and is not allowed, and help make responsible choices in line with organisational objectives and legal obligations.
In addition, structural investments are made in:
- Internal awareness: Through messages on the internal communication platform and practical examples, employees are actively engaged in the responsible use of AI.
- Training and education: Training is offered aimed at responsible use (AI Literacy), so that employees understand how AI works, the risks involved, and how to use this technology safely and effectively.
- Mandatory training: Specific Copilot user training is required before employees may use Copilot.
Legal basis
The European AI Regulation forms the basis of municipal policy on the responsible development and deployment of AI and algorithms.
Links to legal bases
Impact assessment
Operations
Data
The existing AI policy is the starting point for the data to be used for input (prompts). In addition, guidelines for responsible use of generative AI have been drawn up to provide clarity to employees on what is and is not possible.
Technical design
Copilot Chat uses language models from OpenAI hosted as a service in a Microsoft Azure environment. The language models are trained on huge amounts of text data, which enables them to generate and understand human-like texts. Copilot can invoke these language models and use an internet search to answer questions, write texts and provide relevant information.
Using Microsoft Graph, Copilot can aggregate, display and process information from the organisation's own sources.
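To illustrate the kind of grounding described above, the sketch below shows how an application can query Microsoft's public Graph `search/query` endpoint, the same API surface through which organisational data can be retrieved. This is not the municipality's implementation: the query text is a made-up example, and a real call additionally requires an OAuth 2.0 bearer token, which is omitted here.

```python
# Minimal sketch of a Microsoft Graph search request body.
# The endpoint and payload shape follow the public Graph v1.0
# search API; the query string is an illustrative placeholder.

GRAPH_SEARCH_ENDPOINT = "https://graph.microsoft.com/v1.0/search/query"


def build_search_request(query: str, entity_types: list[str], size: int = 5) -> dict:
    """Build the JSON body for a POST to the Graph search/query endpoint."""
    return {
        "requests": [
            {
                "entityTypes": entity_types,        # e.g. "driveItem", "message", "site"
                "query": {"queryString": query},    # free-text search within the tenant
                "size": size,                       # maximum number of hits to return
            }
        ]
    }


if __name__ == "__main__":
    body = build_search_request("vacation policy", ["driveItem"])
    print(GRAPH_SEARCH_ENDPOINT)
    print(body["requests"][0]["query"]["queryString"])
```

Because the request is scoped to the tenant and authorised per user, results are limited to data the signed-in employee is already allowed to see, which is the mechanism behind the security consideration mentioned earlier.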
External provider
Similar algorithm descriptions
- Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, where employees with a Microsoft account from the municipality can ask questions, generate texts and images, and create translations and summaries.
  Last change on 30th of September 2025, at 12:24 (CET) | Publication Standard 1.0
- Publication category
- Impactful algorithms
- Impact assessment
- Field not filled in.
- Status
- In use
- Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, where county employees with a county Microsoft account can ask questions, generate texts and images, and create translations and summaries.
  Last change on 15th of April 2025, at 9:56 (CET) | Publication Standard 1.0
- Publication category
- Other algorithms
- Impact assessment
- DPIA, IAMA
- Status
- In use
- Employees can use Microsoft 365 Copilot: Copilot Chat for questions, texts and summaries. Agents perform tasks such as taking notes or starting workflows. Controls regulate data access, privacy and features by user group.
  Last change on 3rd of November 2025, at 7:27 (CET) | Publication Standard 1.0
- Publication category
- Impactful algorithms
- Impact assessment
- IAMA, DPIA
- Status
- In use
- Employees can use Microsoft 365 CoPilot as an AI chatbot to generate texts and get comprehensive answers to questions. Some principles described in our guidelines are that no internal or confidential information should be used and that the information should be properly checked for accuracy.
  Last change on 27th of March 2025, at 13:24 (CET) | Publication Standard 1.0
- Publication category
- Other algorithms
- Impact assessment
- DPIA, IAMA
- Status
- In use
- Employees may use Microsoft 365 CoPilot as a smart chatbot. With it, they can have texts created and get detailed answers to questions. Our rules state that employees must not use secret or confidential information. Also, employees must always check carefully whether the information is correct.
  Last change on 16th of September 2025, at 10:32 (CET) | Publication Standard 1.0
- Publication category
- Other algorithms
- Impact assessment
- Field not filled in.
- Status
- In use