Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Microsoft 365 Copilot Chat

Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft with which province employees who have a provincial Microsoft account can ask questions, generate texts and images, and create translations and summaries.

Last change on 15th of April 2025, at 9:56 (CET) | Publication Standard 1.0
Publication category
Other algorithms
Impact assessment
DPIA, IAMA
Status
In use

General information

Theme

Organisation and business operations

Begin date

2025-04

Contact information

info@provinciegroningen.nl

Responsible use

Goal and impact

Province employees can use Copilot Chat to ask questions, generate texts and images, and create translations and summaries. Copilot Chat can make daily work more efficient. In addition, it is important for province employees to gain knowledge of and experience with this new technology. However, it is essential that they understand the risks of using generative AI tools. Guidelines on this have been established and are continuously communicated.

Considerations

The province of Groningen has taken the SLM Rijk DPIA and FRAIA into account in its risk assessments. In addition, guidelines have been established and are continuously communicated to enable responsible use. Also, Microsoft 365 Copilot is deliberately not being used at this time, so that unintended access to certain confidential information is ruled out.

Human intervention

The guidelines require that all generated content is always checked by the user and, where necessary, also by a colleague (the four-eyes principle) before being used.

Risk management

Existing policies on information security, privacy, information management, procurement and staff codes of conduct form the starting point for risk management around generative AI. Additionally, guidelines for responsible use of generative AI have been drawn up to give employees clarity on what is and is not permitted. Furthermore, internal awareness, communication and training on AI literacy are continuously addressed.

Legal basis

The AI Act (AI-Verordening) underpins this analysis and our policies and guidelines.

Links to legal bases

AI-Verordening: https://eur-lex.europa.eu/legal-content/NL/TXT/PDF/?uri=OJ:L_202401689

Elaboration on impact assessments

The SLM Rijk DPIA and FRAIA were reviewed, and the identified risks were assessed by the CDO, CISO, Privacy Officer (PO) and Data Protection Officer (FG) for relevance and applicability to the use of Copilot Chat.

Impact assessment

  • Data Protection Impact Assessment (DPIA): https://slmmicrosoftrijk.nl/wp-content/uploads/2024/12/20241217-Memo-M365-Copilot-DPIA-en-FRAIA.pdf
  • Impact Assessment Mensenrechten en Algoritmes (IAMA): https://slmmicrosoftrijk.nl/wp-content/uploads/2024/12/20241217-Memo-M365-Copilot-DPIA-en-FRAIA.pdf

Operations

Data

Existing policies on information security, privacy, information management, procurement and staff codes of conduct form the starting point for the data to be used in the input (prompts). In addition, guidelines for responsible use of generative AI have been drawn up to give employees clarity on what is and is not permitted.

Technical design

Copilot Chat uses language models from OpenAI hosted as a service in a Microsoft Azure environment. The language models are trained on huge amounts of text data, which enables them to generate and understand human-like texts. Copilot can invoke these language models and use an internet search to answer questions, write texts and provide relevant information.
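
The "search-then-generate" pattern described above can be sketched in a few lines. This is an illustrative simplification, not Microsoft's implementation: the `web_search` and `language_model` functions are stubs standing in for a real web-search API and a hosted OpenAI model in Azure.

```python
# Illustrative sketch of the search-then-generate pattern (assumed,
# not Microsoft's actual implementation). Both helpers are stubs.

def web_search(query):
    # Stub: a real system would query a web search engine here.
    return ["Groningen is a province in the north of the Netherlands."]

def language_model(prompt):
    # Stub: a real system would call a hosted chat model here.
    return f"Answer based on {prompt.count('SOURCE')} source(s)."

def answer(question):
    # 1. Retrieve context snippets from the web.
    snippets = web_search(question)
    # 2. Ground the prompt in the retrieved snippets.
    context = "\n".join(f"SOURCE: {s}" for s in snippets)
    prompt = f"{context}\n\nQUESTION: {question}"
    # 3. Let the language model generate the final, grounded answer.
    return language_model(prompt)

print(answer("Where is Groningen?"))
```

Grounding the prompt in retrieved sources is what lets the model answer with information more recent than its training data; the user-side check required by the guidelines remains necessary because the model can still misread or misstate those sources.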

External provider

Microsoft

Link to code base

Not available

Similar algorithm descriptions

  • Employees can use Microsoft 365 Copilot as an AI chatbot to generate texts and get comprehensive answers to questions. Principles described in our guidelines include that no internal or confidential information should be used and that the output should be properly checked for accuracy.

    Last change on 27th of March 2025, at 13:24 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA, IAMA
    Status
    In use
  • Employees can use Microsoft Copilot to generate texts and images or to get answers to questions. Neither confidential information nor personal data may be used in the process. An important condition is that the information is reliable and checked for accuracy.

    Last change on 7th of April 2025, at 13:51 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • A group of employees is experimenting with Microsoft 365 Copilot to gain hands-on experience with generative AI. This pilot aims to gain knowledge and experience and explore, through case studies, how it can help with overview, analysis, speed and cost savings.

    Last change on 17th of March 2025, at 17:20 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA
    Status
    In development
  • A group of employees is experimenting with Microsoft 365 Copilot to gain hands-on experience with generative AI. This pilot aims to gain knowledge and experience and explore, through case studies, how it can help with overview, analysis, speed and cost savings.

    Last change on 17th of April 2025, at 14:39 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA
    Status
    In development
  • To improve internal efficiency, a test project has been set up to see if Microsoft CoPilot can help employees perform day-to-day tasks.

    Last change on 20th of January 2025, at 10:12 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In development