Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Copilot

Internally, Microsoft Copilot is used. It is integrated into applications such as Outlook, Word and Excel and is made available to employees on demand. Employees use it mainly for summarising or producing texts.

Last change on 3rd of November 2025, at 13:20 (CET) | Publication Standard 1.0
Publication category
Other algorithms
Impact assessment
Field not filled in.
Status
In use

General information

Theme

Organisation and business operations

Begin date

2025-02

Contact information

info@odh.nl

Link to publication website

https://omgevingsdiensthaaglanden.nl/

Responsible use

Goal and impact

Omgevingsdienst Haaglanden uses Copilot Chat as a generative AI assistant to support employees in a variety of cognitive activities within the office environment. The aim is to enhance creativity, efficiency and digital skills, for example by helping employees generate ideas, structure information and draft texts.

Copilot Chat operates solely on the basis of user input and does not access internal data or systems within the Microsoft 365 environment of Omgevingsdienst Haaglanden. No automated decisions are made and the application is not deployed for processes that directly affect residents, businesses or policy-making.

The impact of its use is limited to personal support for employees. To ensure that Copilot Chat is used carefully and responsibly, Omgevingsdienst Haaglanden has established a number of principles that guide its use within the organisation.

Considerations

Omgevingsdienst Haaglanden chose to deploy Copilot Chat as a generative AI application because of its ability to support employees in cognitive work, such as organising information, exploring ideas and drafting texts.

In making this choice, conscious consideration was given to the limited impact on processes and data, the lack of automated decision-making and the guarantee that the use of Copilot Chat is advisory and supportive only. Microsoft's AI assistant was also explicitly chosen instead of solutions from OpenAI or Google, for example. This choice is in line with the fact that Omgevingsdienst Haaglanden has already set up its cloud environment, storage and collaboration services within the Microsoft 365 platform.

Human intervention

Human intervention is an essential principle in the use of Copilot Chat within Omgevingsdienst Haaglanden. Copilot functions solely as an advisory tool that supports employees in carrying out their work.

The final responsibility for the content, interpretation and application of the generated output always remains with the employee. This ensures that Copilot itself does not make any decisions and that human control is maintained at all times.

In addition, employees are informed that (future) publications and external communication must always take the applicable transparency requirements into account.

Risk management

Omgevingsdienst Haaglanden takes a risk-based approach to the use of AI applications. For Copilot Chat, risks have been identified such as the unintentional sharing of confidential information, misinterpretation of generated output and over-reliance on the tool.

These risks are mitigated through clear guidelines, awareness-raising among employees and support from the CISO and privacy officer. New users must first complete a short training course covering the risks of using Copilot.

The established principles for the use of Copilot Chat form an important part of this risk management and contribute to responsible and secure use within the organisation.

Legal basis

When using Copilot Chat within Omgevingsdienst Haaglanden, no personal data are processed and no automated decision-making takes place. Its use is therefore in line with the General Data Protection Regulation (GDPR), provided employees comply with internal guidelines.

These guidelines prescribe, among other things, that no confidential or sensitive information is shared with AI applications, so that privacy and information security are guaranteed at all times.

Operations

Data

Copilot within Omgevingsdienst Haaglanden works on the basis of the data available within the organisation's Microsoft 365 environment. This means that Copilot has access to information sources with which it is integrated, such as e-mail, documents and Teams environments, insofar as the user has access to them.

Data processing takes place solely to support employees in their work. No data is shared or stored outside the Microsoft 365 environment of Omgevingsdienst Haaglanden.

Employees remain responsible for the careful handling of confidential and privacy-sensitive information and follow the established internal guidelines and principles for the safe and responsible use of AI applications.
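
As an illustration of this permission-trimmed access, the sketch below shows a delegated Microsoft Graph call in Python: a request made with a signed-in user's token only returns items that user is already allowed to see. This is a minimal, hypothetical example of the underlying access model, not code used by Omgevingsdienst Haaglanden or by Copilot itself; the token acquisition step is assumed and not shown.

```python
import requests

# Hypothetical delegated access token, obtained through the organisation's
# normal Entra ID sign-in flow (e.g. via MSAL); acquisition is not shown here.
ACCESS_TOKEN = "<delegated-user-token>"

# A delegated Microsoft Graph request returns only the items the signed-in
# user is already authorised to see, mirroring the principle that Copilot
# can only reach sources the employee has access to.
response = requests.get(
    "https://graph.microsoft.com/v1.0/me/drive/root/children",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

# List the names of the files the user can see in their OneDrive root.
for item in response.json().get("value", []):
    print(item["name"])
```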

Technical design

Copilot uses large language models (LLMs) from OpenAI, provided as a service through Microsoft Azure. These models are trained on large amounts of text data and can therefore understand and generate natural language. Within the Microsoft 365 environment, Copilot can invoke these models to support users in answering questions, writing text and structuring information. Processing takes place within Microsoft's secure cloud infrastructure, in line with applicable privacy and security standards. More information on the architecture and operation can be found at: Microsoft Copilot Architecture and Azure OpenAI Service.
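
For illustration, the sketch below shows the kind of chat-completion request that the Azure OpenAI Service exposes and on which Copilot builds. The endpoint, key, API version and deployment name are placeholders chosen for this example; this is not the configuration or implementation used by Microsoft or Omgevingsdienst Haaglanden.

```python
from openai import AzureOpenAI

# Placeholder endpoint, key, API version and deployment name; an actual
# Azure OpenAI resource would supply its own values for these.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<api-key>",
    api_version="2024-02-01",
)

# Ask the hosted LLM to summarise a piece of text, similar in spirit to
# how Copilot is used for summarising and drafting within Microsoft 365.
completion = client.chat.completions.create(
    model="<deployment-name>",  # name of the model deployment in Azure
    messages=[
        {"role": "system", "content": "You are an assistant that summarises text."},
        {"role": "user", "content": "Summarise the following meeting notes: ..."},
    ],
)

print(completion.choices[0].message.content)
```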

External provider

Microsoft

Similar algorithm descriptions

  • Employees may use Microsoft 365 CoPilot as a smart chatbot. They can use it to generate texts and get detailed answers to questions. Our rules state that employees must not use secret or confidential information and must always check carefully whether the information is correct.

    Last change on 16th of September 2025, at 10:32 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Employees can use Microsoft CoPilot to generate texts and images or to get answers to questions. Neither confidential information nor personal data may be used in the process. An important condition is that the information is reliable and checked for accuracy.

    Last change on 7th of April 2025, at 13:51 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Employees can use Microsoft 365 CoPilot as an AI chatbot to generate texts and get comprehensive answers to questions. Some principles described in our guidelines are that no internal or confidential information should be used and that the information should be properly checked for accuracy.

    Last change on 27th of March 2025, at 13:24 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA, IAMA
    Status
    In use
  • Employees can use Microsoft 365 Copilot: Copilot Chat for questions, texts and summaries. Agents perform tasks such as taking notes or starting workflows. Controls regulate data access, privacy and features by user group.

    Last change on 3rd of November 2025, at 7:27 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    IAMA, DPIA
    Status
    In use
  • Employees can use Microsoft 365 Copilot Chat (hereafter Copilot Chat). Copilot Chat is a generative AI tool offered by Microsoft, where employees with a Microsoft account from the municipality can ask questions, generate texts and images, and create translations and summaries.

    Last change on 30th of September 2025, at 12:24 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use