Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Bodycam

Integrated system of portable bodycams, docking stations and software for managing video images. The bodycam is worn on the body and captures video, audio and often GPS information during enforcement work.

Last change on 16th of March 2026, at 12:19 (CET) | Publication Standard 1.0
Publication category
High-Risk AI-system
Impact assessment
DPIA
Status
In use

General information

Theme

Public Order and Safety

Begin date

01-2026

Contact information

algoritme_privacy@debilt.nl

Link to publication website

https://debilt.nl/bestuur-en-organisatie/organisatie/privacy-en-algoritmes/algoritmeregister

Responsible use

Goal and impact

Purpose of the solution

The bodycam solution is designed to record interactions and incidents in the workplace using portable cameras and associated software. The system records image and audio recordings that are securely stored and managed via a central platform. The aim is to support transparency, safety and objective recording of events for organisations such as police, enforcement, security and healthcare institutions.

Impact of the solution

The deployment of bodycams has several organisational and societal impacts:

  1. Increased safety: employees often feel safer because incidents are recorded.
  2. De-escalation of situations: visible bodycams can reduce aggressive behaviour.
  3. Better evidence: recorded images can be used for reporting, investigation or legal proceedings.
  4. Increased transparency and accountability: organisations can better prove what happened during an incident.
  5. Privacy impact: the system may process personal data (images and audio of citizens), requiring careful handling of data and compliance with regulations such as the GDPR (in Dutch: AVG).


Considerations

Advantages

1. Faster analysis of video footage

AI can automatically identify relevant moments in large amounts of video footage, allowing incidents to be retrieved and investigated faster.

2. More efficient management of data

Algorithms can automatically classify footage (e.g. by time, location or event), making searching and reporting easier.
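As an illustration, metadata-based classification could work roughly as follows. This is a hypothetical sketch: the clip fields, tag names and locations below are illustrative and do not reflect the ZEPCAM platform's actual data model.

```python
from dataclasses import dataclass
from datetime import datetime, time

# Hypothetical clip metadata; field names are illustrative, not ZEPCAM's schema.
@dataclass
class Clip:
    clip_id: str
    recorded_at: datetime
    location: str          # e.g. a district name or GPS-derived zone
    tags: list

def classify_clip(clip: Clip) -> Clip:
    """Attach simple searchable tags based on time and location."""
    # A time-of-day bucket makes e.g. "night-shift incidents" searchable.
    if time(22, 0) <= clip.recorded_at.time() or clip.recorded_at.time() < time(6, 0):
        clip.tags.append("night")
    else:
        clip.tags.append("day")
    clip.tags.append(f"loc:{clip.location.lower()}")
    return clip

clips = [
    Clip("c1", datetime(2026, 1, 10, 23, 15), "Centrum", []),
    Clip("c2", datetime(2026, 1, 11, 14, 5), "Maartensdijk", []),
]

# Invert the tags into a search index: tag -> clip ids.
index = {}
for clip in map(classify_clip, clips):
    for tag in clip.tags:
        index.setdefault(tag, []).append(clip.clip_id)

print(index["night"])  # → ['c1']
```

A real system would derive such tags from device metadata (timestamp, GPS) at upload time, so operators can filter footage without viewing it manually.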

3. Supporting security and situational awareness

During live streaming, AI can help alert operators more quickly to potential incidents or abnormal situations.
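One simple way such alerting can be structured is to require a sustained run of high anomaly scores before notifying an operator, which suppresses one-off false alarms. The sketch below is illustrative; the per-frame scores are assumed to come from an upstream video model that is not shown, and the threshold values are arbitrary.

```python
# Hypothetical sketch: alert an operator when a per-frame anomaly score
# (assumed to come from an upstream video model, not shown) stays above a
# threshold for several consecutive frames.

def alert_frames(scores, threshold=0.8, min_run=3):
    """Return the frame indices at which a sustained anomaly run begins."""
    alerts, run_start = [], None
    for i, score in enumerate(scores):
        if score >= threshold:
            if run_start is None:
                run_start = i
            if i - run_start + 1 == min_run:
                alerts.append(run_start)   # fire once per sustained run
        else:
            run_start = None               # run broken; reset
    return alerts

scores = [0.1, 0.9, 0.2, 0.85, 0.9, 0.95, 0.9, 0.3]
print(alert_frames(scores))  # → [3]
```

The single high score at index 1 does not trigger an alert; only the sustained run starting at index 3 does. In practice, a human operator would still verify each alert before acting on it.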

4. Privacy protection through automation

AI can be used to automatically blur faces or license plates before images are shared or stored.
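As a minimal sketch of the anonymisation step, a detected face region can be pixelated by replacing it with block averages. The bounding box is assumed to come from an upstream face detector (not shown), and the grayscale frame representation is purely illustrative.

```python
# Hypothetical sketch: pixelate a detected face region before footage is
# shared. The frame is a 2D grid of grayscale values; the (x, y, w, h)
# bounding box is assumed to come from an upstream face detector.

def pixelate_region(frame, x, y, w, h, block=2):
    """Replace the region at column x, row y of size w x h with per-block
    averages, making it unrecognisable while leaving the rest intact."""
    for by in range(y, y + h, block):
        for bx in range(x, x + w, block):
            cells = [(r, c)
                     for r in range(by, min(by + block, y + h))
                     for c in range(bx, min(bx + block, x + w))]
            avg = sum(frame[r][c] for r, c in cells) // len(cells)
            for r, c in cells:
                frame[r][c] = avg
    return frame

frame = [[10 * r + c for c in range(6)] for r in range(6)]
pixelate_region(frame, 2, 2, 2, 2)  # blur a 2x2 "face" box at row 2, col 2
```

Because the averaging is irreversible, blurring before storage or sharing reduces the privacy impact of any later disclosure; a production system would apply the same idea per frame of video.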

Disadvantages / risks

1. Privacy and data protection

Bodycams often record personal data. AI analysis can increase the privacy impact and therefore requires compliance with regulations such as the GDPR (AVG).

2. Errors or bias in algorithms

AI systems can make wrong interpretations or unintentionally disadvantage certain groups if the algorithms are not properly trained.

3. Transparency and explainability

It can be difficult to understand exactly how an algorithm arrives at a particular analysis or classification.

4. Legal and ethical risks

When AI analysis affects enforcement or decision-making, the system could potentially fall under stricter rules such as those in the European AI Act.

5. Dependence on technology

Organisations may become dependent on software and infrastructure for analysis and storage of video footage.

Human intervention

The algorithms provide supporting analytics, but human operators review the results, make the final decisions and correct any errors.

Risk management

Risks are managed through prior risk analysis, human control, privacy and security measures, transparent logging and continuous monitoring of the system.
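Transparent logging can be made tamper-evident, for example by chaining each audit entry to the previous one with a hash, so that any later modification of the log is detectable. This is an illustrative sketch, not a description of the system's actual logging implementation.

```python
import hashlib
import json

# Illustrative sketch of a tamper-evident audit log: each entry stores the
# hash of the previous entry, so editing any entry breaks the chain.

GENESIS = "0" * 64  # placeholder hash before the first entry

def append_entry(log, event: dict) -> None:
    prev_hash = log[-1]["hash"] if log else GENESIS
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    log.append({"event": event, "prev": prev_hash,
                "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify_chain(log) -> bool:
    """Recompute every hash; any edit to an earlier entry is detected."""
    prev_hash = GENESIS
    for entry in log:
        payload = json.dumps({"event": entry["event"], "prev": prev_hash},
                             sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"action": "view", "clip": "c1", "user": "reviewer-01"})
append_entry(log, {"action": "export", "clip": "c1", "user": "reviewer-01"})
assert verify_chain(log)
log[0]["event"]["action"] = "delete"  # tampering is now detectable
assert not verify_chain(log)
```

Logging who viewed or exported which recording, in a form that cannot be silently altered, supports both the human-control and continuous-monitoring measures described above.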

Legal basis

The legal basis for this application lies mainly in the GDPR (AVG), supplemented by sector-specific or national legislation (such as police legislation) and, where AI is applied, the requirements of the European AI Act.

Links to legal bases

  • AVG Implementation Act: https://wetten.overheid.nl/BWBR0040940/
  • General Administrative Law Act: https://wetten.overheid.nl/BWBR0005537
  • Municipal Act: https://wetten.overheid.nl/BWBR0005416
  • Police Data Act (if cooperation with police): https://wetten.overheid.nl/BWBR0022463
  • European AI Act: https://eur-lex.europa.eu/legal-content/NL/TXT/HTML/?uri=OJ:L_202401689

Impact assessment

Data Protection Impact Assessment (DPIA): https://www.autoriteitpersoonsgegevens.nl/onderwerpen/avg-nieuwe-europese-privacywetgeving/data-protection-impact-assessment-dpia

Operations

Data

Field not filled in.

External provider

ZEPCAM

Similar algorithm descriptions

  • Bodycam boas

    Last change on 15th of October 2024, at 8:07 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • Bodycam footage from BOAs (special enforcement officers) is made suitable for further use, with persons rendered unrecognisable.

    Last change on 17th of October 2025, at 9:15 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use
  • An expert investigation examines whether certain digital images (photos and/or videos) were taken with a particular digital camera and/or mobile phone. To answer this question, a camera identification algorithm is used to identify the source camera.

    Last change on 25th of June 2024, at 16:19 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use
  • This tool can be used to quickly search large amounts of video footage, such as that from surveillance cameras, for moving objects and certain movements. This helps to perform targeted searches through video footage, without having to look at everything manually.

    Last change on 15th of December 2025, at 16:28 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use
  • In an expert examination, facial images are compared. The facial image comparison aims to determine whether a person visible in camera images (crime suspect) and the image of a known face (police photo of a suspect) are of the same person or two different people.

    Last change on 25th of June 2024, at 16:15 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use