Please note: The algorithm descriptions in English have been automatically translated. Errors may have been introduced in this process. For the original descriptions, go to the Dutch version of the Algorithm Register.

Crowd measurement - blurring of the image

Algorithm on the Scheveningen promenade that blurs the camera clip (after people are counted and group dynamics are determined).

Last change on 5th of July 2024, at 8:04 (CET) | Publication Standard 1.0
Publication category
Other algorithms
Impact assessment
DPIA
Status
Out of use

General information

Theme

Public Order and Safety

Begin date

Field not filled in.

Contact information

datashop@denhaag.nl

Responsible use

Goal and impact

This was a pilot that has ended. The goal was efficient enforcement in crowded areas. The algorithm renders people unidentifiable: even personal characteristics such as height and build should not be traceable.

Considerations

It was a first experiment with blurred (i.e. anonymised) image clips, as an alternative to relying solely on enforcement officers.

Human intervention

Yes, by the enforcement officer. The camera and algorithm can be stopped at any time, and the enforcement organisation can also disregard the signals.

Risk management

A DPIA has been carried out (DPIA crowd measurement and group dynamics) describing the risks and measures. There is no bias, as blurring is applied without regard to type of person. The camera locations were chosen together with the enforcement officers.

Legal basis

Municipalities Act (Gemeentewet): Section 172, maintenance of public order (camera images), and Section 151c, processing of camera images under police direction.

Elaboration on impact assessments

A DPIA has been carried out (DPIA crowd measurement and group dynamics) describing the risks and measures.

Impact assessment

Data Protection Impact Assessment (DPIA)

Operations

Data

Camera footage.

Technical design

The algorithm recognises people in the images and covers them with an overlay, such as a black block. The goal is to make individuals unrecognisable while retaining enough visual information to determine whether action is needed.
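The register does not specify the implementation. As a minimal sketch of the overlay step, assuming bounding boxes in (x, y, width, height) form come from a separate person detector (the function name and box format here are illustrative, not taken from the source):

```python
import numpy as np

def anonymise(frame: np.ndarray, boxes: list[tuple[int, int, int, int]]) -> np.ndarray:
    """Cover each detected person with a black block.

    frame: H x W x 3 image array (uint8).
    boxes: (x, y, w, h) bounding boxes produced by a person detector.
    Returns a copy; the original frame is left untouched.
    """
    out = frame.copy()
    for x, y, w, h in boxes:
        # Black overlay: the person is unidentifiable, but the blocked
        # region still shows where and how large the group is.
        out[y:y + h, x:x + w] = 0
    return out

# Example: a uniform grey 100x100 frame with one detected "person"
# at x=20, y=30, 10 pixels wide and 40 pixels tall.
frame = np.full((100, 100, 3), 128, dtype=np.uint8)
masked = anonymise(frame, [(20, 30, 10, 40)])
```

A blur (e.g. a local Gaussian filter over the same region) would serve the same purpose as the black block; the choice only affects how much scene context remains visible.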

External provider

Processing took place by the supplier Natix (Hamburg), with whom a processing agreement was signed.

Similar algorithm descriptions

  • Algorithm that counts the number of people in a camera image

    Last change on 5th of July 2024, at 8:48 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA
    Status
    In use
  • Algorithm that classifies movements in a group on camera images

    Last change on 5th of July 2024, at 8:38 (CET) | Publication Standard 1.0
    Publication category
    Other algorithms
    Impact assessment
    DPIA
    Status
    Out of use
  • In an expert examination, facial images are compared. The facial image comparison aims to determine whether a person visible in camera images (crime suspect) and the image of a known face (police photo of a suspect) are of the same person or two different people.

    Last change on 25th of June 2024, at 16:15 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use
  • An expert investigation examines whether certain digital images (photos and/or videos) were taken with a particular digital camera and/or mobile phone. To answer this question, the camera-identification algorithm is deployed, which identifies the camera.

    Last change on 25th of June 2024, at 16:19 (CET) | Publication Standard 1.0
    Publication category
    High-Risk AI-system
    Impact assessment
    Field not filled in.
    Status
    In use
  • Automated recognition of previously assessed damages to prevent previously processed damages from being processed again.

    Last change on 30th of July 2024, at 15:08 (CET) | Publication Standard 1.0
    Publication category
    Impactful algorithms
    Impact assessment
    Field not filled in.
    Status
    In use