Netherlands: End dangerous mass surveillance policing experiments

Police in the Netherlands must immediately stop using algorithmic systems that result in indiscriminate mass surveillance and ethnic profiling, Amnesty International said in a report exposing the threat that “predictive policing” poses to human rights.

The report, We Sense Trouble, documents the dangers of emerging “predictive policing” projects that are being rolled out by law enforcement agencies across the Netherlands. The projects, branded “living labs” by Dutch police, use mathematical models to assess the risk that a crime will be committed by a certain person or at a certain location, with law enforcement efforts then directed towards those individuals or locations deemed “high risk”.  

Amnesty International investigated a predictive policing project in the city of Roermond, called the Sensing Project. This policing experiment treats people in Roermond as “guinea pigs” under mass surveillance and discriminates against people with Eastern European nationalities.

“What until recently was the preserve of science fiction is now a reality for people across the Netherlands. Predictive policing subjects people to indiscriminate mass surveillance, which can never be justified,” said Merel Koning, Senior Policy Officer, Technology and Human Rights at Amnesty International.

“The problematic Roermond experiment, which profiles and discriminates against people from Eastern Europe, exposes how algorithmic policing systems are prejudicial, not predictive. While such projects are rapidly proliferating across the country, the safeguards required to address the multitude of human rights threats they pose are sorely lacking. The Dutch parliament must act immediately to end the use of these fundamentally flawed systems.”

Predictive policing systems are often touted as “objective” and “neutral”, but prejudices and stereotypes are embedded in their design, models and algorithms. This produces discriminatory results, with certain groups consistently assigned higher risk scores.
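To make that mechanism concrete, here is a minimal, entirely hypothetical sketch in Python; the numbers, group labels and scoring logic are invented for illustration and do not come from any actual police system. It shows how a model that weights recorded offences (the supposedly objective crime data) inherits past over-policing: if one group has historically been stopped three times as often, it accumulates three times as many recorded offences even when its offence rate per stop is identical.

```python
# Hypothetical illustration of feedback bias in "objective" crime data.
# All numbers and group labels are invented; this is not any real system.
from collections import Counter

# Synthetic history: both groups offend at the same rate per stop (10%),
# but group B was stopped three times as often as group A.
records = (
    [("A", True)] * 10 + [("A", False)] * 90      # 100 stops of group A
    + [("B", True)] * 30 + [("B", False)] * 270   # 300 stops of group B
)

stops = Counter(group for group, _ in records)
hits = Counter(group for group, hit in records if hit)

for group in ("A", "B"):
    # Recorded offences differ threefold purely because of past exposure;
    # a model weighting this count will score group B as "higher risk".
    print(group,
          "recorded offences:", hits[group],
          "hit rate per stop:", hits[group] / stops[group])
```

A risk model trained on such records reproduces the skew in who was watched, not a difference in who offends.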

Amnesty International is calling for a mandatory human rights impact assessment before the use of predictive policing technologies. To date, none of the systems in use by Dutch police have been subjected to a comprehensive human rights evaluation.

The Sensing Project

Police claim the Sensing Project is designed to prevent and detect property crime committed by so-called “mobile bandits” in Roermond.

The Dutch authorities claim the system is neutral and based on objective crime data. However, Amnesty International documents how the Sensing Project is discriminatory in its very design, reflecting built-in human biases in policing. The fact that the project focuses predominantly on “mobile banditry”, defined as pickpocketing and shoplifting committed specifically by people from Eastern European countries, results in automated ethnic profiling.

Using cameras and other sensors, the police systematically monitor everyone driving in and around Roermond, collecting information about vehicles and movement patterns. The collected data is then processed by an algorithmic model that calculates a “risk score” for each vehicle, which the police believe indicates the likelihood that the vehicle’s driver and passengers will commit a property crime in the city. One of the indicators used to make this assessment is whether the people in a vehicle are from Eastern Europe.
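The full internals of the Sensing Project’s model have not been made public, but the structural problem can be shown with a deliberately simplified, hypothetical scoring function; every field name, weight and threshold below is invented for illustration. Once an Eastern European indicator enters the score at all, two vehicles behaving identically are classified differently on the basis of their occupants’ origin.

```python
# Hypothetical sketch of a vehicle risk score; the field names, weights
# and interception threshold are all invented, not the actual model.
HIGH_RISK_THRESHOLD = 0.6  # invented cut-off above which a vehicle is stopped

def risk_score(vehicle: dict) -> float:
    score = 0.0
    if vehicle.get("linked_to_eastern_europe"):
        score += 0.5  # this single term turns the score into ethnic profiling
    if vehicle.get("route_deviates_from_commuter_pattern"):
        score += 0.3
    if vehicle.get("repeat_sightings_near_shops"):
        score += 0.2
    return score

# Two vehicles with identical observed behaviour:
behaviour = {"route_deviates_from_commuter_pattern": True}
for linked in (True, False):
    s = risk_score({**behaviour, "linked_to_eastern_europe": linked})
    print("linked to Eastern Europe:", linked,
          "score:", s, "intercepted:", s > HIGH_RISK_THRESHOLD)
```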

When a vehicle is identified as high-risk, the police will attempt to intercept it and check the identification papers of the driver and any passengers. Dutch law lacks adequate legal safeguards to prevent arbitrary and discriminatory stops and searches.

“The residents of Roermond, as well as anybody who travels to the city, are effectively being used as guinea pigs in an experiment to which they have not consented. This is an inherently discriminatory system, designed to racially profile and target people of Eastern European nationality,” said Merel Koning.

“The Dutch authorities must halt the Sensing Project and similar experiments, which are in clear violation of the right to privacy, the right to data protection, and the principles of legality and non-discrimination.”

Amnesty International is also calling on the Dutch authorities to establish how many people have been affected by the Sensing Project and other comparable experimental predictive policing projects, and in what ways. This information should be made public, and steps should be taken to provide effective remedy and redress for affected individuals.