
Xenophobic machines: Discrimination through unregulated use of algorithms in the Dutch childcare benefits scandal

Index Number: EUR 35/4686/2021

Social security enforcement agencies worldwide are increasingly automating their processes in the hope of detecting fraud. The Netherlands is at the forefront of this development. The Dutch tax authorities adopted an algorithmic decision-making system to create risk profiles of individuals applying for childcare benefits, aiming to detect inaccurate and potentially fraudulent applications at an early stage. Nationality was one of the risk factors the tax authorities used to assess the likelihood of inaccuracy and/or fraud in submitted applications. This report shows how the use of applicants’ nationality as a risk factor resulted in discrimination and racial profiling.
