Predictive Policing: Repeating history through algorithms

Certain groups, such as racialised people, face disproportionate levels of police intervention and police violence in Europe and across the world. It is not surprising, then, that police forces’ uptake of new technologies follows the same patterns.

Many of the most high-profile discriminatory examples of facial recognition have been in the US. However, the EU has its own share of examples of how predictive policing combined with biometric analysis creates a perfect storm of unlawful police discrimination.

Worse still, automated predictive policing is often hidden behind the false claim that technology is neutral, which gives police forces an excuse to evade accountability: “The tech told me to do it!”.

Analogue predictive policing

A common justification given by governments for the over-policing of racialised people is that racialised communities are inherently more criminal. They claim that this is supported by statistics showing that racialised people are more frequently arrested and imprisoned. However, the only thing these historical statistics highlight is that racialised communities are vastly over-exposed to (often violent) police intervention, and are systematically treated more harshly and punitively by criminal justice systems. These statistics reflect the actions of police and of justice systems, not the behaviours or qualities of racialised people.

Systemic discrimination is rooted in analogue predictive policing practices: police (and wider society) making judgements and predictions about an individual based on, for instance, the colour of their skin or the community of which they are a part.

The use of new technologies by police forces makes these practices even more harmful to people’s lives, while hiding behind the false pretext of “technological objectivity”.

Automated predictive policing: WHAT is it and HOW is it used?

Automated predictive policing is the practice of applying algorithms to historical data to predict future crime. This could mean using certain group factors (such as someone’s ethnicity, skin colour, facial features, postcode, educational background or who they are friends with) to automatically predict whether they are going to commit a crime.

There is a principle sometimes referred to as “garbage in, garbage out”: if you feed an algorithm data that reflects bias and unfairness, the results you get will always be biased and unfair.

“Garbage in, garbage out” characterises how law enforcement uses automated predictive policing, for example when:

  • Deciding where to deploy extra police presence. This traps communities that have been over-policed in an inescapable loop of more and more police interventions;
  • Predicting whether people are likely to re-offend, an assessment that can influence whether someone gets parole or not. This means that a person’s liberty is decided based on discriminatory data about other people that the system thinks are similar to that person.
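The first of these feedback loops can be sketched in a few lines of Python. This is a toy simulation built on invented assumptions (two districts with an identical underlying offence rate, patrols allocated in proportion to past arrest records); it is not a model of any real police system, but it shows how biased historical data perpetuates itself:

```python
import random

random.seed(0)

# Toy simulation of the predictive-policing feedback loop.
# All numbers are invented for illustration; this is not real data.
TRUE_OFFENCE_RATE = 0.05       # identical underlying rate in both districts
TOTAL_PATROLS = 100            # patrols to allocate each year
recorded = {"A": 60, "B": 20}  # biased records: district A was over-policed

for year in range(10):
    total = recorded["A"] + recorded["B"]
    for district in recorded:
        # "Predictive" step: patrols are allocated in proportion to past records.
        patrols = round(TOTAL_PATROLS * recorded[district] / total)
        # More patrols means more offences are observed and recorded,
        # even though the true rate is the same in both districts.
        observed = sum(random.random() < TRUE_OFFENCE_RATE
                       for _ in range(patrols * 10))
        recorded[district] += observed

share_A = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"District A's share of recorded arrests after 10 years: {share_A:.0%}")
```

Even though both districts offend at exactly the same rate, district A’s share of recorded arrests stays locked near its biased starting point: the algorithm’s “predictions” simply confirm and entrench the historical over-policing fed into it.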


Having certainties in life can be comforting for all of us. However, when the police and the criminal justice system try to predict crime, it is not possible to know with any certainty how someone will act in the future. Trying to do so only reinforces and intensifies historical patterns of injustice and deepens societal inequalities. Introducing algorithmic predictions into policing will only make the poor poorer, push the excluded further out of society, and expose those already suffering from discrimination to even more of it.

As unique humans with free will, self-determination and the power to change our life paths, we have the right to be treated fairly, not beaten down by an (automated) justice system.


In the Netherlands, “smart” devices have sprayed the scent of oranges at people that the biometric algorithm thinks are displaying aggressive behaviour. Given the biases and discriminatory assumptions baked into such tech, it is likely that such technologies will disproportionately be used against racialised people. Being followed by the smell of oranges might not seem so bad – but this tech is also being used in the Netherlands to trigger the deployment of an emergency police vehicle responding to what the algorithm predicts is a violent incident (p. 92).

In Sweden, the police were fined for using unlawful facial recognition systems, and were particularly criticised for failing to undertake any assessment of how it might infringe on people’s rights to data protection and related rights, such as equality.

In the Italian city of Como, authorities deployed biometric surveillance systems to identify ‘loitering’ in a park in which stranded migrants were forced to sleep after being stopped at the Swiss-Italian border.

A Spanish biometric mass surveillance company called Herta Security – which has received funding from the EU – developed facial recognition technology which they say can profile people’s ethnicity. When we challenged them about this being unlawful, they said it isn’t a problem because they would only sell that part of their tech to non-EU countries.


Learn more about the different ways that biometric mass surveillance affects us, and SIGN the ECI!

European Citizens' Initiative

Reclaim your Face

ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now ARTICLE19 Bits of Freedom CCC Defesa dos Direitos Digitais (D3) Digitalcourage Digitale Gesellschaft CH Digitale Gesellschaft DE Državljan D EDRi Electronic Frontier Finland Hermes Center for Transparency and Digital Human Rights Homo Digitalis IT-Political Association of Denmark IuRe La Quadrature du Net Liberties Metamorphosis Foundation Panoptykon Foundation Privacy International SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch AlgorithmWatch/CH All Out Amnesty International Anna Elbe Aquilenet Associazione Luca Coscioni Ban Facial Recognition Europe Big Brother Watch Certi Diritti Chaos Computer Club Lëtzebuerg (C3L) CILD D64 Danes je nov dan Datapanik Digitale Freiheit DPO Innovation Electronic Frontier Norway European Center for Not-for-profit Law (ECNL) European Digital Society Eumans Football Supporters Europe Fundación Secretariado Gitano (FSG) Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung Germanwatch German acm chapter Gesellschaft Fur Informatik (German Informatics Society) GONG Hellenic Association of Data Protection and Privacy Hellenic League for Human Rights info.nodes Irish Council for Civil Liberties JEF, Young European Federalists Kameras Stoppen Ligue des droits de L'Homme (FR) Ligue des Droits Humains (BE) LOAD e.V. Ministry of Privacy Privacy Lx Privacy Network Projetto Winston Smith Reporters United Saplinq Science for Democracy Selbstbestimmt.Digital STRALI Stop Wapenhandel The Good Lobby Italia UNI-Europa Unsurv Vrijbit Wikimedia FR Xnet

Reclaim Your Face is also supported by:

Jusos Piratenpartei DE Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left

Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.