News




EU AI Act will fail commitment to ban biometric mass surveillance

On 8 December 2023, EU lawmakers celebrated reaching a deal on the long-awaited Artificial Intelligence (AI) Act. Lead Parliamentarians reassured their colleagues that they had preserved strong protections for human rights, including ruling out biometric mass surveillance (BMS).

Yet despite the lawmakers’ bravado, the AI Act will not ban the vast majority of dangerous BMS practices. Instead, it will introduce – for the first time in the EU – conditions on how to use these systems. Members of the European Parliament (MEPs) and EU Member State ministers will vote on whether they accept the final deal in spring 2024.

The EU is making history – for the wrong reasons

The Reclaim Your Face coalition has long argued that BMS practices are error-prone and risky by design, and have no place in a democratic society. Police and public authorities already have so much information about each of us at their fingertips; they do not need to be able to identify and profile us all of the time, objectifying our faces and bodies at the push of a button.

Yet despite a strong negotiating position from the European Parliament, which called for a ban on most BMS practices, very little survived the AI Act negotiations. Under pressure from law enforcement representatives, the Parliament was cornered into accepting only weak limitations on intrusive BMS practices.

One of the few biometrics safeguards which had apparently survived the negotiations – a restriction on the use of retrospective facial recognition – has since been gutted in subsequent so-called ‘technical’ discussions.

Despite promises from the Spanish representatives in charge of the negotiations that nothing substantive would change after 8 December, this watering down of protections against retrospective facial recognition is a further letdown in our fight against a BMS society.

What’s in the deal?

Based on what we have seen of the final text, the AI Act is set to be a missed opportunity to protect civil liberties. Our rights to attend a protest, to access reproductive healthcare, or even to sit on a bench could still be jeopardised by pervasive biometric surveillance. Restrictions on the use of live and retrospective facial recognition in the AI Act will be minimal, and will not apply to private companies or administrative authorities.

We are also disappointed that when it comes to so-called ‘emotion recognition’ and biometric categorisation practices, only very limited use cases are banned in the final text, with huge loopholes.

This means that the AI Act will permit many forms of emotion recognition – such as police using AI systems to predict who is or is not telling the truth – despite these systems lacking any credible scientific basis. If adopted in this form, the AI Act will legitimise a practice that throughout history has been linked to eugenics.

Police categorising people in CCTV feeds on the basis of their skin colour is also set to be allowed. It’s hard to see how this could be permissible given that EU law prohibits discrimination – but apparently, when done by a machine, the legislators consider this to be acceptable.

Yet at least one thing had stood out for positive reasons after the final negotiation: the deal would limit post (retrospective) public facial recognition to the pursuit of serious cross-border crimes. Whilst the Reclaim Your Face campaign had called for even stronger rules on this, it would nevertheless have been a significant improvement on the current situation, in which EU member states use retrospective facial recognition with abandon.

This would have been a win for the Parliament amid so much ground given up on biometrics. Yet since the final negotiation, pressure from member state governments has forced the Parliament to agree to delete the limitation to serious cross-border crimes and to weaken the remaining safeguards [paywalled]. Now, just a vague link to the “threat” of a crime could be enough to justify the use of retrospective facial recognition in public spaces.

Reportedly, the country leading the charge to steamroller our right to be protected from abuses of our biometric data is France. Ahead of the Paris Olympics and Paralympics later this year, France has fought to preserve or expand the powers of the state to eradicate our anonymity in public spaces, and to use opaque and unreliable AI systems to claim to know what we are thinking. The other Member State governments, and the Parliament’s lead negotiators, have failed to stop them.

Under this new law, we will be guilty by algorithm until proven innocent – and the EU will have rubber-stamped biometric mass surveillance. This will give carte blanche to EU countries to roll out more surveillance of our faces and bodies, which in turn will set a chilling global precedent.


France becomes the first European country to legalise biometric surveillance

EDRi member and Reclaim Your Face partner La Quadrature du Net charts the chilling move by France to undermine human rights progress by ushering in mass algorithmic surveillance, which, in a shocking move, has been authorised by national Parliamentarians.


For three years, the EDRi network has rallied against biometric mass surveillance practices through our long-running Reclaim Your Face campaign. Comprising around eighty civil society groups and close to 100,000 European citizens and residents, our movement has rejected the constant tracking of our faces and bodies across time and place by discriminatory algorithms. We have called on lawmakers to protect us from being treated as walking barcodes, and the European Parliament is now poised to vote to ban at least some forms of biometric mass surveillance at EU level.

In contrast, EDRi member and Reclaim Your Face partner La Quadrature du Net (LQDN) charts the chilling move by France to undermine human rights progress by ushering in mass algorithmic surveillance, which, in a shocking move, has been authorised by national Parliamentarians.

Article 7 of the law on the organisation of the Olympic Games has been adopted by the French national parliament, the Assemblée Nationale, formalising the introduction of algorithmic video-surveillance into French law until December 2024. Amid the uproar over the pension reform, and following a now-routine expedited procedure, the French government succeeded in making one of the most dangerous technologies ever deployed appear acceptable. Through lies and false narratives, the government evaded the technical, political and judicial consequences of mass surveillance. Supported by MPs from the governing majority and far-right parties, algorithmic video-surveillance has been legalised on the back of lies, further undermining the democratic process.

  • The lie of biometrics: The government repeated, and wrote into the law, that algorithmic video-surveillance is not related to biometrics. This is entirely false. This technology constantly identifies, analyses and classifies bodies: physical attributes, gestures, body shapes and gait, all of which are unquestionably biometric data. LQDN has explained this (see its note and video) and tirelessly told the rapporteurs in the Sénat and the Assemblée Nationale, along with 38 other international organisations and more than 40 MEPs (Members of the European Parliament) who recently called out the French government. Despite this, the French government stood by its lies, on technical as well as legal grounds. France is once again violating EU law, cementing its title as Europe’s surveillance champion.
  • The lie of usefulness: The government used the Olympic Games as a pretext to accelerate a long-running agenda of legalising these technologies. In doing so, it followed a well-documented “tradition” of states exploiting international mega-events to pass exceptional laws. The government convinced people of the need to “spot suspicious packages” or “prevent massive crowd movements”. These scenarios suddenly became the top priority of the Ministry of the Interior and of the deputies, who reduced the security of the Olympics to these issues, even though they were rarely identified as priorities before. Moreover, these problems can be addressed through human skill rather than through these technologies, as LQDN has demonstrated in this article. The acceptance of algorithmic video-surveillance rests on a deeply rooted myth that technology will magically ensure security. In this way, these opaque technologies are deemed useful without any honest evaluation or demonstration.
  • The technical lie: Algorithmic video-surveillance’s main application is to identify behaviours pre-defined as “suspicious” by the police. Arbitrary and dangerous by design, the way these algorithms work has never been explained by the government, because it is not understood by most of those who decide. Whether out of inexcusable incompetence or deliberate diversion, the level of the parliamentary debate was in the end extremely low, and certainly not what the dramatic issues raised by these biometric technologies demanded. Helped by Guillaume Vuillemet and Sacha Houlié, both from the governing party, and some other MPs, the parliamentary debate was dominated by a rhetoric of minimisation directly inspired by surveillance companies’ marketing narratives, along with lies and technical nonsense. It clearly shows the Parliament’s inability to discuss technical questions. Moreover, society has legitimate reason to fear for the future, given how unable parliamentary representatives are to grasp the threats of emerging technologies.

As images of police brutality flood people’s screens, and as the police, armed with batons, provide the “after-sales service” of the most unpopular reforms, increasing police surveillance is part of a broader strategy to stifle any dissent.

Such tactics, which allow the State to distort the reality of its surveillance powers, must be denounced, particularly in a context where the meaning of words is deliberately twisted to make people believe that “surveillance is protection”, “security is freedom” and “democracy means forcing laws through”. It is necessary to expose and counter this sham democratic game, and to question the extraordinary powers given to the French police. There is no need to invoke a “Chinese” dystopia to grasp the height of the stakes. One need only look at France’s history and current political climate to take the measure of a twenty-year securitarian drift: more cameras, more surveillance and more databases, alongside the depoliticisation of social questions and a loss of direction among the politicians in charge. The debates on the Olympics law thus shed light on the political disorientation of decision-makers, who proved unable to question these security policies.

This first legalisation of automated video-surveillance is a winning step for the French security industry. Those who have been asking for years to test their algorithms on the public, improve them and sell the technologies worldwide are now satisfied. Soon, Thales, XXII, Two-I and Neuroo will be allowed to sell biometric software to other states, just as Idemia sold its facial recognition software to China. The startup XXII could not even wait for the law to be voted on before loudly announcing that it had raised 22 million euros to become, in its own words, “the European leader” in algorithmic video-surveillance.

The institutions charged with preserving liberties, such as the Commission nationale de l’informatique et des libertés (CNIL), are failing completely. Created in 1978 and endowed with real and effective counter-powers to scrutinise the government’s surveillance ambitions, the CNIL is now the after-sales service of government regulation, carefully helping companies to implement “good” surveillance in order to preserve the industry’s economic interests, without any consideration for collective rights and liberties.

This first legal authorisation creates a precedent and opens the gates to every other biometric surveillance technology: algorithmic audio-surveillance, facial recognition, biometric tracking, and more.

LQDN won’t give up the fight and will keep denouncing all of the government’s lies. They will be present as soon as the first experiments start, documenting the inevitable abuses these technologies lead to. They will find ways to contest them in court and will fight to ensure these experiments remain temporary. And they will keep rejecting these technologies and the technopolicing they embody, fighting at the European level to obtain their prohibition.

Left-leaning French lawmakers are planning to challenge the adoption of this bill in the country’s top constitutional court.

This was first published by La Quadrature du Net. Read it in French.

Protect My Face: Brussels residents join the fight against biometric mass surveillance

The newly-launched Protect My Face campaign gives residents of the Brussels region of Belgium the opportunity to oppose mass facial recognition. EDRi applauds this initiative which demands that the Brussels Parliament ban these intrusive and discriminatory practices.


Eight Brussels-based organisations working across human rights and anti-surveillance have come together to launch Protect My Face. This regional campaign focusing on Brussels calls for an explicit ban on facial recognition. Among the NGOs responsible for this action are two of Belgium’s leading human rights groups: EDRi member the Liga voor Mensenrechten, and our Reclaim Your Face partner the Ligue des droits humains. For many years we have worked together to call for a ban on biometric mass surveillance across Europe – a demand which now sees unprecedented support from politicians in the European Parliament.

As one of the official seats of the European Parliament, Brussels is in some ways the beating heart of democracy in Europe. Yet with almost no transparency or oversight, people around the region have been the victims of secretive, disproportionate and rights-violating uses of facial recognition for many years. Federal police subjected people to unlawful facial recognition at the Brussels Zaventem airport in 2017 and 2019. And despite a warning from the police oversight board, the federal police also carried out several searches using the controversial Clearview AI facial recognition software in recent years.

Through the long-running Reclaim Your Face campaign, EDRi and our partners have long argued that facial recognition and other forms of biometric mass surveillance, which use our faces and bodies against us, pose an unacceptable risk to our rights and freedoms. They create the possibility of permanently tracking and monitoring us in public spaces, and can particularly affect our right to demonstrate because of the ‘chilling effect’ they create. Biometric mass surveillance also poses a high risk of discrimination, being even more harmful for racialised people, queer people, homeless people and other minoritised groups.

This new petition is the first step in a regional campaign which gives the power to Brussels residents to demand action from the Brussels Parliament to protect our faces. In particular, the petition calls for the Parliament to ban facial recognition in public places and for identification purposes, and to grant the NGOs a hearing before the Parliament. This is an important chance to put a stop to these discriminatory, intrusive technologies of mass surveillance.

Are you a resident of the Brussels region? Join the fight against biometric mass surveillance by signing the new petition by the Protect My Face coalition:

Our movement gathered in Brussels

Between 6 and 9 November 2022, more than 20 activists from across Europe gathered in Brussels to celebrate the successes of the Reclaim Your Face movement. We got to meet each other in real life after months of online organising, reflected on our wide range of decentralised actions, and learned from each other how to couple grassroots organising with EU advocacy aimed at specific events and EU institutions. Read on to see what we did.

“It’s unbelievable we did all this.”

was the summary of the event, as rightly pointed out by Andrej Petrovski of SHARE Foundation.

Football fans are being targeted by biometric mass surveillance

Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data related to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palmprints, palm veins, hand geometry, facial features, DNA, iris patterns, typing rhythm, gait, and voice.

Though often targeted at specific groups, the use of mass surveillance technologies is becoming prevalent in publicly available spaces across Europe. As a result, football fans are increasingly impacted by them.

Apart from its undemocratic nature, there are many reasons why biometric mass surveillance is problematic for human rights and fans’ rights. 

Firstly, in the general sense, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.

Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. To be sure, biometric mass surveillance can create a ‘chilling effect’ on individuals. Knowing one is being surveilled can lead people to feel discouraged from legitimately attending pre-match gatherings and fan marches, or joining a protest. 

Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.

Football Supporters Europe (FSE) highlighted these problems earlier in the year:

“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”

Football fans and mass surveillance 

The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:

  • Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
  • Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
  • France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
  • Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
  • The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
  • Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
  • United Kingdom: In 2016, football fans and other community groups successfully campaigned against the introduction of facial recognition technology at Scottish football stadia. Soon after, South Wales Police began using facial recognition systems at football games to “prevent disorder”. According to the BBC, the use of the technology at the 2017 Champions League final in Cardiff led to 2,000 people being “wrongly identified as possible criminals”. In 2019 and 2020, Cardiff City and Swansea City fans joined forces to oppose its use, considering it “completely unnecessary and disproportionate”.

EU AI Act and Biometric Mass Surveillance

In April 2021, the European Commission proposed a law to regulate the use of Artificial Intelligence (AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations which are calling for the act to include a ban on biometric mass surveillance.

Currently, the European Parliament is forming its opinion on the AI Act proposal. In the past, they have supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.

What can fans do?

  • Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
  • Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
  • Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
  • If you are part of an organisation, join EDRi’s ‘Reclaim Your Face’ coalition.

Further reading

  1. Burgess, Matt. ‘The Met Police’s Facial Recognition Tests Are Fatally Flawed’, Wired, 4 July 2019 (accessed online on 10 August 2022)
  2. European Digital Rights (EDRi) & Edinburgh International Justice Initiative (EIJI) (2021). ‘The rise and rise of biometric mass surveillance in the EU’ (accessed online on 10 August 2022)
  3. Football Supporters Europe (2022). ‘Facial Recognition Technology: Fans, Not Test Subjects’ (accessed online on 10 August 2022)
  4. Football Supporters Europe (2022). ‘FSE Calls On EU Parliament To Protect Citizens From Biometric Mass Surveillance’ (accessed online on 10 August 2022)

Parliament calls loud and clear for a ban on biometric mass surveillance in AI Act

After our timely advocacy actions with over 70 organisations, the amendments to the IMCO–LIBE Committee Report on the Artificial Intelligence Act clearly state the need for a ban on remote biometric identification. In fact, 24 individual MEPs, representing 158 MEPs in total, demand a complete ban on biometric mass surveillance practices. Now we need to keep up the pressure at European and national levels to ensure that when the AI Act is officially passed, likely in 2023 or 2024, it bans biometric mass surveillance.


Remote biometric identification (RBI): what, where, why?

In April 2021, as a direct result of the work of civil society organisations like Reclaim Your Face, the European Commission put forward the draft for the EU Artificial Intelligence Act. The draft explicitly recognised the serious human rights risks of biometric mass surveillance by including a prohibition on ‘remote biometric identification’ (RBI) in publicly-accessible spaces.

However, the original RBI ban proposed by the European Commission was weak in three main ways:

  1. It banned ‘real-time’ (live) uses of RBI systems, but not the far more common ‘post’ uses. This means that authorities could use RBI after the data is collected (hours, days or even months after!) to turn back the clock, identifying journalists, people seeking reproductive healthcare, and more.
  2. It only applied the ban to law enforcement actors (i.e. police). As a result, we could all still be surveilled in public spaces by local councils, central governments, supermarket owners, shopping center managers, university administration and any other public or private actors.
  3. It also contained a series of wide and dangerous exceptions that could be used as a “blueprint” for how to conduct biometric mass surveillance practices – undermining the whole purpose and essence of the ban!

Whilst the inclusion of a prohibition was a big win, it had these limitations. The next steps of the process require that the EU’s 704 Members of the European Parliament (MEPs) and 27 member state governments agree to a ban for it to become law.

A hot topic in the European Parliament

In the EU Parliament, the MEPs who work in the Civil Liberties (LIBE) and Internal Markets (IMCO) working groups (also known as ‘Committees’) were given the joint responsibility to lead on the Parliament’s official position on the AI Act. As such, they presented a shared IMCO – LIBE report in March 2022.

After that, they had to present their amendments in a process by which MEPs are able to show which parts of the AI Act are most important to them, and how they would like to see improvements.

To influence this, Reclaim Your Face organised with the 76 civil society organisations that form our coalition. Many campaigners and advocates involved in the Reclaim Your Face campaign met with MEPs in the weeks and months preceding the amendments and organised an open letter. They encouraged MEPs to listen to the tens of thousands of people who signed the ECI petition calling for a ban, and to ensure that the amendments to be tabled reflected five of our main demands:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

In June 2022, MEPs in the LIBE and IMCO Committees submitted ‘amendments’ to the AI Act showing the results and power of our actions: hundreds of amendments were tabled on biometrics, demonstrating the importance MEPs place on this topic.

Amendments show major support for a ban

Who supported our demands?

In total, 177 MEPs across 6 out of the 7 political groups supported a stronger RBI ban in the AI Act!

  • 24 MEPs, from across 5 political groups, were champions of the Reclaim Your Face campaign! They tabled amendments for a full and unequivocal ban on all types of remote biometric identification (RBI) in publicly-accessible spaces. Two things stand out about this group: 1) it includes several of the MEPs responsible for the AI Act on behalf of their political group (called ‘Rapporteurs’ or ‘Shadows’), a strong sign of broad support, meaning that those 24 individual MEPs in fact represent a staggering 158 MEPs who demand a complete ban on biometric mass surveillance practices; and 2) some of the MEPs tabled these amendments ‘on behalf of’ their entire political group.
  • 18 MEPs went almost as far as their colleagues, supporting a full ban on ‘real-time’ RBI in publicly-accessible spaces, by all actors, and without conditions for exceptions. However, these MEPs did not propose to extend the ban to ‘post’ uses of RBI. Given that these MEPs clearly understand the threats and risks of biometric mass surveillance, this gives us good ground to go forward and convince them that ‘post’ uses are equally, if not even more, harmful than real-time uses.
  • Dozens of MEPs additionally proposed two new and important bans. These explicitly prohibit the police from using private biometric databases, and the creation of biometric databases through mass/untargeted methods such as online scraping or the mass scraping of CCTV footage. If accepted, this would further protect people from biometric mass surveillance, particularly through the use of services like Clearview AI.
  • Furthermore, one additional MEP supported removing all the exceptions to the RBI ban!

Who opposed our recommendations?

Opposition to a ban on RBI was very limited.

  • Just three MEPs – all from the European People’s Party (EPP) – argued that RBI in publicly-accessible spaces should only be classified as high-risk, not prohibited. Nevertheless, it is notable that these MEPs still recognised that RBI is very risky.
  • Separately, 14 MEPs supported a ban in principle, but argued that it should be less restrictive. This includes both Shadow Rapporteurs for the EPP group, as well as 12 colleagues from the right-leaning Identity & Democracy (ID) and European Conservatives and Reformists (ECR) groups and their own EPP group.

Who said ‘yes, but…’?

Seven additional MEPs from the ECR and EPP groups were ambivalent, putting forward some amendments that would strengthen the ban but also proposing others that would weaken it.

So what’s the balance in the European Parliament?

Overall, this is a really positive set of amendments. It shows clear and significant political will for a stronger ban on biometric mass surveillance, taking us a step closer to a genuine EU ban on these chilling practices.

The perspective of the Parliament is clear: we need a strong ban on biometric mass surveillance!

Among those calling for the most comprehensive form of a ban – which Reclaim Your Face has argued is necessary to protect people’s rights and freedoms – is MEP Brando Benifei from the S&D group. Mr Benifei is one of two MEPs who share the ultimate responsibility for the Parliament’s position on the AI Act, so his support for a full ban is very powerful and meaningful.

The other co-lead MEP is MEP Dragos Tudorache from the Renew group. He is one of the MEPs who supported all of our demands, except the one that would extend the ban to ‘post’ uses. Whilst we still, therefore, have work to do to convince Mr Tudorache and his colleagues, we can already see clear progress in his thinking. Last year he commented that he did not believe a prohibition was the right approach to RBI. Now, Mr Tudorache says he agrees with us that RBI is a key human rights issue. His support is therefore also very important, and we believe that he will be open to learning more about how post uses of RBI pose a threat to organising, journalism and other civil freedoms.

We are also very proud of the commitment and effectiveness of the organisations in the Reclaim Your Face coalition. The amendments showed that the Parliament clearly listened, and that the power of our joint actions is truly huge!

What’s next?

The fight is still far from over.

Whilst RBI in publicly-accessible spaces is a major part of biometric mass surveillance, practices such as biometric categorisation and emotion recognition (making predictions about people’s ethnicity, gender, emotions or other characteristics based on how they look or act) can also lead to biometric mass surveillance. That’s why we are also advocating for strong bans on both practices in the AI Act – which we are pleased to see have been put forward by several MEPs.

There is also a long way to go in the political process. These amendments need to be turned into compromise amendments and then voted on, to ensure that the entire Parliament officially agrees. Only then will negotiations begin with the member state governments (the Council), where more permissive home affairs ministers have clashed with more rights-protective justice ministers over whether to weaken or strengthen the RBI ban.

This emphasises why now, more than ever, we need to keep up the pressure at European and national levels to ensure that – when the AI Act is officially passed, likely in 2023 or 2024 – it bans biometric mass surveillance!

Get in contact with us to find out how to support Reclaim Your Face!

Goodbye ECI, hello AI Act negotiations!

Today, 1 August at 23:59, our European Citizens’ Initiative comes to an end. However, our campaign carries on as we work to influence EU leaders negotiating the AI Act.


We started the ECI in January 2021, calling for a new law to ban biometric mass surveillance. 18 months later, we are ready to reflect on, and celebrate, all that we have achieved together.

We campaigned during a pandemic, finding creative ways to gather signatures while respecting privacy and protecting data. We adapted to the political reality and managed to influence the EU’s negotiations. We built a coalition of 76 organisations from over 20 EU countries. We led national actions, and we won.

Campaigning for privacy with privacy

Out of the 90 ECIs ever started, only 6 have reached the threshold of 1 million signatories, and all 6 used targeted social media advertising. At Reclaim Your Face, we are committed to everyone’s privacy. Therefore, we gathered almost 80 thousand signatures without using any targeted social media advertisements (or, as we call them, surveillance ads). Every single ECI signatory was reached directly by one of our partners or their supporters, through shared posts, newsletters and signature collection in the streets.

A challenge? Yes. But organic reach gave us a great opportunity to have direct interactions with other organisations, a high level of engagement from our supporters, and quality conversations about biometric mass surveillance. In fact, all of these factors combined to make our petition the “most politically powerful ECI ever”, according to an insider at the European Economic and Social Committee.

“Most politically powerful ECI ever”

An insider at the European Economic and Social Committee

This is how we did it:

Coalition building: Different voices across Europe

Reclaim Your Face aimed to have a diversity of voices represented in our call to ban biometric mass surveillance. We listened and worked especially with groups most affected by this exploitative practice.

We worked with LGBTQ+ advocates at All Out, with the football supporters’ association Football Supporters Europe, with Roma and Sinti rights supporters at save space e.V., and with the workers’ union UNI Europa. Everyone, from migration organisations to privacy defenders to journalists, united for one cause: banning biometric mass surveillance.

In total, we were joined by 76 organisations from 20 Member States – who represent over half a million supporters. Our coalition has been the backbone of our success.

Volunteers for paper signature collection

Once the pandemic allowed us to be present in offline spaces, we decided to organise a Bootcamp for those who wanted to help us gather signatures. We trained over 80 people from more than 7 countries on 3 topics: biometric mass surveillance issues, ECI data protection practices and offline engagement methods.

The new Reclaim Your Face volunteers collected signatures in their own cities and engaged with people in the streets, at universities, in parks and in other public spaces. Activists in Portugal, Italy, Germany, Czechia and Greece made time in their days to share their thoughts on biometric mass surveillance, inform other citizens about its incompatibility with human rights and collect paper signatures for our ECI.

Local national campaigns

Reclaim Your Face was decentralised, building communities in more than 6 countries that led national actions and successes. Among many, here are some of our national wins:

Germany

The campaign’s German movement, led by EDRi members Chaos Computer Club (CCC), Digitale Gesellschaft and Digitalcourage, worked with more than 16 organisations. They organised over 14 events and took part in social media stunts, Twitter storms and peaceful offline demonstrations. Almost 30,000 German citizens signed the campaign’s European Citizens’ Initiative, proving that people-powered action can create meaningful change.

Italy

The Italian national campaign, led by Hermes Center with more than 9 organisations in the coalition, has coordinated many actions too, across almost 2 years.

Czechia

Leading organisation IuRe has also organised many actions, from creative work like comics and video clips to paper signature collection days.

Two of the leading actions for Reclaim Your Face in Czechia have been the fight against biometric cameras at Prague airport and a seminar organised in the Chamber of Deputies, where they discussed biometric cameras with police and political representatives.

Apart from this, in May 2022 in Prague, they spoke with people in the streets about biometric mass surveillance and its dangers for society. From April to July, they also spoke about and promoted the campaign at five festivals and community screenings of their movie Digital Dissidents, which explores people who are critical of digital technologies.

Greece

Homo Digitalis, the Hellenic leading organisation of Reclaim Your Face, has also been active in Greece.

  • In May 2022 they organised a paper signature collection in the streets of Athens.

Serbia

As a result of international pressure, in September 2021 a Draft Law on Internal Affairs, which contained provisions for legalising a massive biometric video surveillance system, was withdrawn from further procedure. This was an amazing win for human rights and a result of Share Foundation’s national campaign Thousands of Cameras: a two-and-a-half-year-long battle against smart cameras in Belgrade installed by the Ministry of Interior and supplied by Chinese tech giant Huawei.

Portugal

The Portuguese lead organisation in the Reclaim Your Face coalition, D3 (Defesa Dos Direitos Digitais), led actions to raise awareness when the Portuguese government proposed a video surveillance and facial recognition law. Reclaim Your Face organisations and EDRi sent a letter to representatives of Portugal’s main political parties, supporting D3’s fight against biometric mass surveillance practices. Together, we urged politicians to reject this dystopian law. The proposal was later withdrawn.

EU level successes

In parallel with our work at the national level, we united and coordinated EU-level actions.

  • In fact, in May 2022 we saw the results of our actions. After meeting with key MEPs working on the EU’s AI Act proposal, delivering an open letter signed by 53 organisations and publishing multiple op-eds, both co-lead MEPs on the AI Act announced their support for a ban. Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.

Today we say goodbye to our European Citizens’ Initiative, humbled by the tens of thousands of people who signed it.

However, Reclaim Your Face continues!

We envision a society in which no one is harmed by biometric mass surveillance. Such a society is only possible when biometric mass surveillance is banned by law and in practice. Together with our partners, we continue to fight to make this a reality by advocating for an AI Act that puts people at its core.

A big success for Homo Digitalis: the Hellenic DPA fines Clearview AI €20 million

On 13 July 2022, following a complaint filed by Homo Digitalis in May 2021 on behalf of our member and data subject Marina Zacharopoulou, the Hellenic Data Protection Authority (HDPA) issued Decision 35/2022, imposing a fine of 20 million euros on Clearview AI for its intrusive practices. In the same Decision, the HDPA prohibits the company from collecting and processing the personal data of data subjects located in Greece using facial recognition methods, and requires it to immediately delete any data it has already collected.

Specifically, in May 2021, an alliance of civil society organisations consisting of Homo Digitalis, Privacy International, Hermes Center and noyb filed complaints before the competent authorities in Greece, the United Kingdom, Italy, Austria and France against Clearview AI for its mass surveillance practices through facial recognition.

Earlier this year, the Italian Data Protection Authority had decided to fine the company €20 million, while the UK’s equivalent authority had decided to fine it £7.5 million.

The €20 million fine imposed by the HDPA today is another strong signal against the intrusive business models of companies that seek to make money through the illegal processing of personal data. At the same time, it sends a clear message to law enforcement authorities working with companies of this kind that such practices are illegal and grossly violate the rights of data subjects.

Clearview AI is an American company, founded in 2017, that develops facial recognition software. It claims to have “the largest known database of more than three billion facial images”, which it collects from social media platforms and other online sources. Its automated tool visits public websites and collects any images it detects that contain human faces. Along with these images, the automated collector also gathers metadata that complements them, such as the title of the website and its source link. The collected facial images are then processed by the facial recognition software created by Clearview AI to build the company’s database. Clearview AI sells access to this database to private companies and to law enforcement agencies, such as police authorities, internationally.

The full text of Decision 35/2022 can be found here (in Greek only).

Week of actions: Reclaim Your Face Italy and the need for a real EU ban on biometric mass surveillance

During the second week of May 2022, Reclaim Your Face Italy held a week of actions in Milan, Torino and Como for an EU ban on biometric mass surveillance. They collected signatures on the streets of the three cities, joined an event organised by the Greens-European Free Alliance Group and made a field visit to Como, the first Italian city to implement facial recognition technology in a public park.

Background

In 2021, the Italian Data Protection Authority (DPA) rejected the police use of the Automatic Image Recognition System (SARI). SARI is a real-time facial recognition system that was acquired by the Italian Police in 2017 and has been under investigation by the Authority ever since. Although authorities insist it has never been used in real time, the system was at the centre of debate after their intention to use it to monitor arrivals of migrants and asylum seekers on the Italian coasts was revealed.

In its decision, the DPA argued that the system lacks a legal basis and, as designed, would constitute a form of mass surveillance. Thanks to the actions of Hermes Center, Associazione Luca Coscioni, Certi Diritti, CILD, Eumans, info.nodes, The Good Lobby, Privacy Network, Progetto Winston Smith, and StraLi, a temporary ban on facial recognition technology in public spaces was later introduced. This moratorium will be in force until December 2023.

Now our Reclaim Your Face partners Hermes Center, Privacy Network, Certi Diritti, StraLi and CILD are fiercely campaigning to ban biometric mass surveillance in the EU.

Here are some of their latest actions!

Paper signature collection

On 10 May 2022, Reclaim Your Face hosted paper signature collection stands in three big Italian cities: Milan, Torino, and Rome. This paper signature collection was organised by Hermes Center and two national Reclaim Your Face partners: StraLi and CILD. The activists stood in front of universities and in city centres to talk about the risks of biometric mass surveillance, handing out stickers, booklets, Reclaim Your Face T-shirts and bags.

Event with Greens-European Free Alliance Group

Colleagues from Hermes Center, Riccardo Coluccini and Davide Del Monte, joined as speakers at the event ‘Stop Biometric Surveillance – Time for an EU ban on biometric mass surveillance in public spaces’, to explain why Italy must carry on pushing for a real ban on biometric surveillance in the EU.

Visit in Como

Como was the first city to implement facial recognition technology in a public park, doing so in 2019 through an offer by Huawei. The technology also included algorithms that detected different types of behaviour. Not coincidentally, during the 2016 migration crisis, migrants had camped in this park while waiting to cross the border.

After the work of activists and a journalistic investigation by Hermes Center colleagues Laura Carrer and Riccardo Coluccini and researcher Philip Di Salvo, Como was obliged to shut down the system in 2020.

In May 2022, together with representatives from the Greens-European Free Alliance Group and journalists from the Czech Republic, the researchers visited the park where the facial recognition cameras were installed and talked about their investigation. While the cameras are still there, the facial recognition and other algorithmic functions are currently turned off. The Greens-European Free Alliance Group and the Czech journalists later met local journalist Andrea Quadroni, who talked about the migrant crisis that hit Como in 2016.

The trip to Como is part of the Greens-European Free Alliance Group’s newly released mini-documentary, while articles about the actions and results of Reclaim Your Face in Italy were covered by national TV and radio stations in the Czech Republic.

Reclaim Your Face’s coalition & 53 orgs made it: Leading EU politician speaks against biometric mass surveillance

This month we worked together on specific actions to push for the Artificial Intelligence Act to include a ban on biometric mass surveillance, and 53 organisations took part. This builds on the hard work of the whole Reclaim Your Face coalition over the last two years. Our actions have had amazing results, with even the co-lead MEP on the Artificial Intelligence Act committing to table amendments for a more comprehensive ban on RBI in publicly-accessible spaces!


Here is a snapshot of our joint actions over the past week:

  • Reclaim Your Face organisations from Italy, Germany, France and Belgium met with key MEPs working on the EU’s AI Act proposal, including co-lead MEP Dragos Tudorache, co-lead MEP Brando Benifei, and MEP Birgit Sippel.
  • 53 organisations signed our Reclaim Your Face open letter asking MEPs to protect fundamental rights in the AI Act by prohibiting all remote (i.e. generalised surveillance) uses of biometric identification (RBI) in publicly-accessible spaces.

We have five main demands to ban biometric mass surveillance in the AI Act:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

Our demands were published in various EU policy outlets and in France.

Our tireless actions to call for a Ban on Biometric Mass Surveillance in the Artificial Intelligence Act have had amazing results so far!

Following our meeting, the co-lead MEP on the AI Act, Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.

This follows the views expressed by his colleagues in a majority of the Parliament’s political groups, with several lead MEPs committing publicly to submitting amendments for a full ban on biometric mass surveillance in the AI Act, and it suggests the strong influence of the calls of the dozens of organisations and 71,000 people behind Reclaim Your Face’s European Citizens’ Initiative.

However, we have not won – yet. There will still be many months of negotiations on the AI Act.

You can support Reclaim Your Face individually by signing our ECI, and as an organisation by getting in contact with us so we can explore paths of collaboration.

Thank you to all our partners and supporters for making this possible! The response to our actions suggests that there is a clear majority in the Parliament supporting our call.



ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now ARTICLE19 Bits of Freedom CCC Defesa dos Direitos Digitais (D3) Digitalcourage Digitale Gesellschaft CH Digitale Gesellschaft DE Državljan D EDRi Electronic Frontier Finland epicenter.works Hermes Center for Transparency and Digital Human Rights Homo Digitalis IT-Political Association of Denmark IuRe La Quadrature du Net Liberties Metamorphosis Foundation Panoptykon Foundation Privacy International SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch AlgorithmWatch/CH All Out Amnesty International Anna Elbe Aquilenet Associazione Luca Coscioni Ban Facial Recognition Europe Big Brother Watch Certi Diritti Chaos Computer Club Lëtzebuerg (C3L) CILD D64 Danes je nov dan Datapanik Digitale Freiheit DPO Innovation Electronic Frontier Norway European Center for Not-for-profit Law (ECNL) European Digital Society Eumans Football Supporters Europe Fundación Secretariado Gitano (FSG) Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung Germanwatch German ACM Chapter Gesellschaft für Informatik (German Informatics Society) GONG Hellenic Association of Data Protection and Privacy Hellenic League for Human Rights info.nodes Irish Council for Civil Liberties JEF, Young European Federalists Kameras Stoppen Ligue des droits de L'Homme (FR) Ligue des Droits Humains (BE) LOAD e.V. Ministry of Privacy Privacy First Privacy Lx Privacy Network Progetto Winston Smith Reporters United Saplinq Science for Democracy Selbstbestimmt.Digital StraLi Stop Wapenhandel The Good Lobby Italia UNI-Europa Unsurv Vrijbit Wikimedia FR Xnet


Reclaim Your Face is also supported by:

Jusos Piratenpartei DE Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left



Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.