France becomes the first European country to legalise biometric surveillance

EDRi member and Reclaim Your Face partner La Quadrature du Net charts the chilling move by France to undermine human rights progress by ushering in mass algorithmic surveillance, which, in a shocking move, has been authorised by national Parliamentarians.


For three years, the EDRi network has rallied against biometric mass surveillance practices through our long-running Reclaim Your Face campaign. Comprising around eighty civil society groups and close to 100,000 European citizens and residents, our movement has rejected the constant tracking of our faces and bodies across time and place by discriminatory algorithms. We have called on lawmakers to protect us from being treated as walking barcodes, and the European Parliament is now poised to vote to ban at least some forms of biometric mass surveillance at EU level.

In stark contrast, EDRi member and Reclaim Your Face partner La Quadrature du Net (LQDN) charts the chilling move by France to undermine this progress by ushering in mass algorithmic surveillance, which, in a shocking move, has been authorised by national Parliamentarians.

Article 7 of the law on the organisation of the Olympic Games has been adopted by the national parliament, the Assemblée Nationale, formalising the introduction of algorithmic video-surveillance into French law until December 2024. Amid the uproar over the pension reform, and following a now-routine expedited procedure, the French government has succeeded in making acceptable one of the most dangerous technologies ever deployed. Using lies and false narratives, the government escaped the technical, political and judicial consequences of mass surveillance. Supported by MPs from the governing majority and far-right parties, algorithmic video-surveillance has been legalised on the back of lies, further undermining the democratic game.

  • The lie of biometrics: The government repeatedly claimed, and wrote into the law, that algorithmic video-surveillance has nothing to do with biometrics. This is entirely false. The technology constantly identifies, analyses and classifies bodies, physical attributes, gestures, body shapes and gait, which are unquestionably biometric data. LQDN has explained this (see its note and video) and tirelessly repeated it to the rapporteurs in the Sénat and Assemblée Nationale and to individual representatives, alongside 38 other international organisations and more than 40 Members of the European Parliament (MEPs) who recently called out the French government. Despite this, the French government stood by its lies, on technical as well as legal grounds. France is once again violating EU law, cementing its title as Europe’s surveillance champion.
  • The lie of usefulness: The government used the Olympic Games as a pretext to accelerate a long-running agenda of legalising these technologies. In fact, this choice follows a widely observed “tradition” of states exploiting international mega-events to pass exceptional laws. The government convinced people of the need to “spot suspicious packages” or “prevent massive crowd movements”. These scenarios suddenly became top priorities for the Ministry of the Interior and for deputies, who reduced the security of the Olympics to these issues, which were rarely identified as priorities before. Moreover, these issues can be addressed through human competence rather than these technologies, as LQDN has demonstrated in this article. The acceptance of algorithmic video-surveillance relies on a deeply rooted myth that technology will magically ensure security. In this way, these opaque technologies are deemed useful without any honest evaluation or demonstration.
  • The technical lie: Algorithmic video-surveillance’s main application is to identify behaviours pre-defined as “suspicious” by the police. Arbitrary and dangerous by design, the way these algorithms work has never been explained by the government, because it is not understood by most of those making the decisions. Whether through inexcusable incompetence or deliberate diversion, the level of parliamentary debate was in the end extremely low, and certainly not what was warranted by the dramatic issues raised by these biometric technologies. Helped by Guillaume Vuillemet and Sacha Houlié, both from the governing party, and some other MPs, the parliamentary debate was dominated by a rhetoric of minimisation directly inspired by surveillance companies’ marketing narratives, along with lies and technical nonsense. It clearly shows the Parliament’s inability to discuss technical questions. Moreover, society should legitimately fear for the future, given how unable parliamentary representatives are to grasp the threats of emerging technologies.

As images of police brutality flood people’s screens, and as the police, armed with batons, provide the after-sales service for the most unpopular reforms, this increase in police surveillance is part of a global strategy to stifle all dissent.

Such tactics, which allow the State to reshape the reality of its surveillance prerogatives, must be denounced, particularly in a context where the meaning of words is deliberately twisted to make people believe that “surveillance is protection”, “security is freedom” and “democracy means forcing laws through”. It is necessary to expose and counter this fake democratic game, and to question the extraordinary powers given to the French police. There is no need to invoke a “Chinese” dystopia to grasp the height of the stakes. One need only look at France’s history and present political climate to take the measure of a twenty-year-long security drift: more cameras, surveillance and databases, combined with the depoliticisation of social issues and a loss of direction among the politicians in charge. The debates on the Olympics law laid bare the political disorientation of decision-makers, unable to question these security policies.

This first legalisation of automated video-surveillance is a victory for the French security industry. Those who have been asking for years to test their algorithms on the public, so as to improve them and sell the technology worldwide, are now satisfied. Soon Thales, XXII, Two-I and Neuroo will be allowed to sell biometric software to other states, just as Idemia sold its facial recognition software to China. The startup XXII could not even wait for the law to be passed before loudly announcing that it had raised 22 million euros to become, in its own words, “the European leader” in algorithmic video-surveillance.

The institutions in charge of preserving liberties, such as the Commission nationale de l’informatique et des libertés (CNIL), are failing completely. Created in 1978 and endowed with real and effective counter-powers to scrutinise governmental surveillance ambitions, the CNIL has become the after-sales service for government regulations, carefully helping companies to implement “good” surveillance in order to preserve the industry’s economic interests, without any consideration for collective rights and liberties.

This first legal authorisation creates a precedent and opens the gates to every other biometric surveillance technology: algorithmic audio-surveillance, facial recognition, biometric tracking, and more.

LQDN won’t give up the fight and will keep denouncing all of the government’s lies. They will be present as soon as the first experiments start, and will document the inevitable abuses these technologies lead to. They will find ways to contest them in court and will fight to keep these experiments temporary. And they will keep refusing these technologies and the technopolicing they embody, fighting at the European level to obtain their prohibition.

Left-leaning French lawmakers are planning to challenge the adoption of this bill in the country’s top constitutional court.

This was first published by La Quadrature du Net. Read it in French.

Football fans are being targeted by biometric mass surveillance

Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data related to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palmprints, palm veins, hand geometry, facial recognition, DNA, iris recognition, typing rhythm, walking style, and voice recognition.

Though often targeted at specific groups, the use of mass surveillance technologies is becoming prevalent in publicly available spaces across Europe. As a result, football fans are increasingly impacted by them.

Apart from its undemocratic nature, there are many reasons why biometric mass surveillance is problematic for human rights and fans’ rights. 

Firstly, in the general sense, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.

Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. To be sure, biometric mass surveillance can create a ‘chilling effect’ on individuals. Knowing one is being surveilled can lead people to feel discouraged from legitimately attending pre-match gatherings and fan marches, or joining a protest. 

Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.

Football Supporters Europe (FSE) highlighted these problems earlier in the year:

“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”

Football fans and mass surveillance 

The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:

  • Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
  • Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
  • France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
  • Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
  • The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
  • Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
  • United Kingdom: In 2016, football fans and other community groups successfully campaigned against the introduction of facial recognition technology at Scottish football stadia. Soon after, South Wales Police began using facial recognition systems at football games to “prevent disorder”. According to the BBC, the use of the technology at the 2017 Champions League final in Cardiff led to 2,000 people being “wrongly identified as possible criminals”. In 2019 and 2020, Cardiff City and Swansea City fans joined forces to oppose its use, considering it “completely unnecessary and disproportionate”.

EU AI Act and Biometric Mass Surveillance

In April 2021, the European Commission proposed a law to regulate the use of Artificial Intelligence (AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations which are calling for the act to include a ban on biometric mass surveillance.

Currently, the European Parliament is forming its opinion on the AI Act proposal. In the past, they have supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.

What can fans do?

  • Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
  • Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
  • Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
  • If you are part of an organisation, join the EDRi’s ‘Reclaim Your Face’ coalition.

Further reading

  1. Burgess, Matt. ‘The Met Police’s Facial Recognition Tests Are Fatally Flawed’, Wired, 4th July 2019 (accessed online on 10th August 2022)
  2. European Digital Rights (EDRi) & Edinburgh International Justice Initiative (EIJI) (2021). ‘The rise and rise of biometric mass surveillance in the EU’ (accessed online on 10th August 2022)
  3. Football Supporters Europe (2022). ‘Facial Recognition Technology: Fans, Not Test Subjects’ (accessed online on 10th August 2022)
  4. Football Supporters Europe (2022). ‘FSE Calls On EU Parliament To Protect Citizens From Biometric Mass Surveillance’ (accessed online on 10th August 2022)

Parliament calls loud and clear for a ban on biometric mass surveillance in AI Act

After our timely advocacy actions with over 70 organisations, the amendments to the IMCO – LIBE Committee Report for the Artificial Intelligence Act clearly state the need for a ban on remote biometric identification. In fact, 24 individual MEPs, representing 158 MEPs in total, demand a complete ban on biometric mass surveillance practices. Now we need to keep up the pressure at European and national levels to ensure that when the AI Act is officially passed, likely in 2023 or 2024, it bans biometric mass surveillance.


Remote biometric identification (RBI): what, where, why?

In April 2021, as a direct result of the work of civil society organisations like Reclaim Your Face, the European Commission put forward the draft for the EU Artificial Intelligence Act. The draft explicitly recognised the serious human rights risks of biometric mass surveillance by including a prohibition on ‘remote biometric identification’ (RBI) in publicly-accessible spaces.

However, the original RBI ban proposed by the European Commission was weak in three main ways:

  1. It banned ‘real-time’ (live) uses of RBI systems, but not the far more common ‘post’ uses. This means that authorities could use RBI after the data is collected (hours, days or even months after!) to turn back the clock, identifying journalists, people seeking reproductive healthcare, and more.
  2. It only applied the ban to law enforcement actors (i.e. police). As a result, we could all still be surveilled in public spaces by local councils, central governments, supermarket owners, shopping center managers, university administration and any other public or private actors.
  3. It also contained a series of wide and dangerous exceptions that could be used as a “blueprint” for how to conduct biometric mass surveillance practices – undermining the whole purpose and essence of the ban!

Whilst the inclusion of a ban was a big win, it clearly has its limitations. For it to become law, the next steps of the process require the EU’s 704 Members of the European Parliament (MEPs) and 27 member state governments to agree to a ban.

A hot topic in the European Parliament

In the EU Parliament, the MEPs who work in the Civil Liberties (LIBE) and Internal Markets (IMCO) working groups (also known as ‘Committees’) were given the joint responsibility to lead on the Parliament’s official position on the AI Act. As such, they presented a shared IMCO – LIBE report in March 2022.

After that, they had to present their amendments in a process by which MEPs are able to show which parts of the AI Act are most important to them, and how they would like to see improvements.

To influence this, Reclaim Your Face organised with the 76 civil society organisations that form our coalition. Many campaigners and advocates involved in the Reclaim Your Face campaign met with MEPs in the weeks and months preceding the amendments and organised an open letter. They encouraged MEPs to listen to the tens of thousands of people who signed the ECI petition calling for a ban, and to ensure that the amendments to be tabled reflected five of our main demands:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

In June 2022, MEPs in the LIBE and IMCO Committees submitted ‘amendments’ to the AI Act showing the results and power of our actions: hundreds of amendments were tabled on biometrics, showing the importance MEPs put on this topic.

Amendments show major support for a ban

Who supported our demands?

In total, 177 MEPs across 6 out of the 7 political groups supported a stronger RBI ban in the AI Act!

  • 24 MEPs, from across 5 political groups, were champions of the Reclaim Your Face campaign! They tabled amendments for a full and unequivocal ban on all types of remote biometric identification (RBI) in publicly-accessible spaces. Two things stand out from this group: 1) it includes several of those responsible for the AI Act on behalf of their political group (called ‘Rapporteurs’ or ‘Shadows’), a strong sign of broad support, meaning that those 24 individual MEPs in fact represent a staggering 158 MEPs who demand a complete ban on biometric mass surveillance practices; and 2) some of the MEPs tabled these amendments ‘on behalf of’ their entire political group.
  • 18 MEPs went almost as far as their colleagues, supporting a full ban on ‘real-time’ RBI in publicly-accessible spaces, by all actors, and without conditions for exceptions. However, these MEPs did not propose to extend the ban to ‘post’ uses of RBI. Given that these MEPs clearly understand the threats and risks of biometric mass surveillance, this gives us good ground to go forward and convince them that ‘post’ uses are equally, if not even more, harmful than real-time uses.
  • Dozens of MEPs additionally proposed two new and important bans. These explicitly prohibit the police from using private biometric databases, and the creation of biometric databases through mass/untargeted methods such as online scraping or the mass scraping of CCTV footage. If accepted, this would further protect people from biometric mass surveillance, particularly through the use of services like Clearview AI.
  • Furthermore, 1 additional MEP supported removing all the exceptions to the RBI ban!

Who opposed our recommendations?

Opposition to a ban on RBI was very limited.

  • Just three MEPs – all from the European People’s Party (EPP) – argued that RBI in publicly-accessible spaces should only be classified as high-risk, not prohibited. Nevertheless, it is notable that these MEPs still recognised that RBI is very risky.
  • Separately, 14 MEPs supported a ban in principle, but argued that it should be less restrictive. These include both Shadow Rapporteurs for the EPP group, alongside 12 colleagues from the right-leaning Identity & Democracy (ID) group, the European Conservatives and Reformists (ECR) group and their own EPP group.

Who said ‘yes, but…’?

7 additional MEPs from the ECR and EPP groups were ambivalent, putting forward some amendments which would strengthen the ban but also proposing amendments which would weaken it.

So what’s the balance in the European Parliament?

Overall, this is a really positive set of amendments. It shows clear and significant political will for a stronger ban on biometric mass surveillance, taking us a step closer to a genuine EU ban on these chilling practices.

The perspective of the Parliament is clear: we need a strong ban on biometric mass surveillance!

Among those calling for the most comprehensive form of a ban – which Reclaim Your Face has argued is necessary to protect people’s rights and freedoms – is MEP Brando Benifei from the S&D group. Mr Benifei is one of two MEPs who share the ultimate responsibility for the Parliament’s position on the AI Act, so his support for a full ban is very powerful and meaningful.

The other co-lead MEP is MEP Dragos Tudorache from the Renew group. He is one of the MEPs who supported all of our demands, except the one that would extend the ban to ‘post’ uses. Whilst we still, therefore, have work to do to convince Mr Tudorache and his colleagues, we can already see clear progress in his thinking. Last year he commented that he does not believe that a prohibition is the right approach to RBI. Now, Mr Tudorache says he agrees with us that RBI is a key human rights issue. His support is therefore also very important, and we believe that he will be open to learning more about how post uses of RBI pose a threat to organising, journalism and other civil freedoms.

We are also very proud of the commitment and effectiveness of the organisations in the Reclaim Your Face coalition. The amendments showed that the Parliament clearly listened, and that the power of our joint actions is truly huge!

What’s next?

The fight is still far from over.

Whilst RBI in publicly-accessible spaces is a major part of biometric mass surveillance, practices such as biometric categorisation and emotion recognition (making predictions about people’s ethnicity, gender, emotions or other characteristics based on how they look or act) can also lead to biometric mass surveillance. That’s why we are also advocating for strong bans on both practices in the AI Act – which we are pleased to see have been put forward by several MEPs.

There is also a lot left to go in the political process. These amendments need to be turned into compromise amendments, and then voted on to ensure that the entire Parliament officially agrees. Only then will negotiations begin with the member state governments (Council), where more permissive home affairs ministers have clashed with more rights-protective justice ministers over whether to weaken or strengthen the RBI ban.

This emphasises why now, more than ever, we need to keep up the pressure at European and national levels to ensure that – when the AI Act is officially passed, likely in 2023 or 2024 – it bans biometric mass surveillance!

Get in contact with us to find out how to support Reclaim Your Face!

A big success for Homo Digitalis: The Hellenic DPA fines Clearview AI €20 million

On July 13, 2022, following a complaint filed by Homo Digitalis in May 2021 on behalf of our member and data subject Marina Zacharopoulou, the Hellenic Data Protection Authority (HDPA) issued Decision 35/2022 imposing a fine of 20 million euros on Clearview AI for its intrusive practices. In the same decision, the HDPA prohibits the company from collecting and processing the personal data of data subjects located in Greece using facial recognition methods, and requires it to immediately delete any data it has already collected.

Specifically, in May 2021, an alliance of civil society organisations consisting of Homo Digitalis together with Privacy International, Hermes Center, and noyb filed complaints before the competent authorities in Greece, the United Kingdom, Italy, Austria and France against Clearview AI for its mass surveillance practices through facial recognition.

Earlier this year, the Italian Data Protection Authority had decided to fine the company €20 million, while the UK’s equivalent authority had decided to fine it £7.5 million.

The €20 million fine imposed by the DPA today is another strong signal against intrusive business models of companies that seek to make money through the illegal processing of personal data. At the same time, it sends a clear message to law enforcement authorities working with companies of this kind that such practices are illegal and grossly violate the rights of data subjects.

Clearview AI is an American company founded in 2017 that develops facial recognition software. It claims to have “the largest known database of more than three billion facial images”, which it collects from social media platforms and other online sources. It operates an automated tool that visits public websites and collects any images it detects that contain human faces. Along with these images, the automated collector also gathers metadata that complements them, such as the title of the website and its source link. The collected facial images are then processed by the facial recognition software created by Clearview AI in order to build the company’s database. Clearview AI sells access to this database to private companies and law enforcement agencies, such as police authorities, internationally.
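As a rough illustration of the kind of record such an automated collector assembles, here is a minimal Python sketch. The class and field names below are hypothetical, chosen only to mirror the metadata the paragraph above describes; Clearview AI’s actual code and schema are not public.

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch: pairs one scraped image with the complementary
# metadata described above (page title and source link).
@dataclass
class ScrapedImage:
    image_url: str    # address of the collected face image
    page_title: str   # title of the website hosting it
    source_link: str  # link back to the source page

def build_record(image_url: str, page_title: str, source_link: str) -> dict:
    """Assemble one database entry: the image plus its metadata."""
    return asdict(ScrapedImage(image_url, page_title, source_link))

record = build_record(
    "https://example.org/photo.jpg",
    "Example profile page",
    "https://example.org/profile",
)
```

A real deployment would layer crawling, face detection and biometric template extraction on top of such records; the point here is simply that each facial image is stored together with metadata tying it back to its online source, which is what makes the database so intrusive.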

The full text of Decision 35/2022 can be found here (only in EL).

Week of actions: Reclaim Your Face Italy and the need for a real EU ban on biometric mass surveillance

During the second week of May 2022, Reclaim Your Face Italy held a week of actions for an EU ban on biometric mass surveillance in Milan, Torino and Como. They collected signatures on the streets of the three cities, joined an event organised by the Greens-European Free Alliance Group and made a field visit to Como, the first Italian city to implement facial recognition technology in a public park.

Background

In 2021, the Italian Data Protection Authority (DPA) rejected police use of the Automatic Image Recognition System (SARI). SARI is a real-time facial recognition system that was acquired by the Italian police in 2017 and has been under investigation by the Authority ever since. Although it is claimed never to have been used in real time, the system was at the centre of debate after it was revealed that the police intended to use it to monitor the arrival of migrants and asylum seekers on the Italian coasts.

In its decision, the DPA argued that the system lacks a legal basis and, as designed, would constitute a form of mass surveillance. Thanks to the actions of Hermes Center, Associazione Luca Coscioni, Certi Diritti, CILD, Eumans, info.nodes, The Good Lobby, Privacy Network, Progetto Winston Smith, and StraLi, a temporary ban on facial recognition technology in public spaces was later introduced. This moratorium will be in force until December 2023.

Now our Reclaim Your Face partners Hermes Center, Privacy Network, Certi Diritti, STRALI and CILD are fiercely campaigning to Ban Biometric Mass Surveillance in the EU.

Here are some of their latest actions!

Paper signature collection

On the 10th of May 2022, Reclaim Your Face hosted paper signature collection stands in three big Italian cities: Milan, Torino, and Rome. This paper signature collection was organised by Hermes Center and two national Reclaim Your Face partners: StraLi and CILD. The activists set up in front of universities and in the city centres to talk about the risks of biometric mass surveillance, giving out stickers, booklets, Reclaim Your Face T-shirts and bags.

Event with Greens-European Free Alliance Group

Colleagues from Hermes Center, Riccardo Coluccini and Davide Del Monte, joined as speakers for the event ‘Stop Biometric Surveillance – Time for an EU ban on biometric mass surveillance in public spaces’ to explain why Italy must keep campaigning for a real ban on biometric surveillance in the EU.

Visit in Como

Como was the first city to implement facial recognition technology in a public park, in 2019, through an offer by Huawei. The technology also included algorithms that detected different types of behaviour. Not coincidentally, during the 2016 migration crisis, migrants had camped in this very park while waiting to cross the border.

After the work of activists and a journalistic investigation by Hermes Center colleagues Laura Carrer and Riccardo Coluccini, and researcher Philip Di Salvo, Como was obliged to shut down the system in 2020.

In May 2022, together with representatives from the Greens-European Free Alliance Group and journalists from the Czech Republic, the researchers visited the park where the facial recognition cameras were installed and talked about their investigation. While the cameras are still there, the facial recognition and other algorithmic functions are currently turned off. The Greens-European Free Alliance Group and the Czech journalists later met with local journalist Andrea Quadroni, who talked about the migrant crisis that hit Como in 2016.

The trip to Como features in the Greens-European Free Alliance Group’s newly released mini-documentary, while reports about the actions and results of Reclaim Your Face in Italy appeared on national TV and radio in the Czech Republic.

Reclaim Your Face impact in 2021

A sturdy coalition, research reports, investigations, coordinated actions and amazing political support gathered at national and EU level. This was 2021 for the Reclaim Your Face coalition – a year that, despite unfolding during a pandemic, showed what the power of a united front looks like.


Forming a coalition in a strategic moment

In January 2021, a group of civil society organisations was meeting every two weeks to strategise and plan what has become one of the most politically powerful campaigns: Reclaim Your Face.

Set on a mission since October 2020, what was then a group of 12 organisations came together to form the Reclaim Your Face coalition, aiming to ban biometric mass surveillance in Europe. Since then, we have welcomed dozens more organisations working on digital rights and civil liberties, workers’ rights, the rights of Roma and Sinti people, LGBTQ+ rights, media freedom and the protection of migrants and people on the move. We gathered activists, volunteers, technologists, lawyers, academics and policy-makers – all united behind one common goal.

The launch of the campaign happened at a strategic moment when the EU began its work on a law proposal to regulate artificial intelligence (AI). The relevance and timing of the Reclaim Your Face campaign is unquestionable as AI techniques are at the centre of today’s biometric surveillance technologies such as facial recognition.

Raising awareness of the spread and harms of biometric mass surveillance

For the people in the Reclaim Your Face coalition, 2021 started with a strong focus on raising awareness of the harms associated with biometric mass surveillance. Moreover, we showed that this exploitative practice is a reality in many cities across Europe, not a dystopian fiction.

Check out our video records.

Researching biometric mass surveillance

EDRi’s Brussels office and the leading organisations of the campaign coordinated research, mapping both technology deployments and the legal frameworks that govern (or fail to govern) biometric mass surveillance practices in several EU countries.

Coordinating pandemic-proof actions

In 2021, we also coordinated online and offline actions that enabled every campaign supporter to act as part of a powerful collective. The pandemic put constraints on realising such actions; however, the creative hive mind behind the campaign made it happen!

The #PaperBagSociety stunt sparked curiosity and started discussions among curious minds as Reclaim Your Face activists wore paper bags on their heads in public spaces as a sign of protest. The #WalkTheTalk Twitter storm united activists across the Atlantic in calling on the EU Commissioner Vestager and the US Secretary Raimondo to not negotiate our rights in their trade discussions.

Politically, our success has been clear

Our European Citizens’ Initiative has been described as “perhaps the most politically powerful” of all to date. Thank you to the almost 65,000 EU citizens who have supported it so far!

Firstly, together we successfully set the agenda of the debate on AI. Not only were the words “ban” and “remote biometric identification” (a prominent technique that leads to biometric mass surveillance) included in the AI Act law proposal, but many EU and national affairs newspapers acknowledged the importance of the topic and reported heavily on it.

Secondly, we gathered support from several influential bodies that also called for a ban: EU’s top data protection regulators (the EDPS and EDPB), the Green Group in the EU Parliament, as well as Germany’s newly elected government, several national data protection authorities and UN officials. Our impact is also evident in the report Members of the EU Parliament adopted, calling for a ban on biometric mass surveillance by law enforcement.

Through our coalition, we successfully applied pressure on national governments that tried to sneak in laws enabling biometric mass surveillance in Serbia and Portugal. In Italy, Reclaim Your Face campaigners helped to catalyse a moratorium on facial recognition, and in Hamburg, the data protection authority agreed with us that ClearviewAI’s use of EU citizens’ face images is illegal.

Moving ahead in 2022, the Reclaim Your Face coalition is aiming to expand its reach, bringing together even more organisations fighting against biometric mass surveillance. We will train the many volunteers who have offered their support and reach a new level of political engagement.

Thank you for supporting us!

People across Switzerland reclaim their faces and public spaces!

On 18 November, three of the organisations that have long championed the Reclaim Your Face campaign – Digitale Gesellschaft (CH), Algorithm Watch CH and Amnesty International (CH) – co-launched a brand new and exciting action in the fight to curtail the sinister rise of biometric mass surveillance practices across Europe!


Called ‘Gesichtserkennung stoppen’ (DE) / ‘Stop à la reconnaissance faciale’ (FR), this action calls on Swiss supporters to take a stand for human rights and oppose the expansion of facial recognition and related biometric mass surveillance in Switzerland.

The action, in the form of a petition, complements the long-running European Citizens’ Initiative (ECI) run by the 65+ organisations in the Reclaim Your Face campaign. However, because EU rules limit ECI signatories to people who hold EU citizenship, our Swiss supporters have sadly been unable to make their opposition to a biometric mass surveillance society clear. Luckily, not any more!

The organisers explain why action is needed in Switzerland:

“We have the right to move freely in public places without anyone knowing what we are doing. But automatic facial recognition allows us to be identified in the street at any time. We want to prevent such mass surveillance. Take a stand against automated facial recognition in public places in Swiss cities! Sign our petition today.”

So what are you waiting for? If you live or work in Switzerland, let the government know that you want to be treated as a person, not a walking barcode:

And if you do have EU citizenship (even if you’re a dual national), then you can give your voice legal weight by signing the ECI to ban biometric mass surveillance.