France becomes the first European country to legalise biometric surveillance

EDRi member and Reclaim Your Face partner La Quadrature du Net charts France’s chilling move to undermine human rights progress by ushering in mass algorithmic surveillance, which, in a shocking development, has been authorised by national parliamentarians.


For three years, the EDRi network has rallied against biometric mass surveillance practices through our long-running Reclaim Your Face campaign. Comprising around eighty civil society groups and nearly 100,000 European citizens and residents, our movement has rejected the constant tracking of our faces and bodies across time and place by discriminatory algorithms. We have called on lawmakers to protect us from being treated as walking barcodes, and the European Parliament is now poised to vote to ban at least some forms of biometric mass surveillance at EU level.

In contrast to this progress, EDRi member and Reclaim Your Face partner La Quadrature du Net (LQDN) charts below France’s chilling move to undermine it by ushering in mass algorithmic surveillance, now authorised by national parliamentarians.

Article 7 of the law on the organisation of the Olympic Games has been adopted by the French national parliament, the Assemblée Nationale, formally introducing algorithmic video-surveillance into French law until December 2024. Amid the uproar over the pension reform, and following a now-routine expedited procedure, the French government succeeded in making acceptable one of the most dangerous technologies ever deployed. Through lies and misleading narratives, the government evaded the technical, political and judicial consequences of mass surveillance. Supported by MPs from the governing majority and far-right parties, algorithmic video-surveillance has been legalised on the back of lies, further undermining the democratic process.

  • The lie of biometrics: The government repeated, and wrote into the law, that algorithmic video-surveillance is not related to biometrics. This is entirely false. The technology constantly identifies, analyses and classifies bodies, physical attributes, gestures, body shapes and gaits, which are unquestionably biometric data. LQDN has explained this (see note or video) and tirelessly told the rapporteurs and representatives in the Sénat and Assemblée Nationale, alongside 38 other international organisations and more than 40 Members of the European Parliament (MEPs) who recently called out the French government. Despite this, the French government stood by its lies, on both technical and legal grounds. France is once again violating EU law, cementing its title as Europe’s surveillance champion.
  • The lie of usefulness: The government used the Olympic Games as a pretext to accelerate a long-running agenda of legalising these technologies. This choice follows a widely observed “tradition” of states exploiting international mega-events to pass exceptional laws. The government convinced people of the need to “spot suspicious packages” or “prevent massive crowd movements”. These issues suddenly became the top priority of the Ministry of the Interior and of deputies, who reduced the security of the Olympics to these questions, though they were rarely identified as priorities before. Moreover, these problems can be addressed through human competence rather than these technologies, as LQDN has demonstrated in this article. Acceptance of algorithmic video-surveillance rests on a deeply rooted myth that technology will magically ensure security. In this way, these opaque technologies are deemed useful without any honest evaluation or demonstration.
  • The technical lie: Algorithmic video-surveillance’s main application is to identify behaviours pre-defined as “suspicious” by the police. Arbitrary and dangerous by design, the way these algorithms work has never been explained by the government, because it is not understood by most of those who decide. Whether through inexcusable incompetence or deliberate evasion, the level of the parliamentary debate was ultimately extremely low, and certainly not what the dramatic stakes raised by these biometric technologies demanded. Helped by Guillaume Vuillemet and Sacha Houlié, both from the governing party, and some other MPs, the parliamentary debate was dominated by a rhetoric of minimisation drawn directly from surveillance companies’ marketing narratives, along with lies and technical nonsense. This clearly shows Parliament’s inability to discuss technical questions. Moreover, society has legitimate reason to fear the future, given how ill-equipped parliamentary representatives are to grasp the threats of emerging technologies.

As police brutality floods people’s screens, and as the police, armed with batons, provide the after-sales service for the most unpopular reforms, this expansion of police surveillance is part of a global strategy to stifle any dissent.

Such tactics, which allow the State to reshape the reality of its surveillance prerogatives, must be denounced, particularly in a context where the meaning of words is deliberately twisted to make people believe that “surveillance is protection”, “security is freedom” and “democracy means forcing laws through”. It is necessary to expose and counter this sham democratic game, and to question the extraordinary powers given to the French police. There is no need to invoke a “Chinese” dystopia to grasp the height of the stakes. France’s own history and present political climate are enough to take the measure of a twenty-year security drift: ever more cameras, surveillance and databases, alongside the depoliticisation of social issues and a loss of direction among the politicians in charge. The debates on the Olympics law laid bare the political disorientation of decision-makers, unable to question these security policies.

This first legalisation of automated video-surveillance is a victory for the French security industry. Those who have been asking for years to test their algorithms on the public, so as to improve them and sell the technologies worldwide, are now satisfied. Soon, Thales, XXII, Two-I and Neuroo will be allowed to sell biometric software to other states, just as Idemia sold its facial recognition software to China. The startup XXII could not even wait for the law to be voted before loudly announcing that it had raised 22 million euros to become, in its own words, “the European leader” in algorithmic video-surveillance.

The institutions charged with safeguarding liberties, such as the Commission nationale de l’informatique et des libertés (CNIL), are failing completely. Created in 1978 and endowed with real and effective counter-powers to scrutinise governmental surveillance ambitions, the CNIL has become the after-sales service for government regulations, carefully helping companies to implement “good” surveillance so as to preserve the industry’s economic interests, without any consideration for collective rights and liberties.

This first legal authorisation creates a precedent and opens the gates to every other biometric surveillance technology: algorithmic audio-surveillance, facial recognition, biometric tracking, and more.

LQDN won’t give up the fight and will keep denouncing all of the government’s lies. They will be present as soon as the first experiments start, documenting the inevitable abuses these technologies lead to. They will find ways to contest them in court and will fight to ensure these experiments remain temporary. And they will keep refusing these technologies and the technopolicing they embody, fighting at the European level to obtain their prohibition.

Left-leaning French lawmakers are planning to challenge the adoption of this bill in the country’s top constitutional court.

This was first published by La Quadrature du Net. Read it in French.

Football fans are being targeted by biometric mass surveillance

Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data related to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palmprints, palm veins, hand geometry, facial recognition, DNA, iris recognition, typing rhythm, walking style, and voice recognition.

Though often targeted at specific groups, the use of mass surveillance technologies is becoming prevalent in publicly accessible spaces across Europe. As a result, football fans are increasingly impacted by them.

Apart from its undemocratic nature, there are many reasons why biometric mass surveillance is problematic for human rights and fans’ rights. 

Firstly, in a general sense, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.

Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. To be sure, biometric mass surveillance can create a ‘chilling effect’ on individuals. Knowing one is being surveilled can lead people to feel discouraged from legitimately attending pre-match gatherings and fan marches, or joining a protest. 

Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.

Football Supporters Europe (FSE) highlighted these problems earlier in the year:

“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”

Football fans and mass surveillance 

The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:

  • Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
  • Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
  • France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
  • Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
  • The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
  • Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
  • United Kingdom: In 2016, football fans and other community groups successfully campaigned against the introduction of facial recognition technology at Scottish football stadia. Soon after, South Wales Police began using facial recognition systems at football games to “prevent disorder”. According to the BBC, the use of the technology at the 2017 Champions League final in Cardiff led to 2,000 people being “wrongly identified as possible criminals”. In 2019 and 2020, Cardiff City and Swansea City fans joined forces to oppose its use, considering it “completely unnecessary and disproportionate”.

EU AI Act and Biometric Mass Surveillance

In April 2021, the European Commission proposed a law to regulate the use of Artificial Intelligence (AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations which are calling for the act to include a ban on biometric mass surveillance.

Currently, the European Parliament is forming its opinion on the AI Act proposal. In the past, they have supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.

What can fans do?

  • Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
  • Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
  • Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
  • If you are part of an organisation, join the EDRi’s ‘Reclaim Your Face’ coalition.

Further reading

  1. Burgess, Matt (2019). ‘The Met Police’s Facial Recognition Tests Are Fatally Flawed’, Wired, 4 July 2019. (accessed online on 10 August 2022)
  2. European Digital Rights (EDRi) & Edinburgh International Justice Initiative (EIJI) (2021). ‘The rise and rise of biometric mass surveillance in the EU’. (accessed online on 10 August 2022)
  3. Football Supporters Europe (2022). ‘Facial Recognition Technology: Fans, Not Test Subjects’. (accessed online on 10 August 2022)
  4. Football Supporters Europe (2022). ‘FSE Calls On EU Parliament To Protect Citizens From Biometric Mass Surveillance’. (accessed online on 10 August 2022)

Parliament calls loud and clear for a ban on biometric mass surveillance in AI Act

After our timely advocacy actions with over 70 organisations, the amendments to the IMCO – LIBE Committee Report for the Artificial Intelligence Act clearly state the need for a ban on Remote Biometric Identification. In fact, 24 individual MEPs, representing 158 MEPs in total, demanded a complete ban on biometric mass surveillance practices. Now we need to keep up the pressure at European and national levels to ensure that when the AI Act is officially passed, likely in 2023 or 2024, it bans biometric mass surveillance.


Remote biometric identification (RBI): what, where, why?

In April 2021, as a direct result of the work of civil society organisations like Reclaim Your Face, the European Commission put forward the draft for the EU Artificial Intelligence Act. The draft explicitly recognised the serious human rights risks of biometric mass surveillance by including a prohibition on ‘remote biometric identification’ (RBI) in publicly-accessible spaces.

However, the original RBI ban proposed by the European Commission was weak in three main ways:

  1. It banned ‘real-time’ (live) uses of RBI systems, but not the far more common ‘post’ uses. This means that authorities could use RBI after the data is collected (hours, days or even months after!) to turn back the clock, identifying journalists, people seeking reproductive healthcare, and more.
  2. It only applied the ban to law enforcement actors (i.e. the police). As a result, we could all still be surveilled in public spaces by local councils, central governments, supermarket owners, shopping centre managers, university administrations and any other public or private actors.
  3. It also contained a series of wide and dangerous exceptions that could be used as a “blueprint” for how to conduct biometric mass surveillance practices – undermining the whole purpose and essence of the ban!

Whilst the proposed prohibition was a big win, it had these limitations. The next steps of the process require the EU’s 704 Members of the European Parliament (MEPs) and 27 member state governments to agree to a ban for it to become law.

A hot topic in the European Parliament

In the EU Parliament, the MEPs who work in the Civil Liberties (LIBE) and Internal Markets (IMCO) working groups (also known as ‘Committees’) were given the joint responsibility to lead on the Parliament’s official position on the AI Act. As such, they presented a shared IMCO – LIBE report in March 2022.

After that, they had to present their amendments in a process by which MEPs are able to show which parts of the AI Act are most important to them, and how they would like to see improvements.

To influence this, Reclaim Your Face organised with the 76 civil society organisations that are part of our coalition. Many campaigners and advocates involved in the Reclaim Your Face campaign met with MEPs in the weeks and months preceding the amendments and organised an open letter. They encouraged MEPs to listen to the tens of thousands of people who signed the ECI petition calling for a ban, and to ensure that the amendments to be tabled reflected five of our main demands:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly- accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

In June 2022, MEPs in the LIBE and IMCO Committees submitted ‘amendments’ to the AI Act, showing the results and power of our actions: hundreds of amendments were tabled on biometrics, demonstrating the importance MEPs attach to this topic.

Amendments show major support for a ban

Who supported our demands?

In total, 177 MEPs across 6 out of the 7 political groups supported a stronger RBI ban in the AI Act!

  • 24 MEPs, from across 5 political groups, were champions of the Reclaim Your Face campaign! They tabled amendments for a full and unequivocal ban on all types of remote biometric identification (RBI) in publicly-accessible spaces. Two things stand out from this group: 1) it includes several of those who are responsible for the AI Act on behalf of their political group (called ‘Rapporteurs’ or ‘Shadows’) – a strong sign of broad support, meaning that those 24 individual MEPs in fact represent a staggering 158 MEPs who demand a complete ban on biometric mass surveillance practices; and 2) some of the MEPs tabled these amendments ‘on behalf of’ their entire political group.
  • 18 MEPs went almost as far as their colleagues, supporting a full ban on ‘real-time’ RBI in publicly-accessible spaces, by all actors, and without conditions for exceptions. However, these MEPs did not propose to extend the ban to ‘post’ uses of RBI. Given that these MEPs clearly understand the threats and risks of biometric mass surveillance, this gives us good ground to go forward and convince them that ‘post’ uses are equally, if not even more, harmful than real-time uses.
  • Dozens of MEPs additionally proposed two new and important bans. These explicitly prohibit the police from using private biometric databases, and the creation of biometric databases through mass/untargeted methods such as online scraping or the mass scraping of CCTV footage. If accepted, this would further protect people from biometric mass surveillance, particularly through the use of services like Clearview AI.
  • Furthermore, 1 additional MEP supported removing all the exceptions to the RBI ban!

Who opposed our recommendations?

Opposition to a ban on RBI was very limited.

  • Just three MEPs – all from the European People’s Party (EPP) – argued that RBI in publicly-accessible spaces should only be classified as high-risk, not prohibited. Nevertheless, it is notable that these MEPs still recognised that RBI is very risky.
  • Separately, 14 MEPs supported a ban in principle, but argued that it should be less restrictive. These include both Shadow Rapporteurs for the EPP group, along with 12 colleagues from the right-leaning Identity & Democracy (ID) group, the European Conservatives and Reformists (ECR) group and the EPP group itself.

Who said ‘yes, but…’?

7 additional MEPs from the ECR and EPP groups were ambivalent, putting forward some amendments which would strengthen the ban but also proposing amendments which would weaken it.

So what’s the balance in the European Parliament?

Overall, this is a really positive set of amendments. It shows clear and significant political will for a stronger ban on biometric mass surveillance, taking us a step closer to a genuine EU ban on these chilling practices.

The perspective of the Parliament is clear: we need a strong ban on biometric mass surveillance!

Among those calling for the most comprehensive form of a ban – which Reclaim Your Face has argued is necessary to protect people’s rights and freedoms – is MEP Brando Benifei from the S&D group. Mr Benifei is one of two MEPs who share the ultimate responsibility for the Parliament’s position on the AI Act, so his support for a full ban is very powerful and meaningful.

The other co-lead MEP is MEP Dragos Tudorache from the Renew group. He is one of the MEPs who supported all of our demands, except the one that would extend the ban to ‘post’ uses. Whilst we still, therefore, have work to do to convince Mr Tudorache and his colleagues, we can already see clear progress in his thinking. Last year he commented that he does not believe that a prohibition is the right approach to RBI. Now, Mr Tudorache says he agrees with us that RBI is a key human rights issue. His support is therefore also very important, and we believe that he will be open to learning more about how post uses of RBI pose a threat to organising, journalism and other civil freedoms.

We are also very proud of the commitment and effectiveness of the organisations in the Reclaim Your Face coalition. The amendments showed that the Parliament clearly listened and that the power of our joint actions is truly huge!

What’s next?

The fight is still far from over.

Whilst RBI in publicly-accessible spaces is a major part of biometric mass surveillance, practices such as biometric categorisation and emotion recognition (making predictions about people’s ethnicity, gender, emotions or other characteristics based on how they look or act) can also lead to biometric mass surveillance. That’s why we are also advocating for strong bans on both practices in the AI Act – which we are pleased to see have been put forward by several MEPs.

There is also a lot left to go in the political process. These amendments need to be turned into compromise amendments, and then voted on to ensure that the entire Parliament officially agrees. Only then will negotiations begin with the member state governments (Council), where more permissive home affairs ministers have clashed with more rights-protective justice ministers over whether to weaken or strengthen the RBI ban.

This emphasises why now, more than ever, we need to keep up the pressure at European and national levels to ensure that – when the AI Act is officially passed, likely in 2023 or 2024 – it bans biometric mass surveillance!

Get in contact with us to find out how to support Reclaim Your Face!