Our movement gathered in Brussels

Between 6 and 9 November 2022, more than 20 activists from across Europe gathered in Brussels to celebrate the successes of the Reclaim Your Face movement. We got to meet each other in real life after months of online organising, reflected on our wide range of decentralised actions, and learned from each other how to couple grassroots organising with EU advocacy aimed at specific events and EU institutions. Read on to see what we did.

“It’s unbelievable we did all this.”

was how Andrej Petrovski of SHARE Foundation aptly summed up the event.

Football fans are being targeted by biometric mass surveillance

Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data relating to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palm prints, palm veins, hand geometry, facial images, DNA, iris patterns, typing rhythm, gait, and voice.

Though often targeted at specific groups, mass surveillance technologies are becoming prevalent in publicly accessible spaces across Europe. As a result, football fans are increasingly impacted by them.

Beyond its undemocratic nature, biometric mass surveillance is problematic for human rights and fans’ rights for many reasons.

Firstly, in a general sense, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.

Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. In particular, biometric mass surveillance can create a ‘chilling effect’: knowing they are being watched can discourage people from legitimately attending pre-match gatherings and fan marches, or joining a protest.

Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.

Football Supporters Europe (FSE) highlighted these problems earlier in the year:

“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”

Football fans and mass surveillance 

The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:

  • Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
  • Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
  • France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
  • Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
  • The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
  • Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
  • United Kingdom: In 2016, football fans and other community groups successfully campaigned against the introduction of facial recognition technology at Scottish football stadia. Soon after, South Wales Police began using facial recognition systems at football games to “prevent disorder”. According to the BBC, the use of the technology at the 2017 Champions League final in Cardiff led to 2,000 people being “wrongly identified as possible criminals”. In 2019 and 2020, Cardiff City and Swansea City fans joined forces to oppose its use, considering it “completely unnecessary and disproportionate”.

EU AI Act and Biometric Mass Surveillance

In April 2021, the European Commission proposed a law to regulate the use of artificial intelligence (the AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations calling for the Act to include a ban on biometric mass surveillance.

Currently, the European Parliament is forming its opinion on the AI Act proposal. It has previously supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.

What can fans do?

  • Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
  • Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
  • Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
  • If you are part of an organisation, join EDRi’s ‘Reclaim Your Face’ coalition.

Further reading

  1. Burgess, Matt (2019). ‘The Met Police’s Facial Recognition Tests Are Fatally Flawed’, Wired, 4th July 2019 (accessed online on 10th August 2022)
  2. European Digital Rights (EDRi) & Edinburgh International Justice Initiative (EIJI) (2021). ‘The rise and rise of biometric mass surveillance in the EU’ (accessed online on 10th August 2022)
  3. Football Supporters Europe (2022). ‘Facial Recognition Technology: Fans, Not Test Subjects’ (accessed online on 10th August 2022)
  4. Football Supporters Europe (2022). ‘FSE Calls On EU Parliament To Protect Citizens From Biometric Mass Surveillance’ (accessed online on 10th August 2022)

Reclaim Your Face impact in 2021

A sturdy coalition, research reports, investigations, coordinated actions and amazing political support at national and EU level: this was 2021 for the Reclaim Your Face coalition, a year that, despite the pandemic, showed what the power of a united front looks like.


Forming a coalition in a strategic moment

In January 2021, a group of civil society organisations was meeting every two weeks to strategise and plan what has become one of the most politically powerful campaigns: Reclaim Your Face.

In October 2020, what was then a group of 12 organisations came together to form the Reclaim Your Face coalition, aiming to ban biometric mass surveillance in Europe. Since then, we have welcomed dozens more organisations working on digital rights and civil liberties, workers’ rights, the rights of Roma and Sinti people, LGBTQ+ rights, media freedom, and the protection of migrants and people on the move. We gathered activists, volunteers, technologists, lawyers, academics and policy-makers, all united by one common goal.

The launch of the campaign happened at a strategic moment, when the EU began its work on a law proposal to regulate artificial intelligence (AI). The relevance and timing of the Reclaim Your Face campaign are unquestionable, as AI techniques are at the centre of today’s biometric surveillance technologies such as facial recognition.

Raising awareness of the spread and harms of biometric mass surveillance

For the people in the Reclaim Your Face coalition, 2021 started with a strong focus on raising awareness about the harms associated with biometric mass surveillance. Moreover, we showed that this exploitative practice is a reality in many cities across Europe, not a dystopian fiction.

Check out our video records.

Researching biometric mass surveillance

EDRi’s Brussels office and the leading organisations of the campaign coordinated research, mapping both the technology deployments and the legal frameworks that govern (or fail to govern) biometric mass surveillance practices in several EU countries.

Coordinating pandemic-proof actions

In 2021, we also coordinated online and offline actions that enabled every campaign supporter to act as part of a powerful collective. The pandemic put constraints on realising such actions; however, the creative hive mind behind the campaign made it happen!

The #PaperBagSociety stunt sparked curiosity and started discussions as Reclaim Your Face activists wore paper bags on their heads in public spaces as a sign of protest. The #WalkTheTalk Twitter storm united activists across the Atlantic in calling on EU Commissioner Vestager and US Secretary of Commerce Raimondo not to negotiate away our rights in their trade discussions.

Politically, our success has been clear

Our European Citizens’ Initiative has been described as “perhaps the most politically powerful” of all to date. Thank you to the almost 65,000 EU citizens who have supported it so far!

Firstly, together we successfully set the agenda of the debate on AI. Not only were the words “ban” and “remote biometric identification” (a prominent technique that leads to biometric mass surveillance) included in the AI Act law proposal, but many EU and national affairs newspapers acknowledged the importance of the topic and reported heavily on it.

Secondly, we gathered support from several influential bodies that also called for a ban: the EU’s top data protection regulators (the EDPS and EDPB), the Green Group in the European Parliament, Germany’s newly elected government, several national data protection authorities, and UN officials. Our impact is also evident in the report adopted by Members of the European Parliament calling for a ban on biometric mass surveillance by law enforcement.

Through our coalition, we successfully applied pressure on national governments that tried to sneak in laws enabling biometric mass surveillance in Serbia and Portugal. In Italy, Reclaim Your Face campaigners helped to catalyse a moratorium on facial recognition, and in Hamburg, the data protection authority agreed with us that Clearview AI’s use of EU citizens’ face images is illegal.

Moving ahead in 2022, the Reclaim Your Face coalition is aiming to expand its reach, bringing together even more organisations fighting against biometric mass surveillance. We will train the many volunteers who have offered their support and reach a new level of political engagement.

Thank you for supporting us!

No biometric surveillance for Italian students during exams

In September 2021, the Italian Data Protection Authority (DPA) fined Luigi Bocconi University €200,000 for using Respondus, a proctoring software, without sufficiently informing students of the processing of their personal data and, among other violations, for processing their biometric data without a legal basis. Bocconi is a private university based in Milan that introduced Respondus tools during the COVID-19 pandemic to monitor students during remote exams.


Respondus offers two different modules: LockDown Browser and Respondus Monitor. The former prevents a student from using their computer as usual, meaning that the person cannot, for example, open other programs. Respondus Monitor checks that the person in front of the screen is the one who should be taking the exam, in order to prevent someone else from replacing the student or passing notes. To do this, the software uses algorithms that analyse the biometric data of the person’s face in order to confirm their presence, and it also records keystrokes, mouse movements and the duration of the exam. After processing the data, the software sends the professor a report showing the student’s image for identification purposes and flagging any anomalies, with details on the reason for each alert.
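To make the mechanism described above more concrete, here is a minimal, purely illustrative Python sketch of how such a proctoring pipeline could conceptually work: a face embedding captured at check-in serves as the biometric template, each subsequent webcam frame is compared against it, and a report with anomaly flags is compiled for the teacher. Every name, threshold and data structure below is an assumption made for illustration; this is not Respondus’s actual implementation.

```python
# Illustrative sketch only: a simplified, hypothetical model of an automated
# proctoring pipeline. Not Respondus's actual code; names and thresholds are assumed.
import math
from dataclasses import dataclass, field

@dataclass
class ExamSession:
    student_id: str
    reference_template: list[float]          # biometric template captured at check-in
    similarity_scores: list[float] = field(default_factory=list)
    keystroke_count: int = 0
    mouse_moves: int = 0

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def process_frame(session: ExamSession, frame_embedding: list[float]) -> None:
    # Compare each webcam frame's face embedding against the reference template.
    session.similarity_scores.append(
        cosine_similarity(session.reference_template, frame_embedding)
    )

def build_report(session: ExamSession, threshold: float = 0.8) -> dict:
    # Flag an "anomaly" whenever the face match falls below the threshold,
    # mirroring the kind of alert a teacher would receive.
    low_matches = [s for s in session.similarity_scores if s < threshold]
    return {
        "student_id": session.student_id,
        "frames_checked": len(session.similarity_scores),
        "anomalies": len(low_matches),
        "flagged": bool(low_matches),
        "keystrokes": session.keystroke_count,
        "mouse_moves": session.mouse_moves,
    }

if __name__ == "__main__":
    session = ExamSession("student-001", reference_template=[0.1, 0.9, 0.3])
    for embedding in ([0.1, 0.9, 0.3], [0.1, 0.85, 0.35], [0.9, 0.1, 0.0]):
        process_frame(session, embedding)
    print(build_report(session))
```

The point of the sketch is simply to show that the “anomalies” reported to the teacher are the output of an automated comparison of biometric data, which is exactly the kind of processing the DPA took issue with.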

The University initially tried to walk back what it had stated in its own privacy policy, claiming that no biometric data was processed because the only identification taking place was based on the initial picture taken by the software and used by an operator (in this case the professor) to confirm the identity of the student. This did not match how the system actually works. In fact, in its decision, the DPA notes that Respondus declared that its software creates a biometric template to monitor the presence of the same person in front of the screen throughout the exam. For this reason, the “software performs a specific technical processing of a physical characteristic of the persons,” says the DPA, and there is currently no legal provision in Italy expressly authorising the processing of biometric data for the purpose of verifying the regularity of exams. The DPA also highlights that, given that the processing was carried out by the University for the purpose of issuing degrees with legal value, and given the specific imbalance between the position of students and that of the University, consent does not constitute a valid legal basis for the processing, nor can it be considered freely given.

In addition, the DPA considers the functionalities of the ‘Respondus Monitor’ component as a “partially automated processing operation for the analysis of the behaviour of the data subjects, in relation to the subsequent assessment by the teacher,” and this “gives rise to the ‘profiling’ of the students.”

This processing of personal data, according to the DPA, may have an impact on the emotional and psychological sphere of the persons concerned which “may also derive from the specific functionalities of the supervision system, such as, in this case, facial recognition and behavioural profiling, with possible repercussions on the accuracy of the anomalies detected by the algorithm and therefore, indirectly, also on the overall outcome of the test.” 


Bocconi is not the only Italian university using proctoring software. In June 2020, at least ten universities in Italy were using (or planning to use) similar tools, such as Proctorio, ProctorExam, and Safe Exam Browser. The Authority’s decision would prohibit other Italian universities from using software similar to Respondus that collects and processes students’ biometric data.

Despite this pushback on student monitoring, the decision also reminds us that biometric surveillance is increasingly expanding into every sphere of our lives, and that the only solution is to call for a ban on these technologies.

Contribution by: Laura Carrer, Research and Advocacy at Digital Rights Unit, Hermes Center & Riccardo Coluccini, Reclaim Your Face national campaign contributor.