News




Our movement gathered in Brussels

Between 6 and 9 November 2022, more than 20 activists from across Europe gathered in Brussels to celebrate the successes of the Reclaim Your Face movement. We got to meet each other in real life after months of online organising, reflected on our wide range of decentralised actions, and learned from each other how to couple grassroots organising with EU advocacy aimed at specific events and EU institutions. Read on to see what we did.

“It’s unbelievable we did all this.”

This was the apt summary of the event, as Andrej Petrovski of SHARE Foundation rightfully pointed out.

Football fans are being targeted by biometric mass surveillance

Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data related to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palm prints, palm veins, hand geometry, facial features, DNA, iris patterns, typing rhythm, gait, and voice.

Though often targeted at specific groups, the use of mass surveillance technologies is becoming prevalent in publicly accessible spaces across Europe. As a result, football fans are increasingly impacted by them.

Beyond its undemocratic nature, biometric mass surveillance is problematic for human rights and fans’ rights for many reasons.

Firstly, in a general sense, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.

Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. In particular, biometric mass surveillance can create a ‘chilling effect’ on individuals: knowing they are being surveilled can discourage people from legitimately attending pre-match gatherings and fan marches, or from joining a protest.

Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.

Football Supporters Europe (FSE) highlighted these problems earlier in the year:

“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”

Football fans and mass surveillance 

The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:

  • Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
  • Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
  • France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
  • Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
  • The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
  • Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
  • United Kingdom: In 2016, football fans and other community groups successfully campaigned against the introduction of facial recognition technology at Scottish football stadia. Soon after, South Wales Police began using facial recognition systems at football games to “prevent disorder”. According to the BBC, the use of the technology at the 2017 Champions League final in Cardiff led to 2,000 people being “wrongly identified as possible criminals”. In 2019 and 2020, Cardiff City and Swansea City fans joined forces to oppose its use, considering it “completely unnecessary and disproportionate”.

EU AI Act and Biometric Mass Surveillance

In April 2021, the European Commission proposed a law to regulate the use of Artificial Intelligence (AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations which are calling for the act to include a ban on biometric mass surveillance.

Currently, the European Parliament is forming its opinion on the AI Act proposal. In the past, it has supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.

What can fans do?

  • Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
  • Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
  • Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
  • If you are part of an organisation, join EDRi’s ‘Reclaim Your Face’ coalition.

Further reading

  1. Burgess, Matt. ‘The Met Police’s Facial Recognition Tests Are Fatally Flawed’, Wired, 4 July 2019. (accessed online on 10 August 2022)
  2. European Digital Rights (EDRi) & Edinburgh International Justice Initiative (EIJI) (2021). ‘The rise and rise of biometric mass surveillance in the EU’. (accessed online on 10 August 2022)
  3. Football Supporters Europe (2022). ‘Facial Recognition Technology: Fans, Not Test Subjects’. (accessed online on 10 August 2022)
  4. Football Supporters Europe (2022). ‘FSE Calls On EU Parliament To Protect Citizens From Biometric Mass Surveillance’. (accessed online on 10 August 2022)

Parliament calls loud and clear for a ban on biometric mass surveillance in AI Act

After our timely advocacy actions with over 70 organisations, the amendments to the IMCO–LIBE Committee Report on the Artificial Intelligence Act clearly state the need for a ban on Remote Biometric Identification. In fact, 24 individual MEPs, representing 158 MEPs, demand a complete ban on biometric mass surveillance practices. Now we need to keep up the pressure at European and national levels to ensure that when the AI Act is officially passed, likely in 2023 or 2024, it bans biometric mass surveillance.


Remote biometric identification (RBI): what, where, why?

In April 2021, as a direct result of the work of civil society organisations like Reclaim Your Face, the European Commission put forward the draft for the EU Artificial Intelligence Act. The draft explicitly recognised the serious human rights risks of biometric mass surveillance by including a prohibition on ‘remote biometric identification’ (RBI) in publicly-accessible spaces.

However, the original RBI ban proposed by the European Commission was weak in three main ways:

  1. It banned ‘real-time’ (live) uses of RBI systems, but not the far more common ‘post’ uses. This means that authorities could use RBI after the data is collected (hours, days or even months after!) to turn back the clock, identifying journalists, people seeking reproductive healthcare, and more.
  2. It only applied the ban to law enforcement actors (i.e. police). As a result, we could all still be surveilled in public spaces by local councils, central governments, supermarket owners, shopping centre managers, university administrations and any other public or private actors.
  3. It also contained a series of wide and dangerous exceptions that could be used as a “blueprint” for how to conduct biometric mass surveillance practices – undermining the whole purpose and essence of the ban!

Whilst the inclusion of a ban at all was a big win, it therefore has serious limitations. The next steps of the process require the EU’s 705 Members of the European Parliament (MEPs) and 27 member state governments to agree to a ban for it to become law.

A hot topic in the European Parliament

In the EU Parliament, the MEPs who work in the Civil Liberties (LIBE) and Internal Market (IMCO) working groups (also known as ‘Committees’) were given joint responsibility to lead on the Parliament’s official position on the AI Act. As such, they presented a shared IMCO–LIBE report in March 2022.

After that, they had to present their amendments in a process by which MEPs are able to show which parts of the AI Act are most important to them, and how they would like to see improvements.

To influence this, Reclaim Your Face organised with the 76 civil society organisations that are part of our coalition. Many campaigners and advocates involved in the Reclaim Your Face campaign met with MEPs in the weeks and months preceding the amendments and organised an open letter. They encouraged MEPs to listen to the tens of thousands of people who signed the ECI petition calling for a ban, and to ensure that the amendments to be tabled reflected five of our main demands:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

In June 2022, MEPs in the LIBE and IMCO Committees submitted ‘amendments’ to the AI Act which showed the results and power of our actions: hundreds of amendments were tabled on biometrics, a sign of the importance MEPs place on this topic.

Amendments show major support for a ban

Who supported our demands?

In total, 177 MEPs across 6 out of the 7 political groups supported a stronger RBI ban in the AI Act!

  • 24 MEPs, from across 5 political groups, were champions of the Reclaim Your Face campaign! They tabled amendments for a full and unequivocal ban on all types of remote biometric identification (RBI) in publicly-accessible spaces. Two things stand out about this group. 1) It includes several of the MEPs who are responsible for the AI Act on behalf of their political group (called ‘Rapporteurs’ or ‘Shadows’) – a strong sign of broad support. This means that those 24 individual MEPs in fact represent a staggering 158 MEPs who demand a complete ban on biometric mass surveillance practices! 2) Some of the MEPs tabled these amendments ‘on behalf of’ their entire political group.
  • 18 MEPs went almost as far as their colleagues, supporting a full ban on ‘real-time’ RBI in publicly-accessible spaces, by all actors, and without conditions for exceptions. However, these MEPs did not propose to extend the ban to ‘post’ uses of RBI. Given that these MEPs clearly understand the threats and risks of biometric mass surveillance, this gives us good ground to go forward and convince them that ‘post’ uses are equally, if not even more, harmful than real-time uses.
  • Dozens of MEPs additionally proposed two new and important bans. These explicitly prohibit the police from using private biometric databases, and the creation of biometric databases through mass/untargeted methods such as online scraping or the mass scraping of CCTV footage. If accepted, this would further protect people from biometric mass surveillance, particularly through the use of services like Clearview AI.
  • Furthermore, 1 additional MEP supported removing all the exceptions to the RBI ban!

Who opposed our recommendations?

Opposition to a ban on RBI was very limited.

  • Just three MEPs – all from the European People’s Party (EPP) – argued that RBI in publicly-accessible spaces should only be classified as high-risk, not prohibited. Nevertheless, it is notable that these MEPs still recognised that RBI is very risky.
  • Separately, 14 MEPs supported a ban in principle, but argued that it should be less restrictive. This includes both Shadow Rapporteurs for the EPP group, alongside 12 colleagues from their own EPP group, the right-leaning Identity & Democracy (ID) group and the European Conservatives and Reformists (ECR) group.

Who said ‘yes, but…’?

7 additional MEPs from the ECR and EPP groups were ambivalent, putting forward some amendments which would strengthen the ban but also proposing amendments which would weaken it.

So what’s the balance in the European Parliament?

Overall, this is a really positive set of amendments. It shows clear and significant political will for a stronger ban on biometric mass surveillance, taking us a step closer to a genuine EU ban on these chilling practices.

The perspective of the Parliament is clear: we need a strong ban on biometric mass surveillance!

Among those calling for the most comprehensive form of a ban – which Reclaim Your Face has argued is necessary to protect people’s rights and freedoms – is MEP Brando Benifei from the S&D group. Mr Benifei is one of two MEPs who share the ultimate responsibility for the Parliament’s position on the AI Act, so his support for a full ban is very powerful and meaningful.

The other co-lead MEP is MEP Dragos Tudorache from the Renew group. He is one of the MEPs who supported all of our demands, except the one that would extend the ban to ‘post’ uses. Whilst we still, therefore, have work to do to convince Mr Tudorache and his colleagues, we can already see clear progress in his thinking. Last year he commented that he does not believe that a prohibition is the right approach to RBI. Now, Mr Tudorache says he agrees with us that RBI is a key human rights issue. His support is therefore also very important, and we believe that he will be open to learning more about how post uses of RBI pose a threat to organising, journalism and other civil freedoms.

We are also very proud of the commitment and effectiveness of the organisations in the Reclaim Your Face coalition. The amendments showed that the Parliament clearly listened and that the power of our joint actions is truly huge!

What’s next?

The fight is still far from over.

Whilst RBI in publicly-accessible spaces is a major part of biometric mass surveillance, practices such as biometric categorisation and emotion recognition (making predictions about people’s ethnicity, gender, emotions or other characteristics based on how they look or act) can also lead to biometric mass surveillance. That’s why we are also advocating for strong bans on both practices in the AI Act – which we are pleased to see have been put forward by several MEPs.

There is also a long way to go in the political process. These amendments need to be turned into compromise amendments, and then voted on to ensure that the entire Parliament officially agrees. Only then will negotiations begin with the member state governments (Council), where more permissive home affairs ministers have clashed with more rights-protective justice ministers over whether to weaken or strengthen the RBI ban.

This emphasises why now, more than ever, we need to keep up the pressure at European and national levels to ensure that – when the AI Act is officially passed, likely in 2023 or 2024 – it bans biometric mass surveillance!

Get in contact with us to find out how to support Reclaim Your Face!

Goodbye ECI, hello AI Act negotiations!

Today, 1 August at 23:59, our European Citizens’ Initiative comes to an end. However, our campaign carries on to influence EU leaders as they negotiate the AI Act.


We started the ECI in January 2021, calling for a new law that bans biometric mass surveillance. 18 months later, we are ready to reflect on and celebrate all that we have achieved together.

We campaigned during a pandemic and found creative ways to gather signatures while respecting privacy and protecting data. We adapted to the political reality and managed to influence the EU’s negotiations. We built a coalition of 76 organisations from over 20 EU countries. We led national actions and we won.

Campaigning for privacy with privacy

Out of the 90 ECIs ever started, only 6 have been able to reach the threshold of 1 million signatories. All 6 used targeted social media advertising. In Reclaim Your Face, we have a commitment to everyone’s privacy. Therefore, we gathered almost 80,000 signatures without using any targeted social media advertising (or, as we call them, surveillance ads). Every single ECI signatory was reached directly by one of our partners or their supporters sharing our posts, sending newsletters and collecting signatures in the streets.

A challenge? Yes. But organic reach gave us a great opportunity to have direct interactions with other organisations, a high level of engagement from our supporters, and quality conversations about biometric mass surveillance. In fact, all of these factors combined to make our petition the “most politically powerful ECI ever”, according to an insider at the European Economic and Social Committee.

“Most politically powerful ECI ever”

An insider at the European Economic and Social Committee

This is how we did it:

Coalition building: Different voices across Europe

Reclaim Your Face aimed to have a diversity of voices represented in our call to ban biometric mass surveillance. We listened and worked especially with groups most affected by this exploitative practice.

We worked with LGBTQ+ advocates at AllOut, with the football supporters’ association Football Supporters Europe, with Roma and Sinti rights supporters at save space e.V., as well as with the workers’ union UNI Europa. Everyone – migration organisations, privacy defenders, journalists and more – united for one cause: banning biometric mass surveillance.

In total, we were joined by 76 organisations from 20 Member States – who represent over half a million supporters. Our coalition has been the backbone of our success.

Volunteers for paper signature collection

Once the pandemic allowed us to be present in offline spaces, we decided to organise a Bootcamp for those who wanted to help us gather signatures. We trained over 80 people from more than 7 countries on 3 topics: biometric mass surveillance issues, ECI data protection practices and offline engagement methods.

The new Reclaim Your Face volunteers collected signatures in their own cities and engaged with people in the streets, at universities, in parks and in other public spaces. Activists in Portugal, Italy, Germany, Czechia and Greece made time in their days to share their thoughts on biometric mass surveillance, inform other citizens about its incompatibility with human rights and collect paper signatures for our ECI.

Local and national campaigns

Reclaim Your Face was decentralised, building communities in more than 6 countries that led national actions and successes. Among many, here are some of our national wins:

Germany

The campaign’s German movement, led by EDRi members Chaos Computer Club (CCC), Digitale Gesellschaft and Digitalcourage, worked with more than 16 organisations. They organised over 14 events and took part in social media stunts, Twitter storms and peaceful offline demonstrations. Almost 30,000 German citizens signed the campaign’s European Citizens’ Initiative, proving that people-powered action can create meaningful change.

Italy

The Italian national campaign, led by Hermes Center with more than 9 organisations in the coalition, has coordinated many actions too, across almost two years.

Czechia

Leading organisation IuRe has also organised many actions, from creative work like comics and video clips to paper signature collection days.

Two of the leading actions for Reclaim Your Face in Czechia have been the fight against biometric cameras at Prague airport and a seminar organised in the Chamber of Deputies, where they talked about biometric cameras with police and political representatives.

Apart from this, in May 2022 in Prague, they spoke with people in the streets about biometric mass surveillance and its dangers for society. From April to July they also spoke about and promoted the campaign at five festivals and community screenings of their film Digital Dissidents, which explores people who are critical of digital technologies.

Greece

Homo Digitalis, the leading Reclaim Your Face organisation in Greece, has also been active.

  • In May 2022 they organised a paper signature collection in the streets of Athens.

Serbia

As a result of international pressure, in September 2021 a Draft Law on Internal Affairs, which contained provisions for legalising a massive biometric video surveillance system, was withdrawn from further procedure. This was an amazing win for human rights and a result of SHARE Foundation’s national campaign Thousands of Cameras, a two-and-a-half-year-long battle against smart cameras in Belgrade installed by the Ministry of Interior and supplied by Chinese tech giant Huawei.

Portugal

The Portuguese lead organisation in the Reclaim Your Face coalition, D3 (Defesa dos Direitos Digitais), led actions to raise awareness when the Portuguese government proposed a video surveillance and facial recognition law. Reclaim Your Face organisations and EDRi sent a letter to representatives of Portugal’s main political parties, supporting D3’s fight against biometric mass surveillance practices. Together, we urged politicians to reject this dystopian law. The proposal was later withdrawn.

EU level successes

In parallel with our work at the national level, we united and coordinated EU-level actions.

  • In fact, in May 2022 we could see the results of our actions. After meeting with key MEPs working on the EU’s AI Act proposal, delivering an open letter signed by 53 organisations and publishing multiple op-eds, both co-lead MEPs on the AI Act announced their support for a ban. Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.

Today we say goodbye to our European Citizens Initiative and are humbled by the tens of thousands of people who signed it.

However, Reclaim Your Face continues!

We envision a society in which no one is harmed by biometric mass surveillance. Such a society is only possible when biometric mass surveillance is banned by law and in practice. Together with our partners, we continue to fight to make this a reality by advocating for an AI Act that puts people at its core.

A big success for Homo Digitalis: The Hellenic DPA fines Clearview AI €20 million

On 13 July 2022, following a complaint filed by Homo Digitalis in May 2021 on behalf of our member and data subject Marina Zacharopoulou, the Hellenic Data Protection Authority (HDPA) issued Decision 35/2022, imposing a fine of 20 million euros on Clearview AI for its intrusive practices. In the same Decision, the DPA prohibits the company from collecting and processing the personal data of data subjects located in Greece using facial recognition methods and requires it to immediately delete any data it has already collected.

Specifically, in May 2021, an alliance of civil society organisations consisting of Homo Digitalis, Privacy International, Hermes Center, and noyb filed complaints before the competent authorities in Greece, the United Kingdom, Italy, Austria and France against Clearview AI for its mass surveillance practices through facial recognition.

Earlier this year, the Italian Data Protection Authority had decided to fine the company €20 million, while the UK’s equivalent authority had decided to fine it £7.5 million.

The €20 million fine imposed by the DPA today is another strong signal against intrusive business models of companies that seek to make money through the illegal processing of personal data. At the same time, it sends a clear message to law enforcement authorities working with companies of this kind that such practices are illegal and grossly violate the rights of data subjects.

Clearview AI is an American company founded in 2017 that develops facial recognition software. It claims to have “the largest known database of more than three billion facial images”, which it collects from social media platforms and other online sources. It uses an automated tool that visits public websites and collects any images it detects that contain human faces. Along with these images, the automated collector also gathers metadata that complements them, such as the title of the website and its source link. The collected facial images are then processed by Clearview AI’s facial recognition software in order to build the company’s database. Clearview AI sells access to this database to private companies and law enforcement agencies, such as police authorities, internationally.

The full text of Decision 35/2022 can be found here (in Greek only).

Week of actions: Reclaim Your Face Italy and the need for a real EU ban on biometric mass surveillance

During the second week of May 2022, Reclaim Your Face Italy held a week of actions for an EU ban on biometric mass surveillance in Milan, Torino and Como. They collected signatures on the streets of the three cities, joined an event organised by the Greens-European Free Alliance Group and made a field visit to Como, the first Italian city to implement facial recognition technology in a public park.

Background

In 2021, the Italian Data Protection Authority (DPA) rejected the police use of the Automatic Image Recognition System (SARI). SARI is a real-time facial recognition system that was acquired by the Italian police in 2017 and has been under investigation by the Authority ever since. Although it is claimed to have never been used in real time, the system was at the centre of debate after the police’s intention to use it to monitor the arrival of migrants and asylum seekers on the Italian coasts was revealed.

In its decision, the DPA argued that the system lacks a legal basis and, as designed, it would constitute a form of mass surveillance. Thanks to the actions of Hermes Center, Associazione Luca Coscioni, Certi Diritti, CILD, Eumans, info.nodes, The Good Lobby, Privacy Network, Progetto Winston Smith, and StraLi, a temporary ban on facial recognition technology in public spaces was introduced later. This moratorium will be in force until December 2023.

Now our Reclaim Your Face partners Hermes Center, Privacy Network, Certi Diritti, StraLi and CILD are fiercely campaigning for a ban on biometric mass surveillance in the EU.

Here are some of their latest actions!

Paper signature collection

On the 10th of May 2022, Reclaim Your Face hosted paper signature collection stands in three big Italian cities: Milan, Torino, and Rome. This paper signature collection was organised by Hermes Center and two national Reclaim Your Face partners: StraLi and CILD. The activists were in front of universities and in the city centres to talk about the risks of biometric mass surveillance, giving out stickers, booklets, Reclaim Your Face T-shirts and bags.

Event with Greens-European Free Alliance Group

Colleagues from Hermes Center, Riccardo Coluccini and Davide Del Monte, joined as speakers at the event ‘Stop Biometric Surveillance – Time for an EU ban on biometric mass surveillance in public spaces’ to explain why Italy must keep campaigning and pushing for a real ban on biometric surveillance in the EU.

Visit in Como

Como was the first city to implement facial recognition technology in a public park, in 2019, through an offer by Huawei. The technology also included algorithms that detected different types of behaviour. Not coincidentally, during the 2016 migration crisis, migrants had camped in this park while waiting to cross the border.

After the work of activists and a journalistic investigation by Hermes Center colleagues Laura Carrer and Riccardo Coluccini, and researcher Philip Di Salvo, Como was obliged to shut down the system in 2020.

In May 2022, together with representatives from the Greens-European Free Alliance Group and journalists from the Czech Republic, the researchers visited the park where the facial recognition cameras were installed and talked about their investigation. While the cameras are still there, the facial recognition and other algorithmic functions are turned off at the moment. The Greens-European Free Alliance Group and the Czech journalists later met with local journalist Andrea Quadroni, who talked about the migrant crisis that hit Como in 2016.

The trip to Como features in the Greens-European Free Alliance Group’s newly released mini-documentary, while the actions and results of Reclaim Your Face in Italy were covered by national TV and radio stations in the Czech Republic.

Reclaim Your Face’s coalition & 53 orgs made it: Leading EU politician speaks against biometric mass surveillance

This month, 53 organisations took part in specific joint actions to push for the Artificial Intelligence Act to include a ban on biometric mass surveillance. This builds on the hard work of the whole Reclaim Your Face coalition over the last two years. Our actions have had amazing results, with even the co-lead MEP on the Artificial Intelligence Act committing to table amendments for a more comprehensive ban on RBI in publicly-accessible spaces!


Here is a snapshot of our recent joint actions:

  • Reclaim Your Face organisations from Italy, Germany, France and Belgium met with key MEPs working on the EU’s AI Act proposal, including co-lead MEP Dragos Tudorache, co-lead MEP Brando Benifei, and MEP Birgit Sippel.
  • 53 organisations signed our Reclaim Your Face open letter asking MEPs to protect fundamental rights in the AI Act by prohibiting all remote (i.e. generalised surveillance) uses of biometric identification (RBI) in publicly-accessible spaces.

We have five main demands to ban biometric mass surveillance in the AI Act:

  1. Extending the scope of the prohibition to cover all private as well as public actors;
  2. Ensuring that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition;
  3. Deleting the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards;
  4. Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
  5. Properly addressing the risks of emotion recognition.

Our demands were published in various EU policy outlets, as well as in France.

Our tireless actions to call for a Ban on Biometric Mass Surveillance in the Artificial Intelligence Act have had amazing results so far!

Following our meeting, the co-lead MEP on the AI Act, Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.

This is the result of hearing the views of his colleagues in a majority of the Parliament’s political groups – with several lead MEPs committing publicly to submitting amendments for a full ban on biometric mass surveillance in the AI Act – and suggests the strong influence of the calls of the dozens of organisations and 71,000 people behind Reclaim Your Face’s European Citizens’ Initiative.

However, we have not won – yet. There will still be many months of negotiations on the AI Act.

You can support Reclaim Your Face individually by signing our ECI, and as an organisation by getting in contact with us so we can explore paths of collaboration.

Thank you to all our partners and supporters for making this possible! The response to our actions suggests that there is a clear majority in the Parliament supporting our call.

Members of the European Parliament: will you stand up for our rights?

Today, a global coalition of 53 civil society organisations have joined together to call on Members of the European Parliament to use their democratically-elected powers to protect us all from biometric mass surveillance practices. The EU must not legitimise these dangerous practices. Otherwise, EU lawmakers risk setting a precedent for uses of AI-based technology which could destroy people’s anonymity forever and suppress a broad range of our rights and freedoms.

Brussels, Tuesday 10 May, 2022

Dear honourable Members of the European Parliament,

We write to you today as 53 organisations to ask: Will you stand up for our rights by prohibiting biometric mass surveillance in the Artificial Intelligence Act?

In Europe and across the world, the use of remote biometric identification (RBI) systems such as facial recognition, in our publicly accessible spaces, represents one of the greatest threats to fundamental rights and democracy that we have ever seen.

The remote use of such systems destroys the possibility of anonymity in public, and undermines the essence of our rights to privacy and data protection, the right to freedom of expression, rights to free assembly and association (leading to the criminalisation of protest and causing a chilling effect), and rights to equality and non-discrimination.

Without an outright ban on the remote use of these technologies in publicly accessible spaces, all the places where we exercise our rights and come together as communities will be turned into sites of mass surveillance where we are all treated as suspects.

These harms are not hypothetical. Uyghur Muslims have been systematically persecuted by the Chinese government through the use of facial recognition. Pro-democracy protesters and political opponents have been suppressed or targeted in Russia, Serbia and Hong Kong through the use – and in some cases, even just the fear of the use of – RBI in publicly-accessible spaces. And many people have been wrongfully and traumatically arrested around the world.1

In response to the ever-increasing proliferation of these uses and their harms, people are pushing back and calling for prohibitions. More than 24 US states have taken steps against facial recognition or other forms of biometric mass surveillance. In South America, two recent rulings in São Paulo and Buenos Aires have ordered the suspension of facial recognition systems.

Some of the world’s biggest providers of biometric surveillance systems – Microsoft, Amazon and IBM – have even adopted self-imposed moratoriums due to the major risks and harms that they know their systems perpetuate; and Facebook has deleted its mass facial image database.

Despite the strong protections afforded to biometric data in EU data protection law, we see companies and public authorities systematically misusing “consent” and vague security justifications as a basis for the use of facial recognition and other biometric systems in ways that amount to inherently disproportionate mass surveillance practices.

While democratic countries around the world are taking steps to protect their communities, the EU is heading in the opposite direction.

A clear, unambiguous prohibition is needed in the AI Act to put a stop to the dangerous status quo.2 In 2021, the European Parliament adopted a powerful stance against biometric mass surveillance practices in the AI in criminal law report, which calls for: “a ban on any processing of biometric data, including facial images, for law enforcement purposes that leads to mass surveillance in publicly accessible spaces” (Article 31).

The AI Act is the obvious way for this important European Parliament resolution to be translated into binding, impactful law.

The urgent need for further action has also been recognised at EU Member State level. Italy has introduced Europe’s first moratorium on public facial recognition. The German coalition government has called for an EU-wide ban on biometric mass surveillance practices. Portugal dropped a law which would have legalised some biometric mass surveillance practices. And the Belgian Parliament is considering a moratorium on biometric surveillance.

Will you make (the right kind of) history?

There is already significant evidence that European residents have been systematically subjected to biometric mass surveillance practices. From football fans, to school children, to commuters, to shoppers, to people visiting LGBTQ+ bars and places of worship, the harms are real and prevalent. Via the Reclaim Your Face campaign, over 70,000 EU citizens urge you and your fellow lawmakers to better protect us from these undemocratic and harmful biometric systems.

Around the world, over 200 civil society organisations, from Burundi to Taiwan, have signed a letter calling for a global ban on biometric surveillance. As the first region to comprehensively regulate artificial intelligence, the EU’s actions – or inaction – will have major ramifications on biometric mass surveillance practices in every corner of the globe.

While dozens of US states are learning from horrendous mistakes such as the facial recognition-enabled suppression of Black Lives Matter protesters, governments in India, China and Russia are moving in the opposite direction. Which side of history will the EU be on: legitimising authoritarian technological surveillance, or choosing fundamental rights?

How can we make this a reality in the AI Act?

The AI Act must prohibit all remote (i.e. generalised surveillance) uses of biometric identification (RBI) in publicly-accessible spaces. This means that uses like unlocking a smartphone or using an ePassport gate would not be prohibited. While Article 5(1)(d) already aims to prohibit some uses of RBI, its scope is so narrow and contains so many exceptions that it practically provides a legal basis for practices that should, in fact, already be prohibited under existing data protection rules.

We therefore call on you to propose amendments to Article 5(1)(d)3 which would:

  • Extend the scope of the prohibition to cover all private as well as public actors;
  • Ensure that all uses of RBI (whether real-time or post) in publicly-accessible spaces are included in the prohibition; and
  • Delete the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards.

To ensure a comprehensive approach to the protection of biometric data, we additionally urge you to use the opportunity provided by the AI Act to put a stop to discriminatory or manipulative forms of biometric categorisation, and to properly address the risks of emotion recognition.

The EU aims to create an “ecosystem of trust and excellence” for AI and to be the world leader in trustworthy AI. Accomplishing these aims will mean putting a stop to applications of AI that undermine trust, violate our rights, and turn our public spaces into surveillance nightmares. We can promote AI that really serves people, while stamping out the most dangerous applications of this powerful technology.

That’s why the EU must truly put people at the heart of this law, and put forward amendments to the IMCO-LIBE report on the AI Act which will ensure a genuine ban on biometric mass surveillance practices.

Signed,

Reclaim Your Face

Organisational signatories:

Access Now (International)

AlgorithmWatch (European)

Alternatif Bilisim (AIA- Alternative Informatics Association) (Turkey)

anna elbe – Weitblick für Hamburg (Germany)

ARTICLE 19: Global Campaign for Free Expression (International)

Asociatia pentru Tehnologie si Internet – ApTI (Romania)

Barracón Digital (Honduras)

Big Brother Watch (UK)

Bits of Freedom (the Netherlands)

Blueprint for Free Speech (International)

Center for Civil Liberties (Ukraine)

Chaos Computer Club (Germany)

Civil Liberties Union for Europe (European)

D3 – Defesa dos Direitos Digitais (Portugal)

Digital Rights Watch (Australia)

Digitalcourage (Germany)

Digitale Freiheit (Germany)

Digitale Gesellschaft (Germany)

Digitale Gesellschaft CH (Switzerland)

Državljan D / Citizen D (Slovenia / European)

Eticas Foundation (European / International)

European Center For Not-For-Profit Law Stichting (ECNL) (European)

European Digital Rights (EDRi) (International)

European Disability Forum (EDF) (European)

Fachbereich Informatik und Gesellschaft, Gesellschaft für Informatik e.V. (Germany)

Fair Trials (International)

Fight for the Future (United States)

Football Supporters Europe (FSE) (European)

Hermes Center (Italy)

Hiperderecho (Perú)

Homo Digitalis (Greece)

Internet Law Reform Dialogue (iLaw) (Thailand)

Internet Protection Society (Russia / European)

Intersection Association for Rights and Freedoms (Tunisia)

IT-Pol Denmark (Denmark)

International Legal Initiative (Kazakhstan)

Iuridicum Remedium (IuRe) (Czech Republic)

JCA-NET (Japan)

Korean Progressive Network Jinbonet (Republic of Korea)

La Quadrature du Net (France)

Lady Lawyer Foundation (International)

LaLibre.net Tecnologías Comunitarias (Ecuador / Latin America)

Ligue des droits de l’Homme (LDH) (France)

Ligue des droits humains (Belgium)

LOAD e.V. – Association for liberal internet policy (Germany)

Masaar – Technology and Law Community (Egypt)

Panoptykon Foundation (Poland)

Privacy International (International)

Privacy Network (Italy)

Statewatch (Europe)

Usuarios Digitales (Ecuador)

Wikimedia Deutschland (Germany / European)

Wikimedia France (France / European)

Individual signatories:

Douwe Korff, Emeritus Professor of International Law

Dr Vita Peacock, Anthropologist

Edson Prestes, Full Professor, Federal University of Rio Grande do Sul (Brazil)

1 For example: https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway; https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html; https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/; https://edri.org/our-work/dangerous-by-design-a-cautionary-tale-about-facial-recognition/; https://www.law.georgetown.edu/privacy-technology-center/publications/garbage-in-garbage-out-face-recognition-on-flawed-data/

2 The General Data Protection Regulation, Article 9, paragraph 4, foresees additional protections of biometric data: “Member States may maintain or introduce further conditions, including limitations, with regard to the processing of … biometric data”.

3 This must be supported by a new Recital to better define “remote” use cases as those where cameras/devices are installed at a distance that creates the capacity to scan multiple persons, which in theory could identify one or more of them without their knowledge. Warning notices do not annul such a definition.

How can you influence the AI Act in order to ban biometric mass surveillance across Europe?

The EU is currently negotiating the Artificial Intelligence (AI) Act. This future law offers the chance to effectively ban biometric mass surveillance. This article aims to offer an overview of how the EU negotiates its laws and the key AI Act moments in which people can make their voices heard.


Two months after Reclaim Your Face launched our European Citizens’ Initiative (February 2021), the EU proposed a new law: the AI Act. In April 2021, the draft law included a ban on some forms of biometric surveillance. Despite its shortcomings, the mere mention of the word “prohibit” in the draft law was a huge success for our campaign.

The AI Act draft showed that, if it wants, the EU has the power to truly ban biometric mass surveillance practices. As a result, we decided that the negotiations around this law would be crucial to making our Reclaim Your Face campaign demands real.

Most importantly, it showed that the calls launched by tens of thousands of people and civil society organisations across Europe since October 2020 have had a real impact.

How are EU laws negotiated?

The process of EU law-making can be difficult to grasp. The graphic below explains the role of the European Commission, the negotiations between the European Parliament and the Council of the EU, as well as the different actions we have taken, and continue to take, during these steps.

As you can see, the European Commission (EC) is the body that proposes a new EU law. After preparatory work, the EC writes up a draft, publishes it and sends it to the Parliament and the Council. Both the Parliament and the Council debate internally. As a result, each of them will form a position on the EC draft. Next, they meet – together with the EC – in a negotiation step called ‘trilogues’. Unfortunately, trilogues are notorious for their opacity and lack of opportunities for public scrutiny.

If you want to know more about where EU legislative and non-legislative Proposals come from, check EDRi’s Activist Guide to the Brussels Maze.

The role of the Parliament is crucial

The European Parliament is the only directly-elected EU body. For this reason, it is probably the body that most takes into account people’s voices. Influencing the opinion of the Parliament – before the trilogues start – is therefore a key component of civil society’s work on EU laws.

The European Parliament is formed of 705 Members (MEPs) from all 27 EU Member States. Most MEPs are also part of Parliament Committees. The Committees have a crucial role in forming the Parliament’s position.

One or more Committees are assigned to write a report that forms the basis of the entire European Parliament’s position. The AI Act is handled jointly in the Parliament by two Committees: LIBE (Civil Liberties, Justice and Home Affairs) and IMCO (Internal Market and Consumer Protection).

Each Committee has a Rapporteur (overall lead) and several Shadow Rapporteurs (lead for their political group). The important thing to remember is that these MEPs are all key players in shaping the Parliament’s report on the AI Act. They may also be influenced by other MEPs in their Committee(s), who can suggest changes to the draft report (“amendments”), as well as by the heads of their political group. See more below.

When can people strategically influence the negotiations on the AI Act?

During the negotiations of the lead committees

The lead committees in the Parliament (IMCO and LIBE) are working on their report on the AI Act up until October 2022. This means that we should already be raising awareness of our work and our demands among MEPs in those two committees. What are some crucial steps of the negotiation around the LIBE–IMCO Committee report?

First, the lead Rapporteurs Benifei and Tudorache publish an IMCO–LIBE draft report (expected April 2022), which represents the parts of the position on which they could agree. Afterwards, the other MEPs in IMCO and LIBE can propose amendments to this draft report, including on the areas that need more democratic scrutiny. The tabling of amendments is expected to run until 18 May 2022.

The amendments are then negotiated among a selected number of MEPs in the LIBE-IMCO committees (the Rapporteurs and Shadow Rapporteurs), and agreed upon by coming up with compromises. The negotiations around these amendments and the agreement on a compromise text for the LIBE–IMCO report are expected to happen between May and October 2022.

After lead committees conclude their report, and just before the Plenary vote

Once out of committee negotiations, this joint IMCO–LIBE report will be presented to the full Parliament, known as the Plenary. When the report is presented to the Plenary, there is also an opportunity for last-minute amendments to the committees’ report to be put forward. This tabling of amendments before the Plenary vote is yet another moment in which MEPs may introduce protections for people against biometric mass surveillance. 

After any final amendments to the IMCO–LIBE report are voted on, all 705 MEPs will vote on whether or not to accept this final version of the IMCO–LIBE report as the Parliament’s position. Currently, the 705 MEPs are scheduled to vote on the final report in November 2022.

In parallel, the Council of Member States is currently trying to make the in-principle ban on biometric surveillance from the Commission’s draft weaker and narrower. If the Parliament agrees on the need for a full ban on biometric mass surveillance practices, we have a chance to fight back against the Council’s proposal.

Supporters of Reclaim Your Face can play an important role in the negotiations of the EU’s Artificial Intelligence Act. Are you ready for bold, strategic and direct action? Subscribe to EDRi’s mailing list to be kept in the loop and follow our social media channels.

About Clearview AI’s mockery of human rights, those fighting it, and the need for the EU to intervene

Clearview AI describes itself as ‘The World’s Largest Facial Network’. However, a quick search online reveals that the company has been involved in several scandals, covering the front pages of many publications for all the wrong reasons. In fact, since the New York Times broke the story about Clearview AI in 2020, the company has been constantly criticised by activists, politicians, and data protection authorities around the world. Read below a summary of the many actions taken against the company that hoarded 10 billion images of our faces.


How did Clearview AI build a database of 10 billion images?

Clearview AI gathers data automatically, through a process called (social media) online scraping. The specific way Clearview AI gathers its data enables biometric mass surveillance, a practice also adopted by other actors such as PimEyes.

The company scrapes the pictures of our faces from the entire internet – including social media applications – and stores them on its servers. Through this gathering and storage of data via online scraping, Clearview AI managed to create a database of 10 BILLION images. Now, the company uses an algorithm that matches a given face against all the faces in its 10-billion-image database: (virtually) everyone and anyone.
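To make that mechanism more concrete, here is a minimal, purely illustrative Python sketch of the general technique such services rely on: each scraped photo is reduced to a numerical face ‘embedding’ stored alongside its source metadata, and a new probe face is identified by finding the most similar stored embedding. Everything below is hypothetical – random vectors stand in for real embeddings produced by a neural network, and the match_face function, example.org URLs and threshold are invented for illustration. This is not Clearview AI’s actual code.

    # Illustrative sketch only: how face matching against a scraped database
    # conceptually works. Random vectors stand in for real face embeddings.
    import numpy as np

    rng = np.random.default_rng(0)

    # A toy "database" of face embeddings (10 entries standing in for billions),
    # each a 128-dimensional vector keyed by the hypothetical source URL.
    database = {f"https://example.org/photo{i}": rng.normal(size=128) for i in range(10)}

    def match_face(probe, threshold=0.6):
        """Return (url, score) of the most similar stored embedding by cosine
        similarity, or None if nothing clears the threshold."""
        best_url, best_score = None, -1.0
        for url, emb in database.items():
            score = np.dot(probe, emb) / (np.linalg.norm(probe) * np.linalg.norm(emb))
            if score > best_score:
                best_url, best_score = url, score
        return (best_url, best_score) if best_score >= threshold else None

    # A probe face: a noisy copy of one stored embedding, as if the same person
    # appeared in a new photo submitted for identification.
    probe = database["https://example.org/photo3"] + rng.normal(scale=0.1, size=128)
    print(match_face(probe))

The point of the sketch is that once such a database exists, identifying anyone in it is a cheap similarity search, which is exactly why the scale of the scraping is what turns facial recognition into mass surveillance.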

Creepily enough, this database is available to any company, law enforcement agency or government that can pay for access.

This will go on, as long as we don’t put a stop to Clearview AI and its peers. Reclaim Your Face partners and other organisations have taken several actions to limit Clearview AI in France, Italy, Germany, Belgium, Sweden, the United Kingdom, Australia and Canada.

In several EU countries many activists, Data Protection Authorities and watchdogs took action.

In May 2021, a coalition of organisations (including noyb, Privacy International (PI), Hermes Center and Homo Digitalis) filed a series of submissions against Clearview AI, Inc. The complaints were submitted to data protection regulators in France, Austria, Italy, Greece and the United Kingdom.

Here are some of the Data Protection Authorities and watchdogs’ decisions:

France

Following complaints from Reclaim Your Face partner Privacy International and from individuals about Clearview AI’s facial recognition software, the French data protection authority (CNIL) decided in December 2021 that Clearview AI should cease collecting and using data from data subjects in France.

Italy

After individuals (including Riccardo Coluccini) and Reclaim Your Face organisations (among them the Hermes Center for Transparency and Digital Human Rights and Privacy Network) filed complaints against Clearview AI, Italy’s data privacy watchdog (Garante per la Protezione dei dati personali) fined Clearview AI the highest amount possible: 20 million euros. The decision includes an order to erase the data relating to individuals in Italy and bans any further collection and processing of data through the company’s facial recognition system.

Germany

Following an individual complaint from Reclaim Your Face activist Matthias Marx, the Hamburg Data Protection Agency ordered Clearview to delete the mathematical hash representing a user’s profile. As a result, the Hamburg DPA deemed Clearview AI’s biometric photo database illegal in the EU. However, Clearview AI has only deleted Matthias Marx’s data and the DPA’s case is not yet closed.

Belgium

Although the use of this software has never been legal in Belgium, and after initially denying its deployment, the Ministry of the Interior confirmed in October 2021 that the Belgian Federal Police had used the services of Clearview AI. The use stemmed from a trial period the company provided to the Europol Task Force on Victim Identification. While admitting the use, the Ministry of the Interior also emphasised that Belgian law does not allow it. This was later confirmed by a ruling of the Belgian police watchdog, which stated that the use was unlawful.

Sweden

In February 2021, the Swedish data protection authority (IMY) decided that a local Swedish police force’s use of Clearview’s technology involved the unlawful processing of biometric data for facial recognition. The DPA also pointed out that the police had failed to conduct a data protection impact assessment. As such, the authority fined the local police authority €250,000 and ordered it to inform the people whose personal data had been sent to the company.

Clearview AI is in trouble outside of the European Union too

United Kingdom 

On 27 May 2021, Privacy International (PI) filed complaints against Clearview AI with the UK’s independent regulator for data protection and information rights law, the Information Commissioner’s Office (ICO). The ICO, jointly with the OAIC, which regulates the Australian Privacy Act, had been conducting a joint investigation into Clearview AI since 2020. Last year, the ICO announced its provisional intent to impose a fine of over £17 million based on Clearview’s failure to comply with UK data protection laws.

In May 2022, ICO issued a fine to Clearview AI Inc of £7,552,800 and an enforcement notice, ordering the company to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.

Australia 

Meanwhile, the OAIC has reached a decision and ordered the company to stop collecting facial biometrics and biometric templates from people in Australian territory, and to destroy all existing images and templates that it holds.

Canada 

Canadian authorities were unequivocal in ruling that Clearview AI violated their citizens’ right to privacy and, furthermore, that this use constitutes mass surveillance. Their statement highlights the clear link between Clearview AI and biometric mass surveillance, which treats all citizens as suspects of crime.

War and Clearview AI

In an already distressing war context, Ukraine’s defence ministry is using Clearview AI’s facial recognition technology, reportedly to vet people at checkpoints, unmask Russian assailants, combat misinformation and identify the dead.

What happens when Clearview AI decides to offer its services to military forces with whom we disagree?

It is deeply worrying that Clearview AI’s technologies are reportedly being used in warfare. What happens when Clearview AI decides to offer its services to military forces with whom we disagree? What does this say about the geopolitical power we allow – as a society – for surveillance of private actors? By allowing Clearview AI into military operations, we are opening Pandora’s box for technologies that have been ruled incompatible with people’s rights and freedoms to be deployed into a situation of literal life-or-death. Clearview AI’s systems show documented racial bias and have facilitated several traumatic wrongful arrests of innocent people around the world. Even a system that is highly accurate in lab conditions will not perform as accurately in a war zone. This can lead to fatal results.

Clearview AI mocks national data protection authorities. We must act united! The EU must step up and use the AI Act to end the mocking of people’s rights and dignity.

Pressure is mounting, but Clearview AI is not backing down. The company, which started as a service for law enforcement uses only, is now telling investors that it is expanding towards the monitoring of gig workers, among others.

All the data protection authorities mentioned above have strong arguments to justify their decisions. Yet in response, Clearview AI’s CEO issues statements that mock the fines imposed by national authorities.

National agencies and watchdogs cannot rein in Clearview AI alone. The EU must step in and ensure Clearview AI does not also mock the fundamental nature of human rights.

Reclaim Your Face campaigns to stop Clearview AI and any other uses of technology that create biometric mass surveillance. 




ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now, ARTICLE19, Bits of Freedom, CCC, Defesa dos Direitos Digitais (D3), Digitalcourage, Digitale Gesellschaft CH, Digitale Gesellschaft DE, Državljan D, EDRi, Electronic Frontier Finland, epicenter.works, Hermes Center for Transparency and Digital Human Rights, Homo Digitalis, IT-Political Association of Denmark, IuRe, La Quadrature du Net, Liberties, Metamorphosis Foundation, Panoptykon Foundation, Privacy International, SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch, AlgorithmWatch/CH, All Out, Amnesty International, Anna Elbe, Aquilenet, Associazione Luca Coscioni, Ban Facial Recognition Europe, Big Brother Watch, Certi Diritti, Chaos Computer Club Lëtzebuerg (C3L), CILD, D64, Danes je nov dan, Datapanik, Digitale Freiheit, DPO Innovation, Electronic Frontier Norway, European Center for Not-for-profit Law (ECNL), European Digital Society, Eumans, Football Supporters Europe, Fundación Secretariado Gitano (FSG), Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung, Germanwatch, German ACM chapter, Gesellschaft für Informatik (German Informatics Society), GONG, Hellenic Association of Data Protection and Privacy, Hellenic League for Human Rights, info.nodes, Irish Council for Civil Liberties, JEF (Young European Federalists), Kameras Stoppen, Ligue des droits de l’Homme (FR), Ligue des Droits Humains (BE), LOAD e.V., Ministry of Privacy, Privacy Lx, Privacy Network, Progetto Winston Smith, Reporters United, Saplinq, Science for Democracy, Selbstbestimmt.Digital, STRALI, Stop Wapenhandel, The Good Lobby Italia, UNI-Europa, Unsurv, Vrijbit, Wikimedia FR, Xnet


Reclaim Your Face is also supported by:

Jusos, Piratenpartei DE, Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left



Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.