Between 6 and 9 November 2022, more than 20 activists from across Europe gathered in Brussels to celebrate the successes of the Reclaim Your Face movement. We got to meet each other in real life after months of online organising, reflected on our wide range of decentralised actions, and learned from each other how to couple grassroots organising with EU advocacy aimed at specific events and EU institutions. Read on to see what we did.
Biometric mass surveillance involves the indiscriminate and/or arbitrary monitoring, tracking, and processing of biometric data related to individuals and/or groups. Biometric data encompasses, but is not limited to, fingerprints, palmprints, palm veins, hand geometry, facial features, DNA, iris patterns, typing rhythm, gait, and voice.
Though often targeted at specific groups, the use of mass surveillance technologies is becoming prevalent in publicly accessible spaces across Europe. As a result, football fans are increasingly affected by them.
Apart from its undemocratic nature, there are many reasons why biometric mass surveillance is problematic for human rights and fans’ rights.
Firstly, the practices around biometric mass surveillance in and around stadia involve the collection of personal data, which may be shared with third parties and/or stored insecurely. All of this biometric data can be used in the service of mass surveillance.
Secondly, fans’ culture is under threat because mass surveillance can be deployed to control or deter many of the core elements that bring people together in groups and in stadia. To be sure, biometric mass surveillance can create a ‘chilling effect’ on individuals. Knowing one is being surveilled can lead people to feel discouraged from legitimately attending pre-match gatherings and fan marches, or joining a protest.
Moreover, women, people of colour, and fans who belong to the LGBT+ community may be at higher risk of being targeted or profiled.
“There are two good reasons why fans should pay close attention to the question of biometric mass surveillance. First, we have a right to privacy, association, and expression, just like everybody else. And second, we’re often used as test subjects for invasive technologies and practices. With this in mind, we encourage fans to work at the local, national, and European levels to make sure that everybody’s fundamental rights are protected from such abuses.”
Football fans and mass surveillance
The situation differs from country to country, but there are countless examples of fans being subjected to intrusive, or in some cases, unauthorised, surveillance:
Belgium: In 2018, second-tier club RWD Molenbeek announced plans to deploy facial recognition technology to improve queuing times at turnstiles.
Denmark: Facial recognition technology is used for ticketing verification at the Brøndby Stadion. The supplier claims that the Panasonic FacePro system can recognise people even if they wear sunglasses.
France: FC Metz allegedly used an experimental system to identify people who were subject to civil stadium bans, detect abandoned objects, and enhance counter-terror measures. Following several reports, the French data protection watchdog (CNIL) carried out an investigation which determined that the system relied on the processing of biometric data. In February 2021, CNIL ruled the use of facial recognition technology in the stadium to be unlawful.
Hungary: In 2015, the Hungarian Civil Liberties Union (HCLU) filed a complaint at the constitutional court challenging the use of palm “vein” scanners at some football stadia after fans of several clubs objected to the practice.
The Netherlands: In 2019, AFC Ajax and FC Den Bosch outlined plans to use facial recognition technology to validate and verify e-tickets.
Spain: Atlético Madrid declared their intention to use facial recognition systems and implement cashless payments from the 2022-23 season onwards. Valencia, meanwhile, have already deployed facial recognition technology designed by FacePhi to monitor and control access to their stadium. Several clubs, including Sevilla FC, also use fingerprint scanning to identify season ticket holders at turnstiles.
In April 2021, the European Commission proposed a law to regulate the use of Artificial Intelligence (AI Act). Since becoming part of the ‘Reclaim Your Face’ coalition, FSE has joined a growing number of organisations which are calling for the act to include a ban on biometric mass surveillance.
Currently, the European Parliament is forming its opinion on the AI Act proposal. In the past, it has supported the demand for a ban, but more pressure is needed. That is why we must raise awareness among politicians about the impact of biometric mass surveillance on fans’ rights and dignity.
What can fans do?
Research the use of mass surveillance in football and share the findings with other fans. Write to EDRi’s campaigns and outreach officer Belen (belen.luna[at]edri.org) or email info[at]fanseurope.org if your club or local stadium operator deploys facial recognition cameras or other forms of mass surveillance.
Raise awareness among fans, community organisers, and local politicians as to the prevalence and impact of mass surveillance.
Organise locally and through national and pan-European representative bodies to contest the use of mass surveillance in football.
After our timely advocacy actions with over 70 organisations, the amendments to the IMCO–LIBE Committee Report on the Artificial Intelligence Act clearly state the need for a ban on Remote Biometric Identification. In fact, 24 individual MEPs, representing 158 MEPs in total, demand a complete ban on biometric mass surveillance practices. Now we need to keep up the pressure at European and national levels to ensure that when the AI Act is officially passed, likely in 2023 or 2024, it bans biometric mass surveillance.
In April 2021, as a direct result of the work of civil society organisations like Reclaim Your Face, the European Commission put forward the draft for the EU Artificial Intelligence Act. The draft explicitly recognised the serious human rights risks of biometric mass surveillance by including a prohibition on ‘remote biometric identification’ (RBI) in publicly-accessible spaces.
However, the original RBI ban proposed by the European Commission was weak in three main ways:
It banned ‘real-time’ (live) uses of RBI systems, but not the far more common ‘post’ uses. This means that authorities could use RBI after the data is collected (hours, days or even months after!) to turn back the clock, identifying journalists, people seeking reproductive healthcare, and more.
It only applied the ban to law enforcement actors (i.e. police). As a result, we could all still be surveilled in public spaces by local councils, central governments, supermarket owners, shopping centre managers, university administrations and any other public or private actors.
It also contained a series of wide and dangerous exceptions that could be used as a “blueprint” for how to conduct biometric mass surveillance practices – undermining the whole purpose and essence of the ban!
Whilst the inclusion of a ban was a big win, these limitations meant the fight was not over. For it to become law, the next steps of the process require the EU’s 705 Members of the European Parliament (MEPs) and 27 member state governments to agree to a ban.
A hot topic in the European Parliament
In the EU Parliament, the MEPs who work in the Civil Liberties (LIBE) and Internal Markets (IMCO) working groups (also known as ‘Committees’) were given the joint responsibility to lead on the Parliament’s official position on the AI Act. As such, they presented a shared IMCO – LIBE report in March 2022.
After that, they had to present their amendments in a process by which MEPs are able to show which parts of the AI Act are most important to them, and how they would like to see improvements.
Beyond a strengthened RBI ban, our demands to the Committees included:
Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
Properly addressing the risks of emotion recognition.
In June 2022, MEPs in the LIBE and IMCO Committees submitted ‘amendments’ to the AI Act showing the results and power of our actions: hundreds of amendments were tabled on biometrics, showing the importance MEPs put on this topic.
Amendments show major support for a ban
Who supported our demands?
In total, 177 MEPs across 6 out of the 7 political groups supported a stronger RBI ban in the AI Act!
24 MEPs, from across 5 political groups, were champions of the Reclaim Your Face campaign! They tabled amendments for a full and unequivocal ban on all types of remote biometric identification (RBI) in publicly-accessible spaces. Two things stand out about this group: 1) it includes several of the MEPs who are responsible for the AI Act on behalf of their political group (called ‘Rapporteurs’ or ‘Shadows’) – a strong sign of broad support, which means those 24 individual MEPs in fact represent a staggering 158 MEPs who demand a complete ban on biometric mass surveillance practices; and 2) some of the MEPs tabled these amendments ‘on behalf of’ their entire political group.
18 MEPs went almost as far as their colleagues, supporting a full ban on ‘real-time’ RBI in publicly-accessible spaces, by all actors, and without conditions for exceptions. However, these MEPs did not propose to extend the ban to ‘post’ uses of RBI. Given that these MEPs clearly understand the threats and risks of biometric mass surveillance, this gives us good ground to go forward and convince them that ‘post’ uses are equally, if not even more, harmful than real-time uses.
Dozens of MEPs additionally proposed two new and important bans. These would explicitly prohibit the police from using private biometric databases, and prohibit the creation of biometric databases through mass/untargeted methods such as online scraping or the mass scraping of CCTV footage. If accepted, this would further protect people from biometric mass surveillance, particularly through the use of services like Clearview AI.
Furthermore, 1 additional MEP supported removing all the exceptions to the RBI ban!
Who opposed our recommendations?
Opposition to a ban on RBI was very limited.
Just three MEPs – all from the European People’s Party (EPP) – argued that RBI in publicly-accessible spaces should only be classified as high-risk, not prohibited. Nevertheless, it is notable that these MEPs still recognised that RBI is very risky.
Separately, 14 MEPs supported a ban in principle, but argued that it should be less restrictive. This group includes both Shadow Rapporteurs for the EPP, alongside 12 colleagues from the right-leaning Identity & Democracy (ID) group, the European Conservatives and Reformists (ECR) group and the EPP itself.
Who said ‘yes, but…’?
7 additional MEPs from the ECR and EPP groups were ambivalent, putting forward some amendments which would strengthen the ban but also proposing amendments which would weaken it.
So what’s the balance in the European Parliament?
Overall, this is a really positive set of amendments. It shows clear and significant political will for a stronger ban on biometric mass surveillance, taking us a step closer to a genuine EU ban on these chilling practices.
The perspective of the Parliament is clear: we need a strong ban on biometric mass surveillance!
Among those calling for the most comprehensive form of a ban – which Reclaim Your Face has argued is necessary to protect people’s rights and freedoms – is MEP Brando Benifei from the S&D group. Mr Benifei is one of two MEPs who share the ultimate responsibility for the Parliament’s position on the AI Act, so his support for a full ban is very powerful and meaningful.
The other co-lead MEP is MEP Dragos Tudorache from the Renew group. He is one of the MEPs who supported all of our demands, except the one that would extend the ban to ‘post’ uses. Whilst we still, therefore, have work to do to convince Mr Tudorache and his colleagues, we can already see clear progress in his thinking. Last year he commented that he does not believe that a prohibition is the right approach to RBI. Now, Mr Tudorache says he agrees with us that RBI is a key human rights issue. His support is therefore also very important, and we believe that he will be open to learning more about how post uses of RBI pose a threat to organising, journalism and other civil freedoms.
We are also very proud of the commitment and effectiveness of the organisations in the Reclaim Your Face coalition. The amendments showed that the Parliament clearly listened and that the power of our joint actions is truly huge!
The fight is still far from over.
Whilst RBI in publicly-accessible spaces is a major part of biometric mass surveillance, practices such as biometric categorisation and emotion recognition (making predictions about people’s ethnicity, gender, emotions or other characteristics based on how they look or act) can also lead to biometric mass surveillance. That’s why we are also advocating for strong bans on both practices in the AI Act – which we are pleased to see have been put forward by several MEPs.
There is also a long way to go in the political process. These amendments need to be turned into compromise amendments, and then voted on so that the entire Parliament officially agrees. Only then will negotiations begin with the member state governments (the Council), where more permissive home affairs ministers have clashed with more rights-protective justice ministers over whether to weaken or strengthen the RBI ban.
This emphasises why now, more than ever, we need to keep up the pressure at European and national levels to ensure that – when the AI Act is officially passed, likely in 2023 or 2024 – it bans biometric mass surveillance!
Get in contact with us to find out how to support Reclaim Your Face!
We campaigned during a pandemic and found creative ways to gather signatures while respecting privacy and protecting data. We adapted to the political reality and managed to influence the EU’s negotiations. We built a coalition of 76 organisations from over 20 EU countries. We led national actions and we won.
Campaigning for privacy with privacy
Out of the 90 ECIs ever started, only 6 have been able to reach the threshold of 1 million signatories. All 6 used targeted social media advertising. In Reclaim Your Face we have a commitment to everyone’s privacy. Therefore, we gathered almost 80 thousand signatures without using any targeted social media advertisements (or, as we call them, surveillance ads). Every single ECI signatory was reached directly by one of our partners or their supporters – by sharing our posts, sending newsletters and collecting signatures in the streets.
A challenge? Yes. But organic reach gave us a great opportunity to have direct interactions with other organisations, a high level of engagement from our supporters, and quality conversations about biometric mass surveillance. In fact, all of these factors played out to make our petition the “most politically powerful ECI ever”, according to an insider at the European Economic and Social Committee.
This is how we did it:
Coalition building: Different voices across Europe
Reclaim Your Face aimed to have a diversity of voices represented in our call to ban biometric mass surveillance. We listened and worked especially with groups most affected by this exploitative practice.
In total, we were joined by 76 organisations from 20 Member States – who represent over half a million supporters. Our coalition has been the backbone of our success.
Volunteers for paper signature collection
Once the pandemic allowed us to be present in offline spaces, we decided to organise a Bootcamp for those who wanted to help us gather signatures. We trained over 80 people from more than 7 countries on 3 topics: biometric mass surveillance issues, ECI data protection practices and offline engagement methods.
The new Reclaim Your Face volunteers collected signatures in their own cities and engaged with people in the streets, at universities, in parks and in other public spaces. Activists in Portugal, Italy, Germany, Czechia and Greece made time in their days to share their thoughts on biometric mass surveillance, inform other citizens about its incompatibility with human rights and collect paper signatures for our ECI.
Local and national campaigns
Reclaim Your Face was decentralised, building communities in more than 6 countries that led national actions and successes. Among many, here are some of our national wins:
The campaign’s German movement led by EDRi members Chaos Computer Club (CCC), Digitale Gesellschaft and Digitalcourage worked with more than 16 organisations. They organised over 14 events and were part of social media stunts, Twitter storms, as well as offline peaceful manifestations. Almost 30,000 German citizens signed the campaign’s European Citizens’ Initiative, proving that people-powered action can create meaningful change.
In November 2021, the new German government announced their highly-anticipated coalition deal, including the strongest commitments seen so far in Europe to “rule … out” “biometric recognition in public”. They further called to “reject comprehensive video surveillance and the use of biometric recording for surveillance purposes”.
The Italian national campaign, led by Hermes Center with more than 9 organisations in the coalition, has coordinated many actions too, across almost 2 years.
In Czechia, leading organisation Iure has also organised many actions, from creative work like comics and video clips to paper signature collection days.
Two of the leading actions for Reclaim Your Face in Czechia have been the fight against biometric cameras at Prague airport and a seminar organised in the Chamber of Deputies, where activists discussed biometric cameras with police and political representatives.
As a result of international pressure, in September 2021 a Draft Law on Internal Affairs, which contained provisions legalising a massive biometric video surveillance system, was withdrawn from further procedure. This was an amazing win for human rights and a result of Share Foundation’s national campaign Thousands of Cameras, a two-and-a-half-year-long battle against smart cameras installed in Belgrade by the Ministry of Interior and supplied by Chinese tech giant Huawei.
The Portuguese lead organisation in the Reclaim Your Face coalition D3 (Defesa Dos Direitos Digitais) led actions to raise awareness, as the Portuguese government proposed video surveillance and facial recognition law. Reclaim Your Face organisations and EDRi sent a letter to representatives of Portugal’s main political parties, supporting D3’s fight against biometric mass surveillance practices. Together, we urged politicians to reject this dystopian law. The proposal was later withdrawn.
EU level successes
In parallel with our work at the national level, we united and coordinated EU-level actions.
In fact, in May 2022 we could see the results of our actions. After meeting with key MEPs working on the EU’s AI Act proposal, delivering an open letter signed by 53 organisations and publishing multiple op-eds, both co-lead MEPs on the AI Act announced their support for a ban. Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.
Today we say goodbye to our European Citizens Initiative and are humbled by the tens of thousands of people who signed it.
However, Reclaim Your Face continues!
We envision a society in which no one is harmed by biometric mass surveillance. Such a society is only possible when biometric mass surveillance is banned by law and in practice. Together with our partners, we continue to fight to make this a reality by advocating for an AI Act that puts people at its core.
A big success for Homo Digitalis: The Hellenic DPA fines Clearview AI €20 million
On July 13 2022, following a complaint filed by Homo Digitalis in May 2021 representing our member and data subject Marina Zacharopoulou, the Hellenic Data Protection Authority (HDPA) issued Decision 35/2022 imposing a fine of 20 million euros on Clearview AI for its intrusive practices. By the same Decision, the DPA prohibits that company from collecting and processing the personal data of data subjects located in Greece using facial recognition methods and requires it to delete immediately any data it has already collected.
Specifically, in May 2021, an alliance of civil society organisations consisting of Homo Digitalis, Privacy International, Hermes Center, and noyb filed complaints before the competent authorities in Greece, the United Kingdom, Italy, Austria and France against Clearview AI for its mass surveillance practices through facial recognition.
Earlier this year, the Italian Data Protection Authority had decided to fine the company €20 million, while the UK’s equivalent authority had decided to fine it £7.5 million.
The €20 million fine imposed by the DPA today is another strong signal against intrusive business models of companies that seek to make money through the illegal processing of personal data. At the same time, it sends a clear message to law enforcement authorities working with companies of this kind that such practices are illegal and grossly violate the rights of data subjects.
Clearview AI is an American company, founded in 2017, that develops facial recognition software. It claims to have “the largest known database of more than three billion facial images”, collected from social media platforms and other online sources. Its automated tool visits public websites and collects any images it detects that contain human faces. Along with these images, the collector also gathers accompanying metadata, such as the title of the website and its source link. The collected facial images are then processed by Clearview AI’s facial recognition software to build the company’s searchable database. Clearview AI sells access to this database to private companies and to law enforcement agencies, such as police authorities, internationally.
The full text of Decision 35/2022 can be found here (in Greek only).
Week of actions: Reclaim Your Face Italy and the need for a real EU ban on biometric mass surveillance
During the second week of May 2022, Reclaim Your Face Italy held a week of actions in Milan, Torino and Como for an EU ban on biometric mass surveillance. They collected signatures on the streets of the three cities, joined an event organised by the Greens-European Free Alliance Group and made a field visit to Como, the first Italian city to implement facial recognition technology in a public park.
In its decision, the Italian data protection authority argued that Como’s system lacked a legal basis and, as designed, would constitute a form of mass surveillance. Thanks to the actions of Hermes Center, Associazione Luca Coscioni, Certi Diritti, CILD, Eumans, info.nodes, The Good Lobby, Privacy Network, Progetto Winston Smith, and StraLi, a temporary ban on facial recognition technology in public spaces was later introduced. This moratorium will be in force until December 2023.
On 10 May 2022, Reclaim Your Face hosted paper signature collection stands in three big Italian cities: Milan, Torino, and Rome. This paper signature collection was organised by Hermes Center and two national Reclaim Your Face partners: StraLi and CILD. The activists stood in front of universities and in city centres to talk about the risks of biometric mass surveillance, giving out stickers, booklets, Reclaim Your Face T-shirts and bags.
Event with Greens-European Free Alliance Group
Colleagues from Hermes Center, Riccardo Coluccini and Davide Del Monte, joined as speakers for the event ‘Stop Biometric Surveillance – Time for an EU ban on biometric mass surveillance in public spaces’ to explain why Italy must keep campaigning and pushing for a real ban on biometric surveillance in the EU.
In May 2022, together with representatives from the Greens-European Free Alliance Group and journalists from the Czech Republic, Hermes Center researchers visited the park where facial recognition cameras were installed and talked about their investigation. While the cameras are still there, the facial recognition and other algorithmic functions are turned off at the moment. The group later met with local journalist Andrea Quadroni, who talked about the migrant crisis that hit Como in 2016.
The trip to Como is part of the Greens-European Free Alliance Group’s newly released mini-documentary, while articles about the actions and results of Reclaim Your Face in Italy were carried by national TV and radio stations in the Czech Republic.
Reclaim Your Face’s coalition & 53 orgs made it: Leading EU politician speaks against biometric mass surveillance
This month we worked together on specific actions, with 53 organisations taking part, to push for the Artificial Intelligence Act to include a ban on biometric mass surveillance. This builds on the hard work of the whole Reclaim Your Face coalition over the last two years. Our actions have had amazing results, with even the co-lead MEP on the Artificial Intelligence Act committing to table amendments for a more comprehensive ban on RBI in publicly-accessible spaces!
Here is a snapshot of our joint actions over the past week:
Reclaim Your Face organisations from Italy, Germany, France and Belgium met with key MEPs working on the EU’s AI Act proposal, including co-lead MEP Dragos Tudorache, co-lead MEP Brando Benifei, and MEP Birgit Sippel.
53 organisations signed our Reclaim Your Face open letter asking MEPs to protect fundamental rights in the AI Act by prohibiting all remote (i.e. generalised surveillance) uses of biometric identification (RBI) in publicly- accessible spaces.
The open letter also called for:
Putting a stop to discriminatory or manipulative forms of biometric categorisation; and
Properly addressing the risks of emotion recognition.
Our demands were published in various EU policy outlets and in France.
Our tireless actions to call for a Ban on Biometric Mass Surveillance in the Artificial Intelligence Act have had amazing results so far!
Following our meeting, the co-lead MEP on the AI Act, Dragos Tudorache (Renew) announced that he personally will table amendments for a more comprehensive ban on RBI in publicly-accessible spaces, calling RBI “clearly highly intrusive … in our privacy, our rights”.
This is the result of hearing the views of his colleagues in a majority of the Parliament’s political groups – with several lead MEPs committing publicly to submitting amendments for a full ban on biometric mass surveillance in the AI Act – and it suggests the strong influence of the calls of the dozens of organisations and 71,000 people behind Reclaim Your Face’s European Citizens’ Initiative.
However, we have not won – yet. There are still many months of negotiations ahead on the AI Act.
You can support Reclaim Your Face individually by signing our ECI, and as an organisation by getting in contact with us so we can explore paths of collaboration.
Thank you to all our partners and supporters for making this possible! The response to our actions suggests that there is a clear majority in the Parliament supporting our call.
How can you influence the AI Act in order to ban biometric mass surveillance across Europe?
The EU is currently negotiating the Artificial Intelligence (AI) Act. This future law offers the chance to effectively ban biometric mass surveillance. This article aims to offer an overview of how the EU negotiates its laws and the key AI Act moments in which people can make their voices heard.
In April 2021, two months after Reclaim Your Face launched our European Citizens’ Initiative (February 2021), the EU proposed a new law: the AI Act. The draft law included a ban on some forms of biometric surveillance. Despite its shortcomings, the mere mention of the word “prohibit” in the draft was a huge success for our campaign.
The AI Act draft showed that, if it wants to, the EU has the power to truly ban biometric mass surveillance practices. As a result, we decided that the negotiations around this law would be crucial to making our Reclaim Your Face campaign demands real.
Most importantly, it showed that the calls launched by tens of thousands of people and civil society organisations across Europe since October 2020 have had a real impact.
How are EU laws negotiated?
The process of EU law-making can be difficult to grasp. The graphic below explains the role of the European Commission, the negotiations between the European Parliament and the Council of the EU, and the actions we take during these steps.
As you can see, the European Commission (EC) is the body that proposes a new EU law. After preparatory work, the EC writes up a draft, publishes and sends it to the Parliament and the Council. Both the Parliament and the Council debate internally. As a result, each of them will form a position on the EC draft. Next, they meet – together with the EC – in a negotiation step called ‘trilogues’. Unfortunately, trilogues are notorious for their opacity and lack of opportunities for public scrutiny.
The European Parliament is the only directly-elected EU body. For this reason, it is probably the body that most takes into account people’s voices. Influencing the opinion of the Parliament – before the trilogues start – is therefore a key component of civil society’s work on EU laws.
The European Parliament is formed of 705 Members (MEPs) from all 27 EU Member States. Most MEPs are also part of Parliament Committees. The Committees have a crucial role in forming the Parliament’s position.
One or more Committees are assigned to write a report that forms the basis of the entire European Parliament’s position. The AI Act is handled jointly in the Parliament by two Committees: LIBE (Civil Liberties, Justice and Home Affairs) and IMCO (Internal Markets and Consumer Protection).
Each Committee has a Rapporteur (overall lead) and several Shadow Rapporteurs (lead for their political group). The important thing to remember is that these MEPs are all key players in shaping the report of the Parliament on the AI Act. They may also be influenced by other MEPs in their Committee(s), who can suggest changes to the draft report (“amendments”) as well as the heads of their political group. See more below.
When can people strategically influence the negotiations on the AI Act?
During the negotiations of the lead committees
The lead committees in the Parliament (IMCO and LIBE) are working on their report on the AI Act until October 2022. This means we should already be raising awareness of our work and our demands among MEPs in those two committees. What are some crucial steps of the negotiation around the LIBE–IMCO Committee report?
First, the lead Rapporteurs Benifei and Tudorache publish an IMCO–LIBE draft report (expected April 2022) which represents the parts of the position on which they could agree. Afterwards, the other MEPs in IMCO and LIBE can propose amendments to this draft report, including the areas that need more democratic scrutiny. The tabling of amendments is expected to happen until 18 May 2022.
The amendments are then negotiated among a selected number of MEPs in the LIBE-IMCO committees (the Rapporteurs and Shadow Rapporteurs), and agreed upon by coming up with compromises. The negotiations around these amendments and the agreement on a compromise text for the LIBE–IMCO report are expected to happen between May and October 2022.
After lead committees conclude their report, and just before the Plenary vote
Once out of committee negotiations, this joint IMCO–LIBE report will be presented to the full Parliament, known as the Plenary. When the report is presented to the Plenary, there is also an opportunity for last-minute amendments to the committees’ report to be put forward. This tabling of amendments before the Plenary vote is yet another moment in which MEPs may introduce protections for people against biometric mass surveillance.
After any final amendments to the IMCO–LIBE report are voted on, all 705 MEPs will vote on whether or not to accept this final version of the IMCO–LIBE report as the Parliament’s position. Currently, the 705 MEPs are scheduled to vote on the final report in November 2022.
In parallel, the Council of Member States is currently trying to make the in-principle ban on biometric surveillance in the Commission's draft weaker and narrower. If the Parliament agrees on the need for a full ban on biometric mass surveillance practices, we have a chance to fight back against the Council's proposal.
Supporters of Reclaim Your Face can play an important role in the negotiations of the EU’s Artificial Intelligence Act. Are you ready for bold, strategic and direct action? Subscribe to EDRi’s mailing list to be kept in the loop and follow our social media channels.
About Clearview AI's mockery of human rights, those fighting it, and the need for the EU to intervene
Clearview AI describes itself as 'The World's Largest Facial Network'. However, a quick search online reveals that the company has been involved in several scandals, covering the front pages of many publications for all the wrong reasons. In fact, since The New York Times broke the story about Clearview AI in 2020, the company has been constantly criticised by activists, politicians, and data protection authorities around the world. Read on for a summary of the many actions taken against the company that hoarded 10 billion images of our faces.
How did Clearview AI build a database of 10 billion images?
Creepily enough, this database is available to any company, law enforcement agency, or government that can pay for access.
This will go on, as long as we don’t put a stop to Clearview AI and its peers. Reclaim Your Face partners and other organisations have taken several actions to limit Clearview AI in France, Italy, Germany, Belgium, Sweden, the United Kingdom, Australia and Canada.
In several EU countries, activists, Data Protection Authorities, and watchdogs have taken action.
In May 2021, a coalition of organisations (including noyb, Privacy International (PI), Hermes Center and Homo Digitalis) filed a series of submissions against Clearview AI, Inc. The complaints were submitted to data protection regulators in France, Austria, Italy, Greece and the United Kingdom.
Here are some of the decisions taken by Data Protection Authorities and watchdogs:
Following complaints by Reclaim Your Face partner Privacy International and by individuals about Clearview AI's facial recognition software, the French data protection authority (CNIL) decided in December 2021 that Clearview AI must cease collecting and using data on data subjects in France.
In February 2021, the Swedish data protection authority (IMY) decided that a local Swedish police force's use of Clearview's technology involved unlawful processing of biometric data for facial recognition. The DPA also found that the police had failed to conduct a data protection impact assessment. The authority therefore fined the local police authority €250,000 and ordered it to inform the people whose personal data had been sent to the company.
Clearview AI is in trouble outside of the European Union too
On 27 May 2021, Privacy International (PI) filed complaints against Clearview AI with the Information Commissioner's Office (ICO), the UK's independent regulator for data protection and information rights law. The ICO and the Office of the Australian Information Commissioner (OAIC), which regulates the Australian Privacy Act, had been conducting a joint investigation into Clearview AI since 2020. Last year, the ICO announced its provisional intent to impose a potential fine of over £17 million based on Clearview's failure to comply with UK data protection laws.
The OAIC, for its part, has already reached a decision: it ordered the company to stop collecting facial images and biometric templates from people in Australian territory, and to destroy all existing images and templates that it holds.
Canadian authorities were unequivocal in ruling that Clearview AI's practices violated their citizens' right to privacy and, furthermore, that this use constitutes mass surveillance. Their statement highlights the clear link between Clearview AI and biometric mass surveillance, which treats all citizens as suspects of crime.
It is deeply worrying that Clearview AI's technologies are reportedly being used in warfare. What happens when Clearview AI decides to offer its services to military forces with whom we disagree? What does this say about the geopolitical power that we, as a society, allow private surveillance actors to hold? By allowing Clearview AI into military operations, we are opening a Pandora's box in which technologies already ruled incompatible with people's rights and freedoms are deployed in literal life-or-death situations. Clearview AI's systems show documented racial bias and have facilitated several traumatic wrongful arrests of innocent people around the world. Even a system that is highly accurate in lab conditions will not perform as accurately in a war zone. This can lead to fatal results.
Clearview AI mocks national data protection authorities. We must act united! The EU must step up and use the AI Act to end the mocking of people’s rights and dignity.
Pressure is mounting, but Clearview AI is not stepping down. The company, which started as a service for law enforcement only, is now telling investors that it is expanding towards the monitoring of gig workers, among others.
Laura Carrer, Research and Advocacy at Digital Rights Unit, Hermes Center & Riccardo Coluccini, Reclaim Your Face national campaign contributor
On 9 March 2022 the Italian Data Protection Authority (DPA) fined the US-based company Clearview AI EUR 20 million after finding that the company monitored and processed biometric data of individuals on Italian territory without a legal basis.
The company reportedly owns a database of over 10 billion facial images scraped from public web sources such as websites, social media, and online videos. It offers a sophisticated search service that creates profiles on the basis of the biometric data extracted from the images.
The fine is the highest foreseen under the General Data Protection Regulation (GDPR). It was prompted by a complaint sent by the Hermes Center in May 2021 in a joint action with EDRi members Privacy International, noyb, and Homo Digitalis, in addition to complaints sent by individuals and a series of investigations launched in the wake of the 2020 revelations about Clearview AI's business practices.
In addition to the fine, the Italian DPA ordered the company to delete the personal and biometric data of individuals in Italy, to stop any further processing of data belonging to Italian people, and to designate a representative in the EU. Pictures are analysed by Clearview AI's facial recognition algorithm to build up a gigantic database of biometric data, and access to that database is sold to law enforcement agencies. The company also extracts any associated metadata from the images: the title of the image or webpage, geolocation, date of birth, source link, nationality, and gender.
According to the Italian DPA, biometric and personal data were processed unlawfully without an appropriate legal basis, the company failed to adequately inform people of how their images were collected and analysed, and it processed people's data for purposes other than those for which they had been made available online. One of Clearview AI's lines of argument was to equate itself with a Google Search for faces. However, the DPA stated that, by selling access to a database and a proprietary face-matching algorithm intended for certain categories of customers, "Clearview has specific characteristics that differentiate it from a common search engine that does not process or enrich images on the web […] creates a database of image snapshots that are stored as present at the time of collection and not updated."
In addition, the DPA highlights that “the company’s legitimate interest in free economic initiative cannot but be subordinate to the rights and freedoms of the persons concerned.”
Clearview now has 30 days to inform the Italian DPA of the measures it is adopting, and up to 60 days to either pay the fine or appeal in court.
This decision is another step in the right direction towards banning all forms of biometric surveillance practices that, as highlighted by the EDRi-led Reclaim Your Face campaign, have a huge impact on fundamental human rights.