Letter to EU Commissioner for Justice

On 1 April, a coalition of 51 human rights, digital rights and social justice organisations sent a letter to European Commissioner for Justice, Didier Reynders, calling on the Commissioner to prohibit uses of biometrics that enable mass surveillance or other dangerous and harmful uses of AI. The letter comes ahead of the long-awaited proposal for new EU laws on artificial intelligence.


Evidence shows why we need a law against biometric mass surveillance

You often hear:

“Facial recognition of whole populations? But that’s just in China. We’re democratic in the EU. It’ll never happen to us.”

Unfortunately, it is already happening. Read below a summary of the extensive evidence we’ve compiled, documenting the rapid spread of biometric mass surveillance in EU countries. The (only) good thing? We can still stop it.


New petition calls Europeans to unite for their future

The Reclaim Your Face coalition launches a European Citizens’ Initiative (ECI) today to ban biometric mass surveillance. The ECI represents the voice of those who oppose a dystopian future and instead want a future in which choices are made by us, not by algorithms. The initiative needs to collect 1 million signatures in at least 7 EU countries during the next year.

As the European Commission is preparing new laws on Artificial Intelligence, a growing coalition of digital and human rights organisations led by European Digital Rights (EDRi) warns of the many dangers of biometric mass surveillance on our freedom and dignity. The Reclaim Your Face coalition is demanding a ban on the use of harmful AI such as biometric mass surveillance by launching a European Citizens’ Initiative (ECI) today. The ECI is a unique tool the EU puts at the disposal of European citizens to organise and collectively demand new legislative frameworks.

Biometric data are data about our bodies and behaviours, which can divulge sensitive information about who we are. For example, our faces can be used for facial recognition to make a prediction or assessment about us – and so can our eyes, veins, voices, the way we walk or type on a keyboard, and much more.

Governments, police forces and corporations are using recording devices (like CCTV cameras) and facial recognition software to gather our unique biometric data. They track us by using our unique characteristics in order to permanently identify each of us. The blanket capture of every person’s biometric data in public spaces like parks, train stations and shops simply for trying to live our lives is biometric mass surveillance.

“Biometric mass surveillance brings ‘Internet-style’ omnipresent tracking to the offline world. This would eradicate the few remaining refuges of privacy.”

Linus Neumann, EDRi member Chaos Computer Club

The processing of any biometric data is already generally prohibited in the EU. However, EU law allows some very problematic exceptions to this general prohibition. Powerful actors take advantage of this legal confusion by introducing these harmful technologies into our public spaces outside of democratic oversight, dehumanising us and treating us all as walking barcodes.

“Europeans have a proud and diverse history of rising up against injustice: the fight for universal suffrage, the Solidarność (Solidarity) workers’ movement, and so much more. Today, we launch a 1-million person petition that asks the EU to protect everyone from the harms of facial recognition and other biometric mass surveillance. By doing this, the Reclaim Your Face campaign fights for a future where people can live freely, express, think and organise without fear.”

Ella Jakubowska, Policy and Campaigns Officer, EDRi

The coalition has already built up evidence of vast and systemic abuses of people’s biometric data across Europe and is successfully challenging such deployments. Serbian authorities are surveilling the population on the streets of Belgrade. Italian authorities targeted migrant communities with biometric surveillance systems in the city of Como and, despite the fact that the practice was declared illegal, are now trying to introduce it in other cities. In Greece, national authorities are investigating the use of biometric surveillance during police stops. In France, civil society fought in court the use of biometric mass surveillance targeting high schools and protesters. In the Netherlands, national authorities criticised biometric mass surveillance in supermarkets. The Reclaim Your Face coalition has been essential in bringing about these challenges: for example, Homo Digitalis triggered the data protection investigations in Greece, and Bits of Freedom supported the statements by national authorities against biometric mass surveillance in the Netherlands. However, it has become clear that we need to act united, across countries, at the European level.

The ECI represents the voice of those who oppose a dystopian future and instead want a future in which choices are made by us, not by algorithms, in which our bodies belong to us, in which we are not discriminated against based on how we look, how much money we have, or where we are from, and in which we have equity and justice.

The ECI needs to collect 1 million signatures in at least 7 EU countries during the next year. Succeeding will oblige the European Commission to respond to our formal demand for a new law and open a debate among the Members of the European Parliament.

This is a truly unique opportunity for all EU citizens to request a law that protects them by outlawing biometric surveillance practices in public spaces. A ban is our only hope to prevent harms that arise with the identification and judgment of people based on their face, body, features and behaviour.

European citizens have a historic chance to stop the harm before biometric mass surveillance becomes permanent in our society.

How he reclaimed his face from ClearviewAI

The Hamburg Data Protection Authority deemed Clearview AI’s biometric photo database illegal in the EU as a result of a complaint filed by Matthias Marx, a member of the Chaos Computer Club (an EDRi member).

By ReclaimYourFace campaign lead organisation Chaos Computer Club (CCC)
Originally published by noyb here.

In January 2020, two days after the New York Times revealed the existence of the face search engine Clearview AI, Matthias Marx, a member of the Chaos Computer Club (an EDRi member), sent a data subject access request to Clearview AI. He was surprised to learn that the company was processing his biometric data and lodged a complaint to the Hamburg data protection authority (DPA). As a result, the Hamburg DPA has now deemed Clearview AI’s biometric photo database illegal in the EU.

Chronological Review

In order to facilitate the data subject access request, Matthias shared a photo of his face. To confirm Matthias’ identity and guard against fraudulent access requests, Clearview AI additionally requested a government-issued ID. Although Matthias ignored the request, Clearview AI sent him search results based on the photo he provided and confirmed deletion of the search photo in February 2020.

Matthias then electronically submitted a complaint to the Hamburg DPA that Clearview AI was processing his biometric data without consent. The DPA first rejected the complaint, arguing that the GDPR is not applicable. After further submissions, the DPA then eventually launched preliminary investigations. At the same time, noyb offered their support.

In May 2020, Clearview AI sent another, unsolicited, response to Matthias’ request and included new search results. Apparently, Clearview AI had not deleted the search photo as promised. While Clearview AI’s first answer only showed two photos of Matthias, this time it also contained eight photos of other people.

In August 2020, the Hamburg DPA ordered Clearview AI to answer a set of 17 questions under threat of penalties. Clearview AI replied in September 2020. In January 2021, the Hamburg DPA initiated administrative proceedings against Clearview AI.


The decision acknowledges the territorial scope of the GDPR, which is triggered for entities outside the EU if they monitor the behaviour of data subjects in the EU. Clearview AI had argued against the applicability of the GDPR, saying that they do not monitor the behaviour of individuals but provide only a “snapshot of some photos available on the internet”.

The Hamburg DPA discarded this argument for two reasons. For one, Clearview AI’s results include information that stretches over a period of time. For another, Clearview AI’s database links photos with their source and associated data. As such, it records information in a targeted manner – the definition of monitoring. Moreover, the Hamburg DPA noted that the subsequent use of collected personal data for profiling purposes, as happens with Clearview AI’s results, can be seen as a strong indicator for the existence of monitoring.

Taking into account subsequent use is important because it underlines that downstream processing by other entities can, to a certain extent, be used to classify the nature of the upstream processing. In other words, entities cannot fully launder their data processing by handing off the dirty work downstream.

Despite clearly stating that Clearview AI lacked a legal basis for creating biometric profiles, the Hamburg DPA unfortunately only ordered the deletion of the complainant’s biometric profile – it neither ordered the deletion of the complainant’s photos already collected, nor did it issue an EU-wide ban on Clearview AI’s processing. noyb had submitted arguments on why the Hamburg DPA could issue an EU-wide ban against Clearview.

In conclusion, this decision is only a first step, and further litigation is necessary. While Europeans now have a precedent to rely on, we need decisions that also declare illegal the harvesting of photos for purposes totally incompatible with their initial publication.

Face recognition at Italian borders shows why we need a ban

As part of Reclaim Your Face’s investigation into rights-violating deployments of biometric mass surveillance, EDRi member Hermes Center explains how the Italian Police are deploying dehumanising biometric systems against people at Italy’s borders.

By Reclaim Your Face campaign lead organisation Hermes Center
Originally published in Italian here.

The Reclaim Your Face campaign has been investigating and exposing abusive and rights-violating uses of facial recognition tech, and other biometric mass surveillance, since its launch last year. In the latest in a long line of examples showing that these inherently discriminatory technologies are being used to further exclude some of society’s most marginalised people, Hermes Center explains how the Italian Police are deploying dehumanising biometric systems against people at Italy’s borders. Now, more than ever, we need to call for a ban on these biometric mass surveillance practices. The Reclaim Your Face campaign’s major EU petition (a European Citizens’ Initiative), launching on 17th February, will give us the legal means to demand just that.

The introduction of facial recognition systems in Italy continues to show the same symptoms that we denounce in the Reclaim Your Face campaign: lack of transparency, absolute disregard for the respect of human rights and inability to admit that some uses of this technology are too dangerous.

The latest episode concerns the Automatic Image Recognition System (SARI), initially acquired by the Italian police in 2017 and now, as revealed in an investigation by IrpiMedia, at the center of a new public tender with the aim of upgrading the system and employing it to monitor arrivals of migrants and asylum seekers on the Italian coasts and related activities.

“In order to do so, the Ministry of Interior has used two strategies: taking advantage of the European Internal Security Funds and, as shown by some documents obtained by IrpiMedia thanks to a FOIA request, ignoring the questions of the Italian Data Protection Authority (DPA) that has been waiting for two years to close an investigation on the facial recognition system that the police wants to use,” reads the article.

In our Reclaim Your Face requests, we ask the Ministry of the Interior to publish all the evaluations of the algorithms used, the numbers on the use of the system, and all the data on the type of faces in the database used by SARI.

This information is fundamental in order to understand the effects of the algorithms that act on a database that is already strongly unbalanced and discriminatory: as revealed almost two years ago by Wired Italia, 8 out of 10 people in SARI’s database are foreigners. It is not clear how many of these are migrants and asylum seekers.

The biometric and digital identity processing of migrants and refugees in Italy has been studied in a Data&Society report carried out in 2019 by researcher Mark Latonero in partnership with Reclaim Your Face partner CILD, an Italian NGO. The field analysis uncovered an entire ecosystem composed of NGOs, government, researchers, media, and the private sector that collects, analyses, and manages digital information about migrants and refugees to provide them with support, regulate them, and study their behaviours. Collecting this data can lead to varying degrees of discrimination due to existing biases related to the vulnerability of migrants and refugees. Mindful of this study, we can imagine how pervasive and unaccountable a facial recognition system targeted at precisely one specific category of people could be: an additional level of scrutiny that we do not want to see normalised and become part of all of our daily lives.

While requests on transparency of the algorithm and the database are not met and even the DPA is still waiting for an impact assessment of the system, the Ministry is also exploiting European money from the Internal Security Fund.

IrpiMedia details the subject of the contract as follows: “The budget allocated for the enhancement of the system is €246,000, and the enhancement includes the purchase of a license for facial recognition software owned by Neurotechnology, one of the best-known manufacturers in the world, able to process the video stream from at least two cameras and to manage a watch-list of up to 10,000 subjects. In addition, the hardware and software configuration must be of small dimensions, so as to fit in a backpack and allow ‘strategic installations in places that are difficult to access with the equipment provided,’ read the technical specs of the public tender of the Ministry of Interior.”

Biometric surveillance dehumanises us into lifeless bits of data, depriving us of our autonomy and the ability to express who we are. This is even more dangerous when applied to people who reach our countries escaping from violence, economic disasters, and environmental catastrophes. Meeting human beings with biometric surveillance technologies destroys our humanity.

The story is shocking, but it is not inevitable. Brutal technologies that amplify already persecutory anti-migration strategies are the latest tools that show the extent of these structural problems. Banning biometric mass surveillance means not only stopping the use of such tools, but addressing the underlying inequalities and discrimination of our societies. You can support Reclaim Your Face’s campaign against discriminatory and intrusive biometric mass surveillance.

Reclaiming faces: Greece and the Netherlands

The Reclaim Your Face movement is growing, and our demands for transparency, limiting the accepted uses and respect for humans are becoming more and more common across Europe. New organisations are joining the coalition each week, and people across Europe continue to sign the petition to add their voices to our demands. Now, thanks to campaigning by Homo Digitalis in Greece and Bits of Freedom in the Netherlands, we’re getting closer to real political and legislative changes that will protect our faces and our public spaces from biometric mass surveillance.

By RYF leads Homo Digitalis and Bits of Freedom

Dutch DPA speaks out against biometric surveillance in public space

On 23 November, Dutch broadcasting station BNR did a morning show on facial recognition in publicly-accessible areas. In the Netherlands, these so-called ‘smart’ cameras are used more and more in supermarkets, companies and football stadiums. In the 2-hour radio show, Lotte Houwing from Bits of Freedom put forward the case for a ban and for more effective enforcement against biometric surveillance technologies in publicly-accessible spaces.

Later in the show the vice-president of the Dutch data protection authority (DPA), the Autoriteit Persoonsgegevens, was interviewed. She explained that the increase in the use of facial recognition technologies in the Netherlands is because of the abuse of the legal ground of “substantial public interest”. The explanatory memorandum which accompanies Dutch privacy laws gives the example of the security of a nuclear power plant as a possible justification for the use of such technologies. This means that you cannot use these technologies against petty thieves.

The Dutch DPA thinks that part of the problem of these harmful deployments is caused by misunderstanding and lack of knowledge about what is a legitimate use of facial recognition and other biometric surveillance technology. To tackle illegitimate uses, the DPA has now sent directed guidelines to explain the law to the different industry associations that have been the biggest users of this invasive technology. So far, there has not been a direct promise regarding enforcement measures against illegitimate deployments to follow this phase of targeted information. However, the vice-president was very clear about the DPA’s perspective on biometric surveillance in public space: “It is as if somebody is following you around with a camera and a notebook throughout the whole day. That’s a surveillance society we do not want.”

We embrace these statements of the DPA and encourage them to take concrete action for our fundamental rights and freedoms in publicly-accessible spaces. Reclaim Your Face and BanThisBS!

Civil society complaints against biometric surveillance lead to official investigations in Greece

In Greece, the work of EDRi member Homo Digitalis has started to bear fruit. In June 2020 the Greek watchdog submitted two strategic complaints before the Hellenic DPA against a centralised biometric database of the Hellenic Police. The database contains fingerprints and facial images of all Greek passport holders. However, based on the EU laws on passports (namely, Regulation 2252/2004 & Regulation 444/2009) as well as the settled case-law of the Court of Justice of the EU (for example, the Willems case and the Schwarz case), biometric data shall be stored in the storage medium of the passport itself – the document we carry in our pockets and bags. More precisely, the EU laws neither prohibit nor allow national central databases of biometric data at the Member State level. So, the EU countries that are interested in establishing such biometric databases must take their own legislative initiatives on this matter.

However, as Homo Digitalis claims, this is not the case for Greece. Specifically, there is no Greek law in place providing in detail all the necessary safeguards for this centralised database, such as its functions, the rules for the related data processing activities, and the security and organisational measures that shall be in place. It is crucial to remember that based on EU law (Article 10 of the Law Enforcement Directive), processing of biometric data is allowed only where it is strictly necessary, subject to appropriate safeguards, and authorised by European Union or Member State law. So, the lack of such national legislation clearly violates European data protection laws.

In August 2020, the Hellenic DPA launched an official investigation regarding this centralised database following Homo Digitalis’ complaints. But, the positive news from Greece does not stop here! Do you remember the actions of Homo Digitalis in March 2020, described in a previous EDRi-gram, against a smart-policing contract of the Hellenic Police? It related to smart devices enabling facial recognition and automated fingerprint identification during police stops. In August 2020 the Hellenic DPA started an official investigation regarding this complaint, as well! Stay tuned for related developments and closely follow the ReclaimYourFace campaign for more.

People across Europe challenge biometric mass surveillance as Reclaim Your Face launches

Civil society across Europe slams biometric mass surveillance in a series of successful actions, protecting the dignity of people in the public space. Human rights groups Hermes Center (Italy), Homo Digitalis (Greece), Bits of Freedom (The Netherlands), Iuridicum Remedium (Czechia), SHARE Foundation (Serbia), Access Now and European Digital Rights (EDRi) today announce the launch of a broad European public campaign – Reclaim Your Face: Ban Biometric Mass Surveillance.

In past months, these civil society groups have already successfully mobilised their communities. This has culminated in stopping the use of facial recognition technology in French schools; calling for the Data Protection Authority’s investigations against the use of facial recognition by the Hellenic police; celebrating the City of Prague for refusing the introduction of facial recognition technologies in public; stopping an unlawful deployment of biometric surveillance in the Italian city of Como; as well as crowdsourcing a comprehensive mapping of all live facial recognition cameras in the city of Belgrade, Serbia. The newly-launched “Reclaim Your Face” coalition therefore calls on local and national authorities to reveal the risks and reject the use of biometric surveillance in public spaces.

“Public spaces and organised dissent are crucial for all major political progress in Serbia, as we strive for democracy. Biometric mass surveillance threatens our freedoms and ability to organise – we have to stop it.”

Filip Milošević, SHARE Foundation, Serbia

This civil society call to ban biometric mass surveillance comes in reaction to the rapid and secretive roll out of invasive and unlawful technologies by police forces and local authorities in many European countries. The European Commission is currently considering all options for protecting people from harmful uses of biometric surveillance technology, including a ban. We urge them to take the need for a ban seriously and put an end to this enormous threat to our rights and freedoms.

“Biometric surveillance dehumanises us into lifeless bits of data, stripping our autonomy and ability to express who we are. It forces us into an unaccountable, automated system in which we are unfairly categorised. Only a ban on biometric mass surveillance can ensure strong, joyful and organised communities can thrive.”

Laura Carrer, Hermes Center, Italy

Other organisations involved in the campaign coalition include Privacy International, ARTICLE 19, Chaos Computer Club (Germany), Panoptykon Foundation (Poland) and La Quadrature du Net (France).

Get in touch:

Launched in November 2020, #ReclaimYourFace is a European movement that brings people’s voices into the democratic debate about the use of our biometric data. The coalition challenges the use of this sensitive data in public spaces and its impact on our freedoms. Our coalition is made up of 12 European civil society organisations united to protect fundamental rights in the digital environment. SIGN UP TO THE CAMPAIGN’S MAILING LIST

Online Action


Email banner 200×600 Download

Email banner 100×300 Download

Facebook banner 820×360 Download

Twitter banner 1500×500 Download

Offline Action


Printable Sticker Orange Download

Printable Sticker Blue Download

Printable Sticker Label Download


Social media and other online scraping

Social media is part of almost everyone’s lives. When you put photos or videos online, you probably imagine that your friends, your family and your followers will see them. You probably don’t imagine that they will be taken and processed by shady private companies and used to train algorithms. You’re probably even less likely to imagine that this information could be kept in a massive biometric database, and even used by police to identify you after you were in the area of a protest or were walking through a zone secretly surveilled.

WHAT is scraping?

Social media scraping is the process of gathering data from social media automatically. Now you might wonder, what type of data can they get from social media? Well, it can range from usernames or follower lists to very sensitive data, like where you live and even your biometric data, such as your facial features. Scraping is generally performed by bots, which can gather and process large amounts of data in very little time.
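To make concrete how little machinery a scraping bot needs, here is a toy Python sketch. Everything in it is invented for illustration (the profile page, the username, the photo URLs); real scrapers fetch millions of pages over the network and feed the harvested photos into face-matching systems at massive scale.

```python
import re

# Invented stand-in for a profile page a bot would fetch over the network.
PROFILE_HTML = """
<div class="profile">
  <span class="username">@alice</span>
  <span class="location">Berlin</span>
  <img src="https://example.org/photos/alice_1.jpg">
  <img src="https://example.org/photos/alice_2.jpg">
</div>
"""

def scrape_profile(html: str) -> dict:
    """Extract a username, a location and photo URLs with simple regexes."""
    username = re.search(r'class="username">([^<]+)<', html)
    location = re.search(r'class="location">([^<]+)<', html)
    photos = re.findall(r'<img src="([^"]+)"', html)
    return {
        "username": username.group(1) if username else None,
        "location": location.group(1) if location else None,
        # These photo URLs are what gets fed into face-recognition models.
        "photos": photos,
    }

record = scrape_profile(PROFILE_HTML)
print(record["username"], record["location"], len(record["photos"]))
```

A loop over this function and a list of profile URLs is essentially all a scraping bot is: a few lines of code, repeated billions of times.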

While it might seem like we are far away from that, this is happening now, right in our faces.

Scraping is happening already

Probably the most well-known culprit of mass social media scraping of our faces is the notorious ClearviewAI. In this case, just being in the background of a friend’s photo that gets uploaded to the internet could be enough for you to end up in ClearviewAI’s 3-billion-photo database. Yes, that’s billion with a “b”. Services like the ones kindly offered by the likes of ClearviewAI have been used for biometric mass surveillance practices by police forces all around the EU and elsewhere in the world.

Thankfully, due to the efforts of one of ReclaimYourFace’s campaigners, the inclusion of images of people in the EU in the ClearviewAI database has already been proven to be illegal. However, that hasn’t stopped it from happening.

Another well-known case is that of the Poland-founded company PimEyes (which suddenly relocated to the Seychelles, allegedly to avoid regulatory scrutiny in the EU). This is another company offering similar services, although it claims that it scrapes not social media sites but other sites (as if that made it more ethical!). However, unlike ClearviewAI, which tends to offer its services to law enforcement, PimEyes’ services are offered to any individual who would like to access them.

Yes, anyone walking on the street could scan your face and know everything about you.

Just imagine if wherever you went, any person could know your name, your interests, where you live and many other sensitive things about you just by scanning your face at a distance using their phone. You might not even know that they had done this – but they would be able to know a lot about you.

I don’t use social media

Even if you don’t use social media, you can still be part of those databases. For instance, if people you know have uploaded pictures with you in them to social media giants like Facebook, Twitter and YouTube, or if you use certain other online services with your picture, your biometric data could be scraped. And yes, these Big Tech companies will probably know as much about you as if you were a social media user. In fact, there have even been reports of so-called ‘shadow profiles’, where Facebook knows so much information about people who don’t have accounts that it’s as if they have an active profile!

Why do we need to stop this?

The harms of scraping connect to much bigger things than privacy and anonymity. A face search engine available to anyone means stalkers and abusive ex-partners can track down their victims, and authorities can identify and judge people based on their social media posts or on simply having been near a protest. We need to stop the scraping of our biometric data before these abuses become routine.


PimEyes is a company with an enormous facial recognition database, reportedly scraped from the internet without people’s knowledge, and available for members of the public to use to spy on whoever they like.

Not only did Clearview AI illegally scrape our biometric data from social media, but the security of this data has also been compromised in recent years. Our data is not only used against us by police forces around the world, but has now been leaked to anyone else.

News and reports

Read our News Section to learn our latest updates, and SIGN the ECI!

Borders, migrants and failed humanitarianism

Some of the most chilling biometric systems we have seen are those deployed against people on the move.

Whether it’s for migration, tourism, to see their family, for work, or to seek asylum, people on the move rely on border guards/police and governments to grant them permission to enter.

It should be obvious that they have a right to be treated humanely and with dignity while doing so. However, we know that non-EU nationals travelling into the EU are frequently used as subjects for experimentation with biometric technologies.

Governments and tech companies exploit their power to treat people on the move as lab rats for the development of invasive technologies under the guise of “innovation” or “security”. These technologies are sometimes subsequently deployed as part of general monitoring systems.

Experimenting on people on the move

The authorities and companies that are deploying these technologies exploit the secrecy and discretion that surround immigration decisions. As a result, they experiment with technologies that are worryingly pseudo-scientific, some of which rely on discredited and harmful theories like phrenology.

For instance, real examples from Europe include:

1. The analysis of people’s “micro-expressions” to supposedly detect if they are lying.

2. Tests on people’s voices and bones as a way to interrogate their asylum claims.

Experimenting on people in need of humanitarian aid

Similar patterns to border administration are increasingly being seen in the humanitarian aid context. Aid agencies or organisations sometimes see biometric identification systems as a silver bullet for managing complex humanitarian programmes without properly assessing the risks and the rights implications. This is often driven by the unrealistic promises made by biometric tech companies.

In the context of humanitarian action, people who are relying on aid are rarely in a position to refuse to provide their biometric data, meaning that any ‘consent’ given cannot be considered legitimate.

Therefore, their acceptance could never form a legal basis for these experiments, because circumstance forces them to accept. For people not in this position of dependence, it would never be legal to act in this way: in humanitarian aid, people’s vulnerability is exploited. There are also major concerns about how such data are stored, and many open questions about how they may be used in ways that are actually incredibly harmful to the very people these systems claim to help. If these concerns were not enough, such practices also have a high chance of leading to mass surveillance and other rights violations by creating enormous databases of sensitive data.



Not only are these unreliable tests often unnecessarily invasive and undignified, but they also treat people who are in need of protection as if they are liars, test subjects, or intrinsically suspicious just for being migrants.

Furthermore, the EU and its Member States fund this heavily. A lot of the money that goes into funding these projects comes from EU agencies or Member States, for example through the EU’s Horizon2020 programme. In the programme’s public database, you can read about how EU money is funding biometric experiments that would not be out of place in a science fiction film.

This is happening in tandem with a rise in funding for the EU’s Frontex border agency, which has been accused of violently militarising European borders and persecuting people on the move.


In Italy, the police have used biometric mass surveillance against people at Italy’s borders and have attempted to roll out a real-time system to monitor potentially the whole population with the aim of targeting migrants and asylum seekers.

In the Netherlands, the government has created a huge pseudo-criminal database of personal data which can be used for performing facial recognition solely because those people are foreign: [p.67]

In Greece, the European Commission funded a notorious project called iBorderCTRL, which used artificial intelligence and emotion recognition to predict whether people in immigration interviews were lying. The project has been widely criticised for having no scientific basis and for exploiting people’s migration status to try out untested technologies.

News and reports:

Learn more about the different ways that biometric mass surveillance affects us, and SIGN the ECI!

ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now ARTICLE19 Bits of Freedom CCC Defesa dos Direitos Digitais (D3) Digitalcourage Digitale Gesellschaft CH Digitale Gesellschaft DE Državljan D EDRi Electronic Frontier Finland Hermes Center for Transparency and Digital Human Rights Homo Digitalis IT-Political Association of Denmark IuRe La Quadrature du Net Liberties Metamorphosis Foundation Panoptykon Foundation Privacy International SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch AlgorithmWatch/CH All Out Amnesty International Anna Elbe Aquilenet Associazione Luca Coscioni Ban Facial Recognition Europe Big Brother Watch Certi Diritti Chaos Computer Club Lëtzebuerg (C3L) CILD D64 Danes je nov dan Datapanik Digitale Freiheit DPO Innovation Electronic Frontier Norway European Center for Not-for-profit Law (ECNL) European Digital Society Eumans Football Supporters Europe Fundación Secretariado Gitano (FSG) Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung Germanwatch German acm chapter Gesellschaft Fur Informatik (German Informatics Society) GONG Hellenic Association of Data Protection and Privacy Hellenic League for Human Rights info.nodes irish council for civil liberties JEF, Young European Federalists Kameras Stoppen Ligue des droits de L'Homme (FR) Ligue des Droits Humains (BE) LOAD e.V. Ministry of Privacy Privacy first logo Privacy Lx Privacy Network Projetto Winston Smith Reporters United Saplinq Science for Democracy Selbstbestimmt.Digital STRALI Stop Wapenhandel The Good Lobby Italia UNI-Europa Unsurv Vrijbit Wikimedia FR Xnet

Reclaim Your Face is also supported by:

Jusos Piratenpartei DE Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left

Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.