News




Italian DPA fines Clearview AI for illegally monitoring and processing biometric data of Italian citizens

Laura Carrer, Research and Advocacy at Digital Rights Unit, Hermes Center & Riccardo Coluccini, Reclaim Your Face national campaign contributor

On 9 March 2022 the Italian Data Protection Authority (DPA) fined the US-based company Clearview AI EUR 20 million after finding that the company monitored and processed biometric data of individuals on Italian territory without a legal basis.

The company reportedly owns a database of over 10 billion facial images scraped from public web sources such as websites, social media and online videos. It offers a sophisticated search service which builds profiles on the basis of the biometric data extracted from those images.

The fine is the highest foreseen under the General Data Protection Regulation (GDPR). It was prompted by a complaint sent by the Hermes Center in May 2021 in a joint action with EDRi members Privacy International, noyb, and Homo Digitalis, in addition to complaints sent by several individuals and to a series of investigations launched in the wake of the 2020 revelations about Clearview AI's business practices.

In addition to the fine, the Italian DPA ordered the company to delete the personal and biometric data of individuals in Italy, to stop any further processing of data belonging to Italian people, and to designate a representative in the EU. The collected pictures were analysed by the facial recognition algorithm created by Clearview AI to build up a gigantic database of biometric data, and access to that database was sold to law enforcement agencies. The company also extracts any associated metadata from each image: the title of the image or webpage, geolocation, date of birth, source link, nationality and gender.

According to the Italian DPA, biometric and personal data were processed unlawfully without an appropriate legal basis; the company failed to adequately inform people of how their images were collected and analysed, and it processed people’s data for purposes other than those for which they had been made available online. In fact, one of Clearview AI’s lines of argument was to equate itself to a Google Search for faces. However, the DPA stated that, by selling access to a database and a proprietary face-matching algorithm intended for certain categories of customers, “Clearview has specific characteristics that differentiate it from a common search engine that does not process or enrich images on the web […] creates a database of image snapshots that are stored as present at the time of collection and not updated.”

In addition, the DPA highlights that “the company’s legitimate interest in free economic initiative cannot but be subordinate to the rights and freedoms of the persons concerned.”

Clearview now has 30 days to inform the Italian DPA of the measures it is adopting, and up to 60 days to either pay the fine or appeal to a court.

This decision is another step in the right direction towards banning all sorts of biometric surveillance practices which, as highlighted by the EDRi-led campaign Reclaim Your Face, have a huge impact on fundamental human rights.


Further reading:

Italian DPA decision on Clearview AI (in Italian): https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9751362

Hermes Center press release on fine to Clearview AI: https://www.hermescenter.org/clearview-ai-ha-monitorato-i-cittadini-italiani-garante-privacy-illegale/

Challenge against Clearview AI in Europe: https://privacyinternational.org/legal-action/challenge-against-clearview-ai-europe

Reclaim Your Face impact in 2021

A sturdy coalition, research reports, investigations, coordinated actions and amazing political support gathered at national and EU level: this was 2021 for the Reclaim Your Face coalition, a year that, despite happening during a pandemic, showed what the power of a united front looks like.


Forming a coalition in a strategic moment

In January 2021, a group of civil society organisations was meeting every two weeks to strategise and plan what has become one of the most politically powerful campaigns: Reclaim Your Face.

In October 2020, twelve organisations set out on a mission and came together to form the Reclaim Your Face coalition, aiming to ban biometric mass surveillance in Europe. Since then we have welcomed dozens more organisations, which work on digital rights and civil liberties, workers’ rights, the rights of Roma and Sinti people, LGBTQ+ rights, media freedom and the protection of migrants and people on the move. We gathered activists, volunteers, technologists, lawyers, academics and policy-makers, all united behind one common goal.

The campaign launched at a strategic moment, just as the EU began its work on a law proposal to regulate artificial intelligence (AI). The relevance and timing of the Reclaim Your Face campaign are unquestionable, as AI techniques are at the centre of today’s biometric surveillance technologies such as facial recognition.

Raising awareness of the spread and harms of biometric mass surveillance

For the people in the Reclaim Your Face coalition, 2021 started with a strong focus on raising awareness about the harms associated with biometric mass surveillance. Moreover, we showed that this exploitative practice is a reality in many cities across Europe, not a dystopian fiction story.

Check out our video records.

Researching biometric mass surveillance

EDRi’s Brussels office and the leading organisations of the campaign coordinated research: mapping both technology deployments and the legal frameworks that govern (or fail to govern) biometric mass surveillance practices in several EU countries.

Coordinating pandemic-proof actions

In 2021, we also coordinated online and offline actions that enabled every campaign supporter to act as part of a powerful collective. The pandemic put constraints on realising such actions; however, the creative hive mind behind the campaign made it happen!

The #PaperBagSociety stunt sparked curiosity and started discussions among curious minds as Reclaim Your Face activists wore paper bags on their heads in public spaces as a sign of protest. The #WalkTheTalk Twitter storm united activists across the Atlantic in calling on the EU Commissioner Vestager and the US Secretary Raimondo to not negotiate our rights in their trade discussions.

Politically, our success has been clear

Our European Citizens’ Initiative has been described as “perhaps the most politically powerful” of all to date. Thank you to the almost 65,000 EU citizens who have supported it so far!

Firstly, together we successfully set the agenda of the debate on AI. Not only were the words “ban” and “remote biometric identification” (a prominent technique that leads to biometric mass surveillance) included in the AI Act law proposal, but many EU and national affairs newspapers acknowledged the importance of the topic and reported heavily on it.

Secondly, we gathered support from several influential bodies that also called for a ban: EU’s top data protection regulators (the EDPS and EDPB), the Green Group in the EU Parliament, as well as Germany’s newly elected government, several national data protection authorities and UN officials. Our impact is also evident in the report Members of the EU Parliament adopted, calling for a ban on biometric mass surveillance by law enforcement.

Through our coalition, we successfully applied pressure on national governments that tried to sneak in laws enabling biometric mass surveillance in Serbia and Portugal. In Italy, Reclaim Your Face campaigners helped to catalyse a moratorium on facial recognition, and in Hamburg, data protection authorities agreed with us that the use of EU citizens’ face images by Clearview AI is illegal.

Moving ahead in 2022, the Reclaim Your Face coalition is aiming to expand its reach, bringing together even more organisations fighting against biometric mass surveillance. We will train the many volunteers who have offered their support and reach a new level of political engagement.

Thank you for supporting us!

No biometric surveillance for Italian students during exams

In September 2021 the Italian Data Protection Authority (DPA) fined Luigi Bocconi University EUR 200,000 for using Respondus, a proctoring software, without sufficiently informing students of the processing of their personal data and, among other violations, for processing their biometric data without a legal basis. Bocconi is a private university based in Milan which, during the COVID-19 pandemic, introduced Respondus tools to monitor students during remote exams.


Respondus offers two different modules: LockDown Browser and Respondus Monitor. The former prevents a student from using their computer as usual, meaning that the person cannot, for example, open other programs. Respondus Monitor checks that the person in front of the screen is the one who should be taking the exam, in order to prevent someone else from replacing the student or passing notes. To do this, the software uses algorithms that analyse the biometric data of the person’s face to confirm their presence, and it also records keystrokes, mouse movements and the duration of the exam. After processing the data, the software sends the professor a report showing the student’s image for identification purposes and flags any anomalies, with details on the reason for each alert.

The University initially tried to walk back what it had stated in its own privacy policy, claiming that no biometric data was processed because the only identification taking place concerned the initial picture taken by the software, which an operator (in this case the professor) used to confirm the student’s identity. This claim did not match how the system actually works. In fact, in its decision, the DPA notes that Respondus declared that its software creates a biometric template to monitor the presence of the same person in front of the screen throughout the exam. For this reason, the “software performs a specific technical processing of a physical characteristic of the persons,” says the DPA, and there is currently no legal provision in Italy expressly authorising the processing of biometric data for the purpose of verifying the regularity of exams. The DPA also highlights that, given that the processing was carried out by the University for the purpose of issuing degrees with legal value, and given the specific imbalance in the position of students with respect to the University, consent does not constitute a legal basis for the processing, nor can it be considered freely given.

In addition, the DPA considers the functionalities of the ‘Respondus Monitor’ component as a “partially automated processing operation for the analysis of the behaviour of the data subjects, in relation to the subsequent assessment by the teacher,” and this “gives rise to the ‘profiling’ of the students.”

This processing of personal data, according to the DPA, may have an impact on the emotional and psychological sphere of the persons concerned which “may also derive from the specific functionalities of the supervision system, such as, in this case, facial recognition and behavioural profiling, with possible repercussions on the accuracy of the anomalies detected by the algorithm and therefore, indirectly, also on the overall outcome of the test.” 


Bocconi is not the only Italian university using proctoring software. In June 2020 there were at least ten universities in Italy using (or planning to use) similar tools, such as Proctorio, ProctorExam, and Safe Exam Browser. The Authority’s decision effectively prohibits other Italian universities from using software similar to Respondus that collects and processes students’ biometric data.

Despite this pushback on student monitoring, the decision also reminds us that biometric surveillance is increasingly expanding into every sphere of our lives, and that the only solution is to call for a ban on these technologies.

Contribution by: Laura Carrer, Research and Advocacy at Digital Rights Unit, Hermes Center & Riccardo Coluccini, Reclaim Your Face national campaign contributor.

People across Switzerland reclaim their faces and public spaces!

On 18 November, three of the organisations that have long championed the Reclaim Your Face campaign – Digitale Gesellschaft (CH), Algorithm Watch CH and Amnesty International (CH) – co-launched a brand new and exciting action in the fight to curtail the sinister rise of biometric mass surveillance practices across Europe!


Called ‘Gesichtserkennung stoppen’ (DE) / ‘Stop à la reconnaissance faciale’ (FR), this action calls on Swiss supporters to take a stand for human rights and oppose the expansion of facial recognition and related biometric mass surveillance in Switzerland.

The action, in the form of a petition, complements the long-running European Citizens’ Initiative (ECI) run by the 65+ organisations in the Reclaim Your Face campaign. However, because EU laws limit those who can sign an ECI to those people who hold EU citizenship, our Swiss supporters have sadly been unable to make their opposition to a biometric mass surveillance society clear. Luckily, not any more!

The organisers explain why action is needed in Switzerland:

“We have the right to move freely in public places without anyone knowing what we are doing. But automatic facial recognition allows us to be identified in the street at any time. We want to prevent such mass surveillance. Take a stand against automated facial recognition in public places in Swiss cities! Sign our petition today.”

So what are you waiting for? If you live or work in Switzerland, let the government know that you want to be treated as a person, not a walking barcode.

And if you do have EU citizenship (even if you’re a dual national), then you can give your voice legal weight by signing the ECI to ban biometric mass surveillance.

SUCCESS! New German government calls for European ban on biometric mass surveillance

The newly-agreed German government coalition has called for a Europe-wide ban on public facial recognition and other biometric surveillance. This echoes the core demands of the Reclaim Your Face campaign which EDRi has co-led since 2020, through which over 65 civil society groups ask the EU and their national governments to outlaw biometric data mass surveillance.

Read More

Portugal: Proposed law tries to sneak in biometric mass surveillance

Whilst the European Parliament has been fighting bravely for the rights of everyone in the EU to exist freely and with dignity in publicly accessible spaces, the government of Portugal is attempting to push their country in the opposite direction: one of digital authoritarianism.

D3 (Defesa dos Direitos Digitais), the Portuguese lead organisation in the Reclaim Your Face coalition, is raising awareness of how the Portuguese government’s newly proposed video surveillance and facial recognition law amounts to illiberal biometric mass surveillance. Why? Ministers are trying to rush the law through Parliament, endangering the very foundations of democracy on which the Republic of Portugal rests.

Eerily reminiscent of the failed attempt by the Serbian government just two months ago to rush in a biometric mass surveillance law, the Portuguese government has now asked its Parliament to approve a law in a shocking absence of democratic scrutiny. Just two weeks before the National Assembly is dissolved, the government wants parliamentarians to quickly approve the law, without public consultation or evidence. The law would enable and encourage widespread biometric mass surveillance, even though we have repeatedly shown just how harmful these practices are.

Reclaim Your Face lead organisation EDRi sent a letter to representatives of Portugal’s main political parties, supporting D3’s fight against biometric mass surveillance practices that treat each and every person as a potential criminal. Together, we urged politicians to reject this dystopian law.

Read EDRi’s letter below.

You can also read D3’s thread (in Portuguese) explaining further why this proposed law is such a problem.



Re: Serious fundamental rights concerns about proposed Portuguese video surveillance Law 111/XIV/2


I am writing to you and your colleagues on behalf of European Digital Rights (EDRi), a network of 45 digital human rights groups from across Europe, including D3 – Defesa dos Direitos Digitais, to urge you to oppose this proposed law.

We want to express our deep concern about the Proposed Law 111/XIV/2 on the use of video surveillance by security forces and services. Despite providing no evidence of effectiveness, necessity or proportionality of these measures, the proposal puts forward sweeping measures which would permit the constant video and biometric mass surveillance of each and every person.

There are many reasons why this proposal is likely to be incompatible with the essence of Portugal’s constitutional obligations to ensure that restrictions on fundamental rights are necessary and proportionate (article 18/2); with Portugal’s obligations under the Charter of Fundamental Rights of the European Union (including but not limited to articles 1, 7, 8, 11, 12, 20, 21, 41, 47, 48 and 49); and with the European Convention on Human Rights (ECHR).

The proposed law 111/XIV/2:

1. Removes current legal safeguards limiting the use of invasive video surveillance, such that the permanent and nearly omnipresent use of these systems in publicly accessible spaces may be permitted;

2. Permits video surveillance by aerial drones without limits, further creating possibilities for pervasive public surveillance; and

3. Establishes that these vast video surveillance networks may be combined with facial recognition and other AI-based systems in public spaces. Such practices enable the omnipresent tracking of individuals and can thus unduly interfere with a wide range of people’s rights, including the rights to privacy and data protection; to express, associate and assemble freely; to equality, non-discrimination and dignity; and to the presumption of innocence and other due process rights.


Furthermore, the proposal recklessly removes existing safeguards:

Law 111/XIV/2 proposes to withdraw vital powers from the national data protection authority, the Comissão Nacional de Protecção de Dados (CNPD). This means not only that the government has proposed measures which contradict Portugal’s data protection obligations, but also that the very authority designated to protect people from undue violations of their rights will be deliberately prevented from carrying out its vital public duties. The CNPD has called this proposal a “gross violation of the principle of proportionality” and has emphasised that it is likely incompatible with the Portuguese Constitution.

The proposal enables biometric mass surveillance practices:

The combined effect of these measures would be highly likely to unduly restrict the rights and freedoms of large parts of the Portuguese population and to constitute unjustified biometric mass surveillance practices. Such measures treat each person as a potential suspect, and they preclude the possibility of genuinely targeted use, as passers-by are an inherent feature of public spaces. Over 63,000 EU citizens have already objected to these practices via the Reclaim Your Face campaign, including close to 750 Portuguese nationals.

The Italian Data Protection Authority has further confirmed that uses of facial recognition and other biometric identification in public spaces constitute mass surveillance, even when authorities are searching for specific individuals on a watch-list. This is because, as the European Data Protection Supervisor (EDPS) and Board (EDPB) have emphasised, the personal data and privacy of anyone passing through that space are unduly infringed upon by such surveillance.

Another particular risk arises from the fact that the proposal requires the processing of especially sensitive data. People’s biometric data, such as their faces, are central to their personal identity and sometimes to their protected characteristics. Processing such data can therefore infringe on the rights to dignity, equality and non-discrimination, autonomy and self-determination.

The proposal is at odds with the European Parliament and the United Nations:

The proposed law stands in direct contradiction to the position of the European Parliament, which voted in October 2021 to adopt the ‘AI and criminal law’ report. This official report calls for a ban on biometric mass surveillance, including of the kind being proposed in law 111/XIV/2. Other EU opposition to such practices includes the proposed ban on real-time remote biometric identification (RBI) by law enforcement in the EU’s Artificial Intelligence Act, and a call from the EDPS and EDPB to implement a “general ban on any use of AI for an automated recognition of human features in publicly accessible space.”

The need to prohibit, rather than legalise, such practices has also been confirmed by the UN High Commissioner for Human Rights, who warned that it “dramatically increases the ability of State authorities to systematically identify and track individuals in public spaces, undermining the ability of people to go about their lives unobserved” and should therefore be limited or banned.

The proposal undermines the essence of a democratic society:

Mass surveillance is not just bad for individuals, but also for communities. The landmark Census judgement of the German Constitutional Court articulated the threats not only to people’s political rights and civil rights, but also to democracy and “the common good, because self-determination [which is harmed by mass surveillance] is an essential prerequisite for a free and democratic society that is based on the capacity and solidarity of its citizens.”

European and international human rights groups have raised the severe harms of biometric mass surveillance. Constant, invasive surveillance disincentivises people from protesting; suppresses anti-corruption efforts by making it harder for sources to blow the whistle anonymously; and has a general chilling effect on people’s rights and freedoms. Biometric mass surveillance systems have been used across Europe and the world to spy on groups including human rights defenders, LGBT+ communities and people going to church.

Lastly, the hurried manner in which this proposal has been brought forward is grave cause for concern. With the upcoming dissolution of the Portuguese Parliamentary Assembly, the government aims to push through this rights-violating proposal in a rushed manner and without public consultation. This prevents proper democratic scrutiny of the proposal and will undermine people’s trust in the legislative process.

We urge you to consider the rights and freedoms of the people of Portugal and your obligations under constitutional, EU and international law, to reject the proposed video surveillance law 111/XIV/2. We are at your disposal should you wish to discuss any of the above.

Yours sincerely, Diego Naranjo
Head of Policy, EDRi

Building on your support, we can show EU leaders that we do not support the use of technologies that turn our publicly accessible spaces into a permanent police line-up.


Help us grow stronger and sign our citizens’ initiative to ban biometric mass surveillance!

Facebook deleting facial recognition: Five reasons to take it with a pinch of salt

On 2 November, the company formerly known as Facebook announced that by the end of the year it will delete the entire database of facial templates used for automated photo tagging on the Facebook app. Yes, that Facebook – the notorious social media platform most recently in the news for a major whistleblowing scandal and a subsequent change of company name from Facebook to “Meta”.

Early reactions praised Facebook for this bold and surprising move. So, has Christmas come early in the digital rights world? Well, not so fast.


This move seems on the surface to be a good thing because it chips away at the group’s power and control over face data from around 13% of the world’s population. However, the reality is that things are not as rosy as Facebook would like you to think.

The latest Facebook announcement reveals exactly why voluntary “goodwill” self-regulation is superficial, and why we need strong EU rules in future legislation like the AI Act – as the Reclaim Your Face campaign demands. 

Here are five pinches of salt for your reality-check on Facebook deleting facial recognition:

1. The Facebook app will delete a database containing the face templates (“faceprints”) of over a billion people, which underpin the facial recognition system used to flag when people’s faces appear in photos and videos, for example for tagging purposes. But what about the underlying algorithm (the eerily named ‘DeepFace’) that powers this facial recognition? According to the New York Times, Facebook stated that DeepFace algorithms will be retained, and the company is very much open to using them in future products.

2. This means that whenever it suits their commercial interests, Facebook can flick the switch to turn their vast facial recognition capacity back on.

3. The Meta group’s initial statement does not say whether or not the database is the group’s (or even the app’s) only database used for identifying people, or whether they have others. As Gizmodo points out, the commitment doesn’t affect other Meta companies, such as Instagram, which will continue to use people’s biometric data.

4. Facebook has had a lot of bad press recently. So is this a convenient distraction to get praise from their long-time critics, the privacy community? It is probably also no coincidence that, as The Verge reports, this move comes after Facebook had to pay well over half a billion dollars in the US because the Face Recognition feature had been violating people’s privacy.

5. Meta’s press release outlines a desire by the company to do more with face authentication. People’s biometric data is always sensitive, and we increasingly see how authentication can pose serious risks to people’s privacy and equality rights as well as to their access to economic and public services. Given Facebook’s sprawling plans for a “metaverse”, their privacy-invading RayBan glasses, and their track record of massive and systemic privacy intrusions, we cannot trust that they will only use face data in benign and rights-respecting ways.

At its core, the Facebook app’s business model is based on exploiting your data. Far from being an all-out win, this move to delete their face recognition database shows more than ever why we simply cannot rely on the apparent ‘goodwill’ of companies in the place of rigorous regulation. When companies self-regulate, they also have the power to de-regulate as and when they wish.

As Amnesty International’s Dr. Matt Mahmoudi points out, the truly good news in this story is that the international pressure against facial recognition – thanks to movements like Reclaim Your Face and Ban The Scan – is making companies sweat. Mass facial recognition is becoming less socially acceptable as people become more and more aware of its inherent dangers. Much like IBM’s vague 2019 commitment to end general-purpose facial recognition and Amazon’s recently-extended self-imposed pause on police use of its Rekognition facial recognition service, it is naive at best to expect that companies will sufficiently rein in their harmful uses of facial recognition and other biometric data.

That’s why Reclaim Your Face continues to fight for a world free from pervasive facial recognition and other forms of biometric mass surveillance.

All EU citizens can sign our official EU initiative, which calls to ban these practices and to hold companies like Facebook to account.

Serbia withdraws a proposed Biometric Surveillance Bill following national and international pressure

On 23 September, the Serbian Minister of Interior Aleksandar Vulin announced that the Draft Law on Internal Affairs, which contained provisions legalising a massive biometric video surveillance system, had been withdrawn from further procedure. This turn of events represents a key victory in SHARE Foundation’s two-and-a-half-year-long battle against smart cameras in Belgrade, which were installed by the Ministry of Interior and supplied by Chinese tech giant Huawei.

During public consultations, SHARE Foundation sent comments on the Draft Law, which put Serbia in danger of becoming the first country in Europe with permanent indiscriminate surveillance of citizens in public spaces. EDRi’s Brussels office also warned the Serbian government of the dangers to privacy and other human rights if such a law was passed.

Gathering and inspiring the community

This success would not have been possible without SHARE’s community and international partners, such as the EDRi office and other member organisations, as well as related initiatives such as Reclaim Your Face and Ban Biometric Surveillance. Our battle was also supported by Members of the European Parliament, who put international pressure on the Serbian government.

Huge awareness raising efforts were needed to highlight the importance of this issue, especially in a society with low privacy priorities such as Serbia. Through SHARE’s initiative called “Thousands of Cameras” (#hiljadekamera), we gathered a community of experts with various backgrounds (law, policy, tech, art, activism) as well as citizens worried about the implications of biometric surveillance in our streets and public spaces. Actions like “hunt for cameras”, where we called upon citizens to map smart cameras in their neighbourhoods, an online petition against biometric surveillance and a crowdfunding campaign for “Thousands of Cameras” have all shown that the fight against biometric surveillance can mobilise people effectively. The datathon organised to make a one-stop platform for the “Thousands of Cameras” initiative was a milestone that enabled us to keep pushing against this dangerously intrusive technology.

Lessons learned 

One of the key preconditions for success was the new Law on Personal Data Protection, modelled after the GDPR, which requires a Data Protection Impact Assessment (DPIA) to be conducted and approved by the Commissioner for Personal Data Protection before intrusive data processing mechanisms are put in place. On this basis, the Commissioner rejected the DPIAs conducted by the Ministry of Interior on two occasions, finding that such a system lacked an adequate legal basis.

International standards and opinions on biometric surveillance provided by bodies such as the UN, Council of Europe and European Union institutions all provided valid points on why such technologies should be banned in any society aspiring towards democracy and the full respect of human rights.

However, SHARE also found that a multidisciplinary approach to the topic was necessary. A legal angle alone is inadequate for arguing against such a controversial issue; it needs to be tackled from different perspectives, such as human rights concerns, technological aspects (techno-solutionism) and the impact on citizens’ everyday lives, particularly in vulnerable communities.

Getting the message across via the media was also instrumental. In the past couple of years, over 300 news articles have been written about biometric surveillance in Belgrade, in both domestic and international media. In that respect, it is of utmost importance to forge partnerships with media and journalists interested in the topic, as they can contribute immensely to spreading awareness and mobilising people.


Read More

Our voices have been heard: European Parliament calls for a ban on biometric mass surveillance!

In a huge victory for human rights, the European Parliament has just voted to adopt a new report which calls to ban biometric mass surveillance. This is a key moment for the Reclaim Your Face campaign, because, although the report is not legally binding, it gives a strong indication of the Parliament’s position on the ‘Artificial Intelligence Act’.

Over 61,000 EU citizens have already signed our official initiative to ban biometric mass surveillance practices in EU law. Now, we have clear evidence that our voices have been heard! In what’s known as an own-initiative report (INI), the European Parliament decided to proactively set out their vision that police should use artificial intelligence technologies only in ways that respect people’s human rights and freedoms. This includes a demand to ban biometric mass surveillance, which is one of the most powerful and progressive calls we have seen from politicians or lawmakers anywhere in the world. Specifically, the report:

  • Warns about the severe risks of police uses of facial authentication / verification, and the need for such applications to be necessary and proportionate (§ 25);
  • Calls for a moratorium (time-limited suspension) on any facial identification by police until it can be proven to be fundamental rights-compliant. If this cannot be proven, it must be banned (§ 27) (see the final bullet point for an even stronger outcome on any facial ID that leads to mass surveillance);
  • For other biometric features, demands ‘a permanent prohibition of the use of automated analysis and/or recognition in publicly accessible spaces of other human features, such as gait, fingerprints, DNA, voice, and other biometric and behavioural signals’ (§ 26);
  • Recommends a ban on the use of private databases, like Clearview AI, by law enforcement (§ 28);
  • And the pièce de résistance: ‘calls on the Commission, therefore, to implement, through legislative and non-legislative means, and if necessary through infringement proceedings, a ban on any processing of biometric data, including facial images, for law enforcement purposes that leads to mass surveillance in publicly accessible spaces’ as well as a ban on funding mass surveillance research (§ 31).

These are not the only exciting parts of the report. It also frames AI harms as structural, pointing to the severe risks for racialised and minoritised people. It even calls for a prohibition on discriminatory predictive policing practices, which rob people of the presumption of innocence.

This report matters so much because it gives the Parliament’s lead negotiators a clear message from their colleagues to push for a ban on biometric mass surveillance in their position on the AI Act, which they will have to negotiate with representatives from every EU member state’s government.

Whilst we celebrated the AI Act for its in-principle ban on real-time remote biometric identification by law enforcement, we called out the enormous holes in this so-called ‘ban’ and the fact that it would not prevent biometric mass surveillance. Now, we have a chance to ensure that the Act really does fulfil its promise to protect individuals, communities, and democracies from the threat of constant biometric surveillance.

Many organisations within the Reclaim Your Face campaign joined the push to help overturn an attempt from some members of the European Parliament (in particular from the right-wing EPP group) to weaken the report and explicitly allow biometric mass surveillance. Today, we celebrate and thank the brave MEPs that stood up for rights and freedoms. Tomorrow, we continue the fight to ban biometric mass surveillance and reclaim our faces!

Read the full text (EN) of the adopted report and check for other language versions.

Why is facial recognition a Roma and Sinti rights issue?

The rights of Romani people should be an important topic for anyone that cares about digital rights. In this blog, hear from experts in Roma, Sinti and digital rights about why facial recognition is an important issue (and what the rest of the digital rights community can learn), and check out the Reclaim Your Face campaign’s first ever resource in the Sinti language!

Roma and Sinti Rights are Digital Rights!

The 8th of April 2021 marked 50 years since the first World Romani Congress, an event which to this day signifies a celebration of Romani lives and culture, but also the barriers to rights, equal treatment and inclusion that are still put in the way of Roma, Sinti, Travellers and other Romani groups* across the world. With most areas of our lives increasingly turning ‘digital’, the purported benefits and opportunities of digitalisation can equally become additional inequalities for Romani people who have typically been shut out from access to digital skills and careers.

Today, there are at least 10-12 million Romani people across Europe, making Romani people Europe’s largest ethnic ‘minority’. And yet as groups such as Equinox Racial Justice Initiative have pointed out, the experiences and expertise of minoritised people like Roma and Sinti have been conspicuously absent in European policy debates and decisions. Take, for example, the recent EU consultation on artificial intelligence (AI) which came before the highly-anticipated proposal for a law on AI, but suffered from a lack of meaningful consultation with historically-marginalised communities from across Europe.

Biometric mass surveillance (BMS) is so harmful that no safeguards can make it safe. BMS means the use of systems – like facial recognition – which exploit people’s faces, bodies and behaviours to track, analyse and monitor them in public spaces. We have already seen it used in European parks, train stations, shopping centres, airports, schools and football stadiums. It can suppress each of our rights and freedoms to live freely and without the fear of being watched and judged at all times.

For historically marginalised communities like Roma and Sinti, BMS can single them out in ways that exacerbate already high levels of discrimination and exclusion. Romani people may also be especially sensitive to the ways in which BMS is based on an analysis of people’s facial proportions in order to put them in arbitrary boxes such as their predicted race, gender or even whether they seem suspicious or aggressive. Such practices have strong parallels, for example, with how the Nazi regime used biometric data to persecute and kill Romani people during the Holocaust. In recent years, data about Romani people have been used in a wide variety of other harmful ways. Read on to learn more about this from Roma and Sinti experts in digitalisation, Roxy and Benjamin.

The RYF x International Romani Day 2021 webinar

We commemorated International Romani Day 2021 by speaking with Roxanna Lorraine-Witt and Benjamin Ignac about the intersection of Roma and Sinti rights with the rise of facial recognition and other forms of biometric mass surveillance.

Roxy and Benjamin are experts on issues of data, digitalisation and Romani rights, and they spoke to us to explore what biometric mass surveillance could mean for Roma and Sinti communities. They also spoke about how including Romani experiences and expertise can strengthen the digital rights movement and help drive resistance against biometric mass surveillance and other rights-violating practices:

A screenshot of the webinar recording

Please note that by clicking on this video, it will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by their own terms of service.

“I hate that I need to live in a world where I feel like I have to hide my Roma identity because this very identity can be used against me […] Having governments using this identity or data about Roma in that way is totally unacceptable. We should be proud of our identity […] [But] we have plenty of examples that in the wrong hands, data about Roma will be used against us.”

Benjamin Ignac

You can read the written highlights of the discussion here.

Our first Reclaim Your Face resource in a Romani dialect: the Sinti language!

We have also been working with Franz-Elias Schneck, the creator of the very first history video in the Sinti language. Franz has produced a video for Reclaim Your Face to explain what biometric mass surveillance is, why it is an important issue for Sinti people, and how it links to systemic issues that Romani people have long faced, such as racist policing practices:

Biometric Mass Surveillance

Please note that by clicking on this video, it will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by their own terms of service.

We are especially excited that this video is available in the Sinti language, sometimes also called Rromanës-Sinti. It’s a type of Romani language which is most commonly spoken by Sinti people in Germany. If you don’t understand Sinti, do not worry: the Reclaim Your Face website homepage contains extensive explanations of what biometric mass surveillance is and why you should care about it. Plus, the site is available in 15 EU languages, with more coming soon – use the drop-down at the top of your screen to pick your preferred language.

The Romani Tea Room Podcast

After attending our International Romani Day webinar, the European Roma Rights Center invited Reclaim Your Face to feature on their Romani tea room podcast along with Benjamin, in an episode appropriately titled “You are being watched”. In the episode, host Sophie Datishvili points out that biometric mass surveillance practices – which EDRi and the Reclaim Your Face campaign have long warned are a major human rights issue – are likely to resonate with Roma because of the similarities to the discriminatory targeting via ethnic profiling that Romani people regularly face:

Whether you are Roma or not, the Romani Tea Room podcast is a powerful resource examining stereotypes, slavery, gender rights, surveillance and much more, all through the lens of Romani culture and experiences. Along with Sophie and Benjamin, we discussed intersections of facial recognition with issues of predictive policing; the EU’s notorious ‘iBorder CTRL’ project; systemic biases; and European “automated number plate recognition” technology – which has recently been exposed as having been linked to facial recognition identification in the Netherlands.

We also explored how the “invisible” nature of biometric mass surveillance means that it can cause harm to any of us without us even knowing it has happened – meaning that minoritised groups can often be most harmed due to structural discrimination, but that any person looking to walk around in public, attend a demonstration or even go to the shops can in fact be targeted. Because these practices are becoming more and more prominent in almost every European country, we agreed that there is a strong need to stop biometric mass surveillance before it goes any further, and therefore to prevent vast future harm to Romani and non-Romani people alike.

Watch, read, listen, learn, reflect – and act!

One thing has been clear to us as we have had the opportunity to speak with and learn from Roxy, Benjamin, Franz and Sophie: in law and policy-making, it is vital that everyone who is subject to laws and policies has a voice in shaping them and can contribute their expertise to them.

Romani experiences offer a critical and at times harrowing insight into why it’s so important that we resist biometric surveillance practices that differentiate between people based on their faces and bodies. If we want a Europe that is truly inclusive, it is important that we make sure that everybody has equal and equitable access to the opportunities and benefits of digitalisation, and that everyone is properly protected from the harm that can arise from the use of these technologies, too.

Biometric mass surveillance technologies are so invasive, and carry such huge potential for discrimination, that these severe harms vastly outweigh any potential benefit they could have.

If you have found this blog interesting, we encourage you to inform yourself about Romani rights, the powerful work of Romani organisations and grassroots movements across Europe, and the issue of biometric mass surveillance with the following resources. If you’re an EU citizen, you can also sign our official EU petition to add your voice to the call to ban biometric mass surveillance – either on our homepage or even at the bottom of this page!


A note on terminology
There is no single type of “Romani” person. Throughout this article we use the terms “Roma” and “Sinti” as nouns to refer to specific groups, although the term “Roma” can also be used more broadly to refer to all Romani groups. We use “Romani” as an adjective to describe association to all groups in the Romani community. There are many different Romani groups across Europe, with often distinct dialects and cultures. To educate yourself, check out our recommendations below.

Read and Learn:

Watch or listen back:

Act:



ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now ARTICLE19 Bits of Freedom CCC Defesa dos Direitos Digitais (D3) Digitalcourage Digitale Gesellschaft CH Digitale Gesellschaft DE Državljan D EDRi Electronic Frontier Finland epicenter.works Hermes Center for Transparency and Digital Human Rights Homo Digitalis IT-Political Association of Denmark IuRe La Quadrature du Net Liberties Metamorphosis Foundation Panoptykon Foundation Privacy International SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch AlgorithmWatch/CH All Out Amnesty International Anna Elbe Aquilenet Associazione Luca Coscioni Ban Facial Recognition Europe Big Brother Watch Certi Diritti Chaos Computer Club Lëtzebuerg (C3L) CILD D64 Danes je nov dan Datapanik Digitale Freiheit DPO Innovation Electronic Frontier Norway European Center for Not-for-profit Law (ECNL) European Digital Society Eumans Football Supporters Europe Fundación Secretariado Gitano (FSG) Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung Germanwatch German acm chapter Gesellschaft Fur Informatik (German Informatics Society) GONG Hellenic Association of Data Protection and Privacy Hellenic League for Human Rights info.nodes irish council for civil liberties JEF, Young European Federalists Kameras Stoppen Ligue des droits de L'Homme (FR) Ligue des Droits Humains (BE) LOAD e.V. Ministry of Privacy Privacy First Privacy Lx Privacy Network Projetto Winston Smith Reporters United Saplinq Science for Democracy Selbstbestimmt.Digital STRALI Stop Wapenhandel The Good Lobby Italia UNI-Europa Unsurv Vrijbit Wikimedia FR Xnet


Reclaim Your Face is also supported by:

Jusos Piratenpartei DE Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left



Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.