Why is facial recognition a Roma and Sinti rights issue?

The rights of Romani people should be an important topic for anyone who cares about digital rights. In this blog, hear from experts in Roma, Sinti and digital rights about why facial recognition is an important issue (and what the rest of the digital rights community can learn), and check out the Reclaim Your Face campaign’s first ever resource in the Sinti language!

Roma and Sinti Rights are Digital Rights!

The 8th of April 2021 marked 50 years since the first World Romani Congress, an event which to this day signifies a celebration of Romani lives and culture, but also the barriers to rights, equal treatment and inclusion that are still put in the way of Roma, Sinti, Travellers and other Romani groups* across the world. With most areas of our lives increasingly turning ‘digital’, the purported benefits and opportunities of digitalisation can instead become additional inequalities for Romani people, who have typically been shut out of access to digital skills and careers.

Today, there are an estimated 10-12 million Romani people across Europe, making Romani people Europe’s largest ethnic ‘minority’. And yet, as groups such as the Equinox Racial Justice Initiative have pointed out, the experiences and expertise of minoritised people like Roma and Sinti have been conspicuously absent from European policy debates and decisions. Take, for example, the recent EU consultation on artificial intelligence (AI), which came before the highly-anticipated proposal for a law on AI, but suffered from a lack of meaningful consultation with historically-marginalised communities from across Europe.

Biometric mass surveillance (BMS) is so harmful that no safeguards can make it safe. BMS means the use of systems – like facial recognition – which exploit people’s faces, bodies and behaviours to track, analyse and monitor them in public spaces. We have already seen it used in European parks, train stations, shopping centres, airports, schools and football stadiums. It can suppress each of our rights and freedoms to live freely and without the fear of being watched and judged at all times.

For historically marginalised communities like Roma and Sinti, BMS can single them out in ways that exacerbate already high levels of discrimination and exclusion. Romani people may also be especially sensitive to the ways in which BMS is based on an analysis of people’s facial proportions in order to put them in arbitrary boxes such as their predicted race, gender or even whether they seem suspicious or aggressive. Such practices have strong parallels, for example, with how the Nazi regime used biometric data to persecute and kill Romani people during the Holocaust. In recent years, data about Romani people have been used in a wide variety of other harmful ways. Read on to learn more about this from Roma and Sinti experts in digitalisation, Roxy and Benjamin.

The RYF x International Romani Day 2021 webinar

We commemorated International Romani Day 2021 by speaking with Roxanna Lorraine-Witt and Benjamin Ignac about the intersection of Roma and Sinti rights with the rise of facial recognition and other forms of biometric mass surveillance.

Roxy and Benjamin are experts on issues of data, digitalisation and Romani rights, and they spoke to us to explore what biometric mass surveillance could mean for Roma and Sinti communities. They also spoke about how including Romani experiences and expertise can strengthen the digital rights movement and help drive resistance against biometric mass surveillance and other rights-violating practices:

A screenshot of the webinar recording

Please note that clicking on this video will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by its own terms of service.

“I hate that I need to live in a world where I feel like I have to hide my Roma identity because this very identity can be used against me […] Having governments using this identity or data about Roma in that way is totally unacceptable. We should be proud of our identity […] [But] we have plenty of examples that in the wrong hands, data about Roma will be used against us.”

Benjamin Ignac

You can read the written highlights of the discussion here.

Our first Reclaim Your Face resource in a Romani dialect: the Sinti language!

We have also been working with Franz-Elias Schneck, the creator of the very first history video in the Sinti language. Franz has produced a video for Reclaim Your Face to explain what biometric mass surveillance is, why it is an important issue for Sinti people, and how it links to systemic issues that Romani people have long faced, such as racist policing practices:

Biometric Mass Surveillance

Please note that clicking on this video will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by its own terms of service.

We are especially excited that this video is available in the Sinti language, sometimes also called Rromanës-Sinti. It’s a type of Romani language which is most commonly spoken by Sinti people in Germany. If you don’t understand Sinti, do not worry: the Reclaim Your Face website homepage contains extensive explanations of what biometric mass surveillance is and why you should care about it. Plus, the site is available in 15 EU languages, with more coming soon – use the drop-down at the top of your screen to pick your preferred language.

The Romani Tea Room Podcast

After attending our International Romani Day webinar, the European Roma Rights Center invited Reclaim Your Face to feature on their Romani Tea Room podcast along with Benjamin, in an episode appropriately titled “You are being watched”. In the episode, host Sophie Datishvili points out that biometric mass surveillance practices – which EDRi and the Reclaim Your Face campaign have long warned are a major human rights issue – are likely to resonate with Roma because of their similarity to the discriminatory ethnic profiling that Romani people regularly face:

Whether you are Roma or not, the Romani Tea Room podcast is a powerful resource examining stereotypes, slavery, gender rights, surveillance and much more, all through the lens of Romani culture and experiences. Along with Sophie and Benjamin, we discussed the intersections of facial recognition with predictive policing; the EU’s notorious ‘iBorderCtrl’ project; systemic biases; and European “automated number plate recognition” technology, which was recently revealed to have been linked with facial recognition identification in the Netherlands.

We also explored how the “invisible” nature of biometric mass surveillance means that it can harm any of us without us even knowing it has happened. Minoritised groups are often harmed the most due to structural discrimination, but any person walking around in public, attending a demonstration or even going to the shops can in fact be targeted. Because these practices are becoming more and more prominent in almost every European country, we agreed that there is a strong need to stop biometric mass surveillance before it goes any further, and thereby to prevent vast future harm to Romani and non-Romani people alike.

Watch, read, listen, learn, reflect – and act!

One thing has been clear to us as we have had the opportunity to speak with and learn from Roxy, Benjamin, Franz and Sophie: in law and policy-making, it is vital that everyone who is subject to laws and policies has a voice in shaping and contributing their expertise to those laws and policies.

Romani experiences offer a critical and at times harrowing insight into why it’s so important that we resist biometric surveillance practices that differentiate between people based on their faces and bodies. If we want a Europe that is truly inclusive, it is important that we make sure that everybody has equal and equitable access to the opportunities and benefits of digitalisation, and that everyone is properly protected from the harm that can arise from the use of these technologies, too.

Biometric mass surveillance technologies are so invasive, and carry such a huge potential for discrimination, that their severe harms vastly outweigh any potential benefit they could have.

If you have found this blog interesting, we encourage you to inform yourself about Romani rights, the powerful work of Romani organisations and grassroots movements across Europe, and the issue of biometric mass surveillance with the following resources. If you’re an EU citizen, you can also sign our official EU petition to add your voice to the call to ban biometric mass surveillance – either on our homepage or even at the bottom of this page!

A note on terminology
There is no single type of “Romani” person. Throughout this article we use the terms “Roma” and “Sinti” as nouns to refer to specific groups, although the term “Roma” can also be used more broadly to refer to all Romani groups. We use “Romani” as an adjective to describe association to all groups in the Romani community. There are many different Romani groups across Europe, with often distinct dialects and cultures. To educate yourself, check out our recommendations below.

Read and Learn:

Watch or listen back:


NEW REPORT: Biometric mass surveillance in Germany, the Netherlands, Poland

People outside LGBTQ+ venues, religious buildings, doctor’s surgeries and lawyers’ offices in the German city of Cologne may have had their faces illegally captured. The Dutch cities of Roermond, Eindhoven, Utrecht and Enschede have been turned into experimental “Living Labs”, capturing vast amounts of data about residents and apparently using it to profile them. The Polish COVID-19 quarantine app not only captured people’s faces without good reason, but this information may also have been abused by police to visit the homes of people no longer subject to quarantine rules. These are just some examples from a new report documenting evidence of biometric mass surveillance.

Read More

EU’s top privacy regulators Reclaim Their Faces!

Today, the two most important regulators for ensuring people’s rights to privacy and data protection across the EU announced their formal call to ban biometric mass surveillance practices! This is a significant development for our campaign and adds even more weight to the pressure that has been exerted by over 55,000 Reclaim Your Face supporters already.

The European Data Protection Supervisor (EDPS) is the watchdog for keeping EU institutions in check when it comes to the use of people’s personal data. Separately, the European Data Protection Board (EDPB) brings together representatives of each national data protection authority (DPA). DPAs are the authorities who keep watch over personal data in their own country, and are empowered to issue significant fines in the event of abuses. They are made up of data protection, technology and human rights experts – so their views on the topic of biometric surveillance are crucial and authoritative!

EDPB & EDPS call for ban on use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination

21 June 2021

The Reclaim Your Face coalition is especially excited to see the EDPS and EDPB’s reasoning for wanting to ban biometric mass surveillance – or in their words, to “ban [the] use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination.”

In particular, the groups highlight that there are “extremely high risks” posed by the use of these technologies in publicly accessible spaces due to the potential to obliterate people’s fundamental right to stay anonymous and to unduly restrict a very wide range of rights and freedoms. The EDPS and EDPB advocate that a ban should be the starting point when it comes to public biometric uses – in contrast to the European Commission’s recent proposal, which simply does not yet go far enough to protect people and communities.

If you haven’t signed it already, now is a perfect time to show your agreement with Reclaim Your Face, the EDPS and EDPB by officially supporting the formal EU initiative to ban biometric mass surveillance practices. This will help us translate the mounting evidence and legal analysis showing how harmful biometric mass surveillance is, into a real ban!

If you’re feeling like having a bit of fun, you can also use absurd comedy to make the very serious point about how facial recognition and other biometric surveillance in public spaces suppresses our rights and freedoms by joining the #PaperBagSociety challenge now!

Stay tuned for the deeper analysis by EDRi of the full EDPS and EDPB opinion, coming soon.

The Paper Bag Society challenge

The #PaperBagSociety is a social media challenge, part of the #ReclaimYourFace campaign, that invites everyone to share online the impact of living life with a paper bag on your head. With it, we aim to raise awareness of how ridiculous it is to have to avoid facial recognition technologies in public spaces, and why we need to build an alternative future, free from biometric mass surveillance.

In the past months, we’ve raised awareness of the dangers of biometric mass surveillance. Part of the process was also understanding how complex the systems that rely on biometric data are. We tried to find different ways to trick them, looking at facial recognition surveillance technologies deployed in our public spaces. The results are clear: as an individual, it is terribly difficult to trick biometric mass surveillance.

This is the reason why, at some point, one of the campaign organisers joked:

“Let’s just put a paper bag on our head and we’ll be safe from facial recognition surveillance”.

We ask: what would it be like to go about our daily lives with a paper bag on our heads? Do we need to use a paper bag to protect our faces from creepy recognition technologies? Is this the society we want to live in? In a world that remains dominated by ableism, it could be challenging to love, to cross the street, to merely interact.

Collectively, a #PaperBagSociety becomes a dystopian reality, a metaphor for the way biometric mass surveillance suppresses our choices, our speech and our freedoms. We realised this could be a great imagination exercise for anyone wanting to understand better why we need a world free from intrusive technologies that track our bodies and behaviour.

This is how the #PaperBagSociety challenge was born.

The #PaperBagSociety is a social media challenge that is part of the #ReclaimYourFace campaign. The challenge invites everyone to share on social media the impact of living life with a paper bag on your head.

Illustration and design: Valentina Carrasco

Using absurd comedy, this action aims to draw attention to why the heavy burden of avoiding creepy biometric surveillance technologies in public spaces should not fall on us, the people.

Instead, the action emphasises that an alternative future is possible. There are solutions to prevent a paper bag society: we must ban biometric mass surveillance across the EU and beyond.

Be part of the #PaperBagSociety challenge!

1. Go for a stroll in a publicly accessible space (a public square, a street, a train station, a supermarket, a cafe, a stadium, a shopping mall, etc.).

2. Put a paper bag on and try to live in the public space.

3. Take a video or a photo of the experience and share it on social media.

4. Make sure to tag #ReclaimYourFace & #PaperBagSociety and explain to your friends why we must ban biometric mass surveillance.

P.S. First and above all: make sure you don’t put yourself or others in danger. Keep it cool.

P.S.2: Are you lucky enough to be a citizen of an EU country? VOTE to BAN biometric mass surveillance!

Workplace, public space: workers organising in the age of facial recognition

By Reclaim Your Face campaign coordinator European Digital Rights (EDRi) and campaign partner UNI Europa.
This article was originally published in Social Europe.

A spectre is haunting Europe—the spectre of workers’ surveillance. Under the cover of ‘emergency’ legislation against ‘terrorism’ or to contain the Covid-19 pandemic, surveillance of working people is continually being expanded and normalised, in the workplace and on our streets. This takes various forms, with various degrees of invasiveness. It can be Amazon spying on its workers in private Facebook groups or using Covid-19 health-tracking technology to keep tabs on at least 340,000 workers. And it can be facial recognition of employees working remotely, monitored by smaller employers.

Workers’ surveillance can also take the shape more broadly of facial-recognition systems all across Europe’s publicly accessible spaces. This can—among many other harms—suppress workers’ organising. Imposing an omnipresent sense of being watched, surveillance has a clear chilling effect on workers’ readiness to exercise their right to freedom of assembly.

Short-term appeal

Both corporations and governments may see short-term appeal in this wave of expansion of surveillance. While flashy gadgets help paint a picture of a techno-utopia, a key driver is the quest for control. 

The dominant positions consolidated by data corporations such as Amazon are only matched by their growing unaccountability. We cannot accept their logic of treating people as a threat—let alone see that adopted by governments. That would accelerate the erosion of trust we are seeing in our societies. 

The deployment of facial recognition (or other biometric surveillance technologies) by governments or companies is an imminent threat to workplace democracy. Amid a protest outside the headquarters of a company, CCTV cameras could be repurposed for union-busting. Secret algorithms, lacking a scientific basis, could be applied to mark and identify ‘troublemakers’. Faced with marching in the streets, police could use footage and software to quickly identify and target leaders—as well as people supporting or even reporting on the demonstration.

Hard-won rights and freedoms are not negotiable. Ramping up surveillance of individuals is not the way to address collective grievances. Clear boundaries must be set and the collective say of relevant groups of the population over decisions affecting them must be strengthened.

Mass surveillance, at work and beyond, suffocates legitimate social discourse without assuaging the need for it. The history of industrial relations shows time and again that brushing issues under the carpet inevitably results in them re-emerging down the line—often with much more serious consequences.

‘Security-industrial complex’

The documented partnership between private and public surveillance actors in the European Union has been described by the longstanding civil-liberties organisation Statewatch as the ‘security-industrial complex’. Together with the threats to workers’ organising, this phenomenon shows why the EU’s new artificial-intelligence regulation must go beyond what the European Commission has proposed—and ban biometric mass surveillance, by both private and public actors.

Furthermore, the use of AI in the workplace—in recruitment, appraisals and so on—as with all workplace-related rules must be subject to collective-bargaining agreements. The commission’s draft would only require self-evaluation by the companies selling these technologies, leaving the fox to guard the henhouse.

Workers’ rights will be won, as with any other human rights, by collective struggles: collective bargaining or striking at the workplace and protesting on the streets will remain the main tools. But in an increasingly digitalised society, where emails, conversations, faces and bodies can be put under constant surveillance, many will fear for their job security and may choose to stay put rather than become visible.

This is why banning biometric mass surveillance in public spaces and strongly legislating to control harmful AI technologies in the workplace are key to advancing workers’ rights.

Karlsruhe win against biometric mass surveillance in Germany

By ReclaimYourFace campaign lead organisation Chaos Computer Club (CCC)

In November 2020, reporters revealed that the German city of Karlsruhe wanted to establish a smart video surveillance system in the city centre. The plan involved an AI system that would analyse the behaviour of passers-by and automatically flag conspicuous behaviour. The biometric mass surveillance system was presented by authorities as “data protection compliant video surveillance”. After the intervention of EDRi member Chaos Computer Club (CCC), whose Karlsruhe chapter is also known as Entropia, the project was buried in May 2021. This success adds to previous wins by EDRi members involved in the ReclaimYourFace campaign, which calls on the EU to ban biometric mass surveillance across all EU countries.

Read More

Can a COVID-19 face mask protect you from facial recognition technology too?

Andreea Belu, Campaigns and Communications Manager at EDRi
Harmit Kambo, Campaigns Director at Privacy International

Our relationship with ‘public space’ is being redefined, not just by a global pandemic, but also by a new era of biometric surveillance technologies. Biometric mass surveillance enables companies and authorities to track us based on unique personal data and identify us whenever, wherever we go.

The increasing use of facial recognition and other biometric surveillance technologies – on our streets, in train stations, at protests, at sports matches and even in our global ‘town square’, Facebook – means that our freedom to be anonymous in public spaces, our freedom to just be, really does face an existential threat.

Mass facial recognition risks our collective futures and shapes us into fear-driven societies of suspicion.

This got folks at EDRi and Privacy International brainstorming. Could the masks that we now wear to protect each other from Coronavirus also protect our anonymity, preventing the latest mass facial recognition systems from identifying us?

A COVID-19 facemask that tricks facial recognition? Let’s try.

At first, it seemed an easy task. After all, a face mask covers most of your face already.

However, we already know that facial recognition systems are remarkably sophisticated, and can identify (or misidentify) people with relatively little ‘face data’. In fact, while we were working on this project, new revelations showed that a normal face mask doesn’t help you ‘evade’ facial recognition.

Given that anti-surveillance clothing and make-up has had some success in confusing facial recognition systems in the past, we thought that it would be worth experimenting with a similar approach with a face mask pattern.

In looking for the right pattern design, we sought the advice of Tijmen Schep, a researcher and artist with experience in tricking facial recognition algorithms.

More difficult than it seems – the tech is too creepy.

Mask design: Ed Grace

And that’s where it got complicated. Here are just some of the challenges we encountered in our journey:

  • Every facial recognition system works in a different way. This means that a design that might ‘fool’ one system might not fool another.
  • Most facial recognition algorithms are opaque ‘black boxes’. The lack of transparency in the use of facial recognition makes it difficult to understand what is being used, how and where. As Adam put it: “There is really no such thing as a mask that blocks ‘face recognition’ but rather a mask that blocks, for example, a ResNet-50 CNN model that was trained on VGG Face2 dataset.”
  • Masks have a small surface area. A pattern that might work on an item like a t-shirt might not be effective in confusing facial recognition when used on a small item like a mask.
  • Stopping face detection, let alone recognition. We tried to create a pattern that would confuse the tech in such a way that it wouldn’t even perceive a face, let alone be able to recognise it – for example, a pattern made up of simple black and white geometric forms. However, face detection technologies are so good these days that we soon realised this approach was unlikely to fool them.
  • Face detected. Stop recognition? So we then considered an alternative approach. Instead of geometric forms, what about using warped face-like features, perhaps in a mixture of natural and ‘unnatural’ colours, to mislead the system or at least lower the confidence of a match score? We were optimistic about this, but then we got stuck in the black box problem: we could only know its effectiveness by testing it across a vast range of currently opaque facial recognition technologies, which are constantly being updated and increasingly sophisticated.

What did we learn?

Here’s what we learnt from our exploration into stopping intrusive facial recognition technologies from detecting or recognising our faces when wearing a face mask.

  • We do not know whether it is possible to design a face mask that can protect your anonymity against facial recognition technologies.
  • What we know for sure is that it is virtually impossible to know whether even our most ‘educated guess’ would be totally effective, partially effective, or totally ineffective against all, some, or indeed any facial recognition systems.
  • The small acts of resistance we might take against these powerful technologies are not a substitute for structural, long-lasting solutions. Therefore…
  • …People should not carry the burden of masking themselves to maintain their anonymity. Instead, governments must ensure unlawful biometric mass surveillance such as mass facial recognition is banned in our publicly accessible spaces.

Why did we produce the mask anyway?

The mask is a symbol of resistance against the growing use of mass facial recognition. Wearing this mask means wearing the story of resistance. Wearing this mask means standing up to power, ready for collective action, ready to #ReclaimYourFace.

Since October 2020, a growing coalition of now 61 civil society organisations have mobilised across Europe with one message: “The EU must ban biometric mass surveillance!”

The solutions we ultimately need are not clever facial recognition-thwarting face masks. Instead, we need politicians and decision-makers who do their job right. We need strong regulation against the use of biometric mass surveillance – across the EU and beyond.

Join the movement, make your voice heard today.

Italy wins: DPA blocks facial recognition system and MP proposes moratorium

By Laura Carrer (research & advocacy, Italian campaign lead organisation Hermes Center) and Riccardo Coluccini (Italian campaign contributor)

On Friday 16 April, the Italian Data Protection Authority (DPA) rejected the SARI Real Time facial recognition system acquired by the police. The DPA argues that the system lacks a legal basis and, as designed, would implement a form of mass surveillance. Two days earlier, on 14 April, a member of the Italian Parliament, Filippo Sensi, proposed a moratorium on the use of video surveillance tools that use facial recognition.

Read More

61 MEPs urge the EU to ban biometric mass surveillance!

This week, 61 Members of the European Parliament (MEPs) from across the political spectrum, including representatives of the Greens, the Left, Renew Europe, the Socialists & Democrats (S&D) and the Identity & Democracy (ID) groups raised their voices in agreement with the Reclaim Your Face campaign. They wrote not one, but two powerful letters, which made their position clear to the European Commission (EC): discriminatory biometric mass surveillance practices can have no place in our societies! The letters come only a few days before the EC’s publication of a new and long-awaited EU law on artificial intelligence.

Read More

ReclaimYourFace is a movement led by civil society organisations across Europe:

Access Now ARTICLE19 Bits of Freedom CCC Defesa dos Direitos Digitais (D3) Digitalcourage Digitale Gesellschaft CH Digitale Gesellschaft DE Državljan D EDRi Electronic Frontier Finland Hermes Center for Transparency and Digital Human Rights Homo Digitalis IT-Political Association of Denmark IuRe La Quadrature du Net Liberties Metamorphosis Foundation Panoptykon Foundation Privacy International SHARE Foundation
In collaboration with our campaign partners:

AlgorithmWatch AlgorithmWatch/CH All Out Amnesty International Anna Elbe Aquilenet Associazione Luca Coscioni Ban Facial Recognition Europe Big Brother Watch Certi Diritti Chaos Computer Club Lëtzebuerg (C3L) CILD D64 Danes je nov dan Datapanik Digitale Freiheit DPO Innovation Electronic Frontier Norway European Center for Not-for-profit Law (ECNL) European Digital Society Eumans Football Supporters Europe Fundación Secretariado Gitano (FSG) Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung Germanwatch German ACM chapter Gesellschaft für Informatik (German Informatics Society) GONG Hellenic Association of Data Protection and Privacy Hellenic League for Human Rights info.nodes Irish Council for Civil Liberties JEF, Young European Federalists Kameras Stoppen Ligue des droits de l'Homme (FR) Ligue des Droits Humains (BE) LOAD e.V. Ministry of Privacy Privacy First Privacy Lx Privacy Network Progetto Winston Smith Reporters United Saplinq Science for Democracy Selbstbestimmt.Digital STRALI Stop Wapenhandel The Good Lobby Italia UNI-Europa Unsurv Vrijbit Wikimedia FR Xnet

Reclaim Your Face is also supported by:

Jusos Piratenpartei DE Pirátská Strana

MEP Patrick Breyer, Germany, Greens/EFA
MEP Marcel Kolaja, Czechia, Greens/EFA
MEP Anne-Sophie Pelletier, France, The Left
MEP Kateřina Konečná, Czechia, The Left

Should your organisation be here, too?
Here's how you can get involved.
If you're an individual rather than an organisation, or your organisation type isn't covered in the partnering document, please get in touch with us directly.