On 18 November, three of the organisations that have long championed the Reclaim Your Face campaign – Digitale Gesellschaft (CH), Algorithm Watch CH and Amnesty International (CH) – co-launched a brand new and exciting action in the fight to curtail the sinister rise of biometric mass surveillance practices across Europe!
The organisers explain why action is needed in Switzerland:
“We have the right to move freely in public places without anyone knowing what we are doing. But automatic facial recognition allows us to be identified in the street at any time. We want to prevent such mass surveillance. Take a stand against automated facial recognition in public places in Swiss cities! Sign our petition today.”
So what are you waiting for? If you live or work in Switzerland, let the government know that you want to be treated as a person, not a walking barcode:
The newly-agreed German government coalition has called for a Europe-wide ban on public facial recognition and other biometric surveillance. This echoes the core demands of the Reclaim Your Face campaign which EDRi has co-led since 2020, through which over 65 civil society groups ask the EU and their national governments to outlaw biometric data mass surveillance.
Whilst the European Parliament has been fighting bravely for the rights of everyone in the EU to exist freely and with dignity in publicly accessible spaces, the government of Portugal is attempting to push their country in the opposite direction: one of digital authoritarianism.
The Portuguese lead organisation in the Reclaim Your Face coalition, D3 (Defesa Dos Direitos Digitais), is raising awareness of how the Portuguese government’s newly proposed video surveillance and facial recognition law amounts to illiberal biometric mass surveillance. Worse still, ministers are trying to rush the law through Parliament, endangering the very foundations of democracy on which the Republic of Portugal rests.
Eerily reminiscent of the failed attempts by the Serbian government just two months ago to rush in a biometric mass surveillance law, Portugal’s government has now asked its Parliament to approve a law in a shocking absence of democratic scrutiny. Just two weeks before the national Assembly is dissolved, the government wants Parliamentarians to quickly approve the law without public consultation or evidence. The law would enable and encourage widespread biometric mass surveillance – even though we have repeatedly shown just how harmful these practices are.
Reclaim Your Face lead organisation EDRi sent a letter to representatives of Portugal’s main political parties, supporting D3’s fight against biometric mass surveillance practices that treat each and every person as a potential criminal. Together, we urged politicians to reject this dystopian law.
We want to express our deep concern about the Proposed Law 111/XIV/2 on the use of video surveillance by security forces and services. Despite providing no evidence of effectiveness, necessity or proportionality of these measures, the proposal puts forward sweeping measures which would permit the constant video and biometric mass surveillance of each and every person.
There are many reasons why this proposal is likely to be incompatible with the essence of Portugal’s constitutional obligations to ensure that restrictions on fundamental rights are necessary and proportionate (article 18/2); with Portugal’s obligations under the Charter of Fundamental Rights of the European Union (including but not limited to articles 1, 7, 8, 11, 12, 20, 21, 41, 47, 48 and 49); and the European Convention on Human Rights (ECHR).
The proposed law 111/XIV/2:
1. Removes current legal safeguards limiting the use of invasive video surveillance, such that the permanent and nearly omnipresent use of these systems in publicly accessible spaces may be permitted;
2. Permits video surveillance by aerial drones without limits, further creating possibilities for pervasive public surveillance; and
3. Establishes that these vast video surveillance networks may be combined with facial recognition and other AI-based systems in public spaces. Such practices enable the pervasive tracking of individuals, and can thus unduly interfere with a wide range of people’s rights, including the rights to privacy and data protection; to express, associate and assemble freely; to equality, non-discrimination and dignity; and to the presumption of innocence and other due process rights.
Furthermore, the proposal recklessly removes existing safeguards:
Law 111/XIV/2 proposes to withdraw vital powers from the national data protection authority, the Comissão Nacional de Protecção de Dados (CNPD). This means not only that the government has proposed measures which contradict Portugal’s data protection obligations, but also that the very authority designated to protect people from undue violations of their rights will be deliberately prevented from carrying out its vital public duties. The CNPD has called this proposal a “gross violation of the principle of proportionality” and has emphasised that it is likely incompatible with the Portuguese Constitution.
The proposal enables biometric mass surveillance practices:
Another particular risk arises from the fact that the proposal requires the processing of especially sensitive data. People’s biometric data, such as their facial features, are central to their personal identity and can sometimes reveal their protected characteristics. Processing such data can therefore infringe on rights to dignity, equality and non-discrimination, autonomy and self-determination.
The proposal is at odds with the European Parliament and the United Nations:
The need to prohibit, rather than legalise, such practices has also been confirmed by the UN High Commissioner for Human Rights, who warned that it “dramatically increases the ability of State authorities to systematically identify and track individuals in public spaces, undermining the ability of people to go about their lives unobserved” and should therefore be limited or banned.
The proposal undermines the essence of a democratic society:
Mass surveillance is not just bad for individuals, but also for communities. The landmark Census judgement of the German Constitutional Court articulated the threats not only to people’s political rights and civil rights, but also to democracy and “the common good, because self-determination [which is harmed by mass surveillance] is an essential prerequisite for a free and democratic society that is based on the capacity and solidarity of its citizens.”
European and international human rights groups have raised the severe harms of biometric mass surveillance. Constant, invasive surveillance disincentivises people from protesting; suppresses anti-corruption efforts by making it harder for sources to blow the whistle anonymously; and has a general chilling effect on people’s rights and freedoms. Biometric mass surveillance systems have been used across Europe and the world to spy on groups including human rights defenders, LGBT+ communities and people going to church.
Lastly, the hurried manner in which this proposal has been brought forward is grave cause for concern. With the upcoming dissolution of the Portuguese Parliamentary Assembly, the government aims to push through this rights-violating proposal in a rushed manner and without public consultation. This prevents proper democratic scrutiny of the proposal and will undermine people’s trust in the legislative process.
We urge you to consider the rights and freedoms of the people of Portugal and your obligations under constitutional, EU and international law, to reject the proposed video surveillance law 111/XIV/2. We are at your disposal should you wish to discuss any of the above.
Yours sincerely, Diego Naranjo
Head of Policy, EDRi
Building on your support, we can show EU leaders that we do not support the use of technologies that turn our publicly accessible spaces into a permanent police line-up.
Help us grow stronger and sign our citizens’ initiative to ban biometric mass surveillance!
Facebook deleting facial recognition: Five reasons to take it with a pinch of salt
On 2 November, the company formerly known as Facebook announced that, by the end of the year, it will delete the entire database of facial templates used for automated photo tagging on the Facebook app. Yes, that Facebook – the notorious social media platform most recently in the news for a major whistleblowing scandal and a subsequent change of company name from Facebook to “Meta”.
Early reactions praised Facebook for this bold and surprising move. So, has Christmas come early in the digital rights world? Well, not so fast.
This move seems on the surface to be a good thing because it chips away at the group’s power and control over face data from around 13% of the world’s population. However, the reality is that things are not as rosy as Facebook would like you to think.
The latest Facebook announcement reveals exactly why voluntary “goodwill” self-regulation is superficial, and why we need strong EU rules in future legislation like the AI Act – as the Reclaim Your Face campaign demands.
Here are five pinches of salt for your reality-check on Facebook deleting facial recognition:
1. The Facebook app will delete a database containing the face templates (“faceprints”) of over a billion people, which underpin the facial recognition system used to flag when people’s faces appear in photos and videos, for example for tagging purposes. But what about the underlying algorithm (the eerily named ‘DeepFace’) that powers this facial recognition? According to the New York Times, Facebook stated that DeepFace algorithms will be retained, and the company is very much open to using them in future products.
2. This means that whenever it suits their commercial interests, Facebook can flick the switch to turn their vast facial recognition capacity back on.
3. The Meta group’s initial statement does not say whether or not the database is the group’s (or even the app’s) only database used for identifying people, or whether they have others. As Gizmodo points out, the commitment doesn’t affect other Meta companies, such as Instagram, which will continue to use people’s biometric data.
4. Facebook has had a lot of bad press recently. So is this a convenient distraction to get praise from their long-time critics, the privacy community? It is probably also no coincidence that, as The Verge reports, this move comes after Facebook had to pay well over half a billion dollars in the US because the Face Recognition feature had been violating people’s privacy.
5. Meta’s press release outlines a desire by the company to do more with face authentication. People’s biometric data is always sensitive, and we increasingly see how authentication can pose serious risks to people’s privacy and equality rights as well as to their access to economic and public services. Given Facebook’s sprawling plans for a “metaverse”, their privacy-invading RayBan glasses, and their track record of massive and systemic privacy intrusions, we cannot trust that they will only use face data in benign and rights-respecting ways.
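To see why deleting a template database while keeping the underlying algorithm matters (point 1 above), here is a minimal, purely illustrative sketch of how face-template matching generally works. The names, numbers and threshold are all hypothetical toy values for explanation – this is not Facebook’s actual system or data:

```python
import math

def cosine_similarity(a, b):
    """Compare two face embeddings ("faceprints"): fixed-length vectors
    that a neural network derives from a face image. Higher = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical stored template and two new embeddings (toy 4-dimensional
# vectors; real systems use hundreds of dimensions).
stored_template = [0.10, 0.90, 0.30, 0.40]   # template kept in the database
same_person     = [0.12, 0.88, 0.31, 0.41]   # new photo of the same face
stranger        = [0.90, 0.10, 0.80, 0.20]   # someone else's face

THRESHOLD = 0.95  # arbitrary illustrative cut-off

# Deleting stored_template removes the ability to match *this* person today,
# but keeping the embedding model means new templates can be generated at will.
print(cosine_similarity(stored_template, same_person) > THRESHOLD)   # True
print(cosine_similarity(stored_template, stranger) > THRESHOLD)      # False
```

The point of the sketch: the database (the stored templates) and the algorithm (the embedding model plus this comparison step) are separable. Deleting one while retaining the other means the full capability can be rebuilt whenever the company chooses.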
At its core, the Facebook app’s business model is based on exploiting your data. Far from being an all-out win, this move to delete their face recognition database shows more than ever why we simply cannot rely on the apparent ‘goodwill’ of companies in the place of rigorous regulation. When companies self-regulate, they also have the power to de-regulate as and when they wish.
As Amnesty International’s Dr. Matt Mahmoudi points out, the truly good news in this story is that the international pressure against facial recognition – thanks to movements like Reclaim Your Face and Ban The Scan – is making companies sweat. Mass facial recognition is becoming less socially acceptable as people become more and more aware of its inherent dangers. But as IBM’s vague commitment to end general-purpose facial recognition and Amazon’s recently-extended self-imposed pause on law enforcement use of its Rekognition facial recognition service both show, it is naive at best to expect that companies will sufficiently rein in their harmful uses of facial recognition and other biometric data.
That’s why Reclaim Your Face continues to fight for a world free from pervasive facial recognition and other forms of biometric mass surveillance.
All EU citizens can sign our official EU initiative, which calls for a ban on these practices and for holding companies like Facebook to account.
Serbia withdraws a proposed Biometric Surveillance Bill following national and international pressure
During public consultations, SHARE Foundation sent comments on the Draft Law, which put Serbia in danger of becoming the first country in Europe with permanent indiscriminate surveillance of citizens in public spaces. EDRi’s Brussels office also warned the Serbian government of the dangers to privacy and other human rights if such a law was passed.
Huge awareness raising efforts were needed to highlight the importance of this issue, especially in a society with low privacy priorities such as Serbia. Through SHARE’s initiative called “Thousands of Cameras” (#hiljadekamera), we gathered a community of experts with various backgrounds (law, policy, tech, art, activism) as well as citizens worried about the implications of biometric surveillance in our streets and public spaces. Actions like “hunt for cameras”, where we called upon citizens to map smart cameras in their neighbourhoods, an online petition against biometric surveillance and a crowdfunding campaign for “Thousands of Cameras” have all shown that the fight against biometric surveillance can mobilise people effectively. The datathon organised to make a one-stop platform for the “Thousands of Cameras” initiative was a milestone that enabled us to keep pushing against this dangerously intrusive technology.
One of the key preconditions for success was the new Law on Personal Data Protection, modelled after the GDPR, which requires a Data Protection Impact Assessment (DPIA) to be conducted before intrusive data processing mechanisms are put in place and approved by the Commissioner for Personal Data Protection. This led to the Commissioner denying the DPIAs conducted by the Ministry of Interior on two occasions, citing that such a system lacked an adequate legal basis.
International standards and opinions on biometric surveillance provided by bodies such as the UN, Council of Europe and European Union institutions all provided valid points on why such technologies should be banned in any society aspiring towards democracy and the full respect of human rights.
However, SHARE also found that a multidisciplinary approach to the topic was necessary. A purely legal angle is inadequate for arguing against such a controversial measure. It needs to be tackled from different perspectives, such as human rights concerns, technological aspects (techno-solutionism) and the impact on citizens’ everyday lives, particularly in vulnerable communities.
Getting the message across via the media was also instrumental. In the past couple of years, over 300 news articles have been written about biometric surveillance in Belgrade, in both domestic and international media. In that respect, it is of utmost importance to forge partnerships with media and journalists interested in the topic, as they can contribute immensely to spreading awareness and mobilising people.
In a huge victory for human rights, the European Parliament has just voted to adopt a new report which calls to ban biometric mass surveillance. This is a key moment for the Reclaim Your Face campaign, because, although the report is not legally binding, it gives a strong indication of the Parliament’s position on the ‘Artificial Intelligence Act’.
Over 61,000 EU citizens have already signed our official initiative to ban biometric mass surveillance practices in EU law. Now, we have clear evidence that our voices have been heard! In what’s known as an own-initiative report (INI), the European Parliament decided to proactively set out their vision that police should use artificial intelligence technologies only in ways that respect people’s human rights and freedoms. This includes a demand to ban biometric mass surveillance, which is one of the most powerful and progressive calls we have seen from politicians or lawmakers anywhere in the world. Specifically, the report:
Warns about the severe risks of police uses of facial authentication / verification, and the need for such applications to be necessary and proportionate (§25);
Calls for a moratorium (time-limited suspension) of any facial identification by police until it can be proven as fundamental rights-compliant. If this cannot be proven, it must be banned (§27) (see the final bullet point for an even stronger outcome on any facial ID that leads to mass surveillance);
For other biometric features, demands ‘a permanent prohibition of the use of automated analysis and/or recognition in publicly accessible spaces of other human features, such as gait, fingerprints, DNA, voice, and other biometric and behavioural signals’ (§26);
Recommends a ban on the use of private databases, like Clearview AI, by law enforcement (§28);
And the pièce de résistance: ‘calls on the Commission, therefore, to implement, through legislative and non-legislative means, and if necessary through infringement proceedings, a ban on any processing of biometric data, including facial images, for law enforcement purposes that leads to mass surveillance in publicly accessible spaces’ as well as a ban on funding mass surveillance research (§31).
These are not the only exciting bits of the report. The report also takes a view of AI harms as structural, pointing to the severe risks for racialised and minoritised people. It even calls to prohibit discriminatory predictive policing practices, which rob people of the presumption of innocence.
This report matters so much because it gives the Parliament’s lead negotiators a clear message from their colleagues to push for a ban on biometric mass surveillance in their position on the AI Act, which they will have to negotiate with representatives from every EU member state’s government.
Many organisations within the Reclaim Your Face campaign joined the push to help overturn an attempt from some members of the European Parliament (in particular from the right-wing EPP group) to weaken the report and explicitly allow biometric mass surveillance. Today, we celebrate and thank the brave MEPs that stood up for rights and freedoms. Tomorrow, we continue the fight to ban biometric mass surveillance and reclaim our faces!
The rights of Romani people should be an important topic for anyone who cares about digital rights. In this blog, hear from experts in Roma, Sinti and digital rights about why facial recognition is an important issue (and what the rest of the digital rights community can learn), and check out the Reclaim Your Face campaign’s first ever resource in the Sinti language!
Roma and Sinti Rights are Digital Rights!
The 8th of April 2021 marked 50 years since the first World Romani Congress, an event which to this day signifies a celebration of Romani lives and culture, but also the barriers to rights, equal treatment and inclusion that are still put in the way of Roma, Sinti, Travellers and other Romani groups* across the world. With most areas of our lives increasingly turning ‘digital’, the purported benefits and opportunities of digitalisation can equally become additional inequalities for Romani people who have typically been shut out from access to digital skills and careers.
Today, there are at least 10-12 million Romani people across Europe, making Romani people Europe’s largest ethnic ‘minority’. And yet as groups such as Equinox Racial Justice Initiative have pointed out, the experiences and expertise of minoritised people like Roma and Sinti have been conspicuously absent in European policy debates and decisions. Take, for example, the recent EU consultation on artificial intelligence (AI) which came before the highly-anticipated proposal for a law on AI, but suffered from a lack of meaningful consultation with historically-marginalised communities from across Europe.
For historically marginalised communities like Roma and Sinti, biometric mass surveillance (BMS) can single them out in ways that exacerbate already high levels of discrimination and exclusion. Romani people may also be especially sensitive to the ways in which BMS is based on an analysis of people’s facial proportions in order to put them in arbitrary boxes such as their predicted race, gender or even whether they seem suspicious or aggressive. Such practices have strong parallels, for example, with how the Nazi regime used biometric data to persecute and kill Romani people during the Holocaust. In recent years, data about Romani people have been used in a wide variety of other harmful ways. Read on to learn more about this from Roma and Sinti experts in digitalisation, Roxy and Benjamin.
The RYF x International Romani Day 2021 webinar
We commemorated International Romani Day 2021 by speaking with Roxanna Lorraine-Witt and Benjamin Ignac about the intersection of Roma and Sinti rights with the rise of facial recognition and other forms of biometric mass surveillance.
Roxy and Benjamin are experts on issues of data, digitalisation and Romani rights, and they spoke to us to explore what biometric mass surveillance could mean for Roma and Sinti communities. They also spoke about how including Romani experiences and expertise can strengthen the digital rights movement and help drive resistance against biometric mass surveillance and other rights-violating practices:
Please note that by clicking on this video, it will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by their own terms of service.
“I hate that I need to live in a world where I feel like I have to hide my Roma identity because this very identity can be used against me […] Having governments using this identity or data about Roma in that way is totally unacceptable. We should be proud of our identity […] [But] we have plenty of examples that in the wrong hands, data about Roma will be used against us.”
Our first Reclaim Your Face resource in a Romani dialect: the Sinti language!
We have also been working with Franz-Elias Schneck, the creator of the very first history video in the Sinti language. Franz has produced a video for Reclaim Your Face to explain what biometric mass surveillance is, why it is an important issue for Sinti people, and how it links to systemic issues that Romani people have long faced, such as racist policing practices:
Please note that by clicking on this video, it will open an external link to the video on YouTube. YouTube engages in extensive data collection and processing practices that are governed by their own terms of service.
We are especially excited that this video is available in the Sinti language, sometimes also called Rromanës-Sinti. It’s a type of Romani language which is most commonly spoken by Sinti people in Germany. If you don’t understand Sinti, do not worry: the Reclaim Your Face website homepage contains extensive explanations of what biometric mass surveillance is and why you should care about it. Plus, the site is available in 15 EU languages, with more coming soon – use the drop-down at the top of your screen to pick your preferred language.
The Romani Tea Room Podcast
After attending our International Romani Day webinar, the European Roma Rights Center invited Reclaim Your Face to feature on their Romani tea room podcast along with Benjamin, in an episode appropriately titled “You are being watched”. In the episode, host Sophie Datishvili points out that biometric mass surveillance practices – which EDRi and the Reclaim Your Face campaign have long warned are a major human rights issue – are likely to resonate with Roma because of the similarities to the discriminatory targeting via ethnic profiling that Romani people regularly face:
We also explored how the “invisible” nature of biometric mass surveillance means that it can cause harm to any of us without us even knowing it has happened – meaning that minoritised groups can often be most harmed due to structural discrimination, but that any person looking to walk around in public, attend a demonstration or even go to the shops can in fact be targeted. Because these practices are becoming more and more prominent in almost every European country, we agreed that there is a strong need to stop biometric mass surveillance before it goes any further, and therefore to prevent vast future harm to Romani and non-Romani people alike.
Watch, read, listen, learn, reflect – and act!
One thing has been clear to us as we have had the opportunity to speak with and learn from Roxy, Benjamin, Franz and Sophie: in law and policy-making, it is vital that everyone who is subject to laws and policies has a voice in shaping and contributing their expertise to those laws and policies.
Romani experiences offer a critical and at times harrowing insight into why it’s so important that we resist biometric surveillance practices that differentiate between people based on their faces and bodies. If we want a Europe that is truly inclusive, it is important that we make sure that everybody has equal and equitable access to the opportunities and benefits of digitalisation, and that everyone is properly protected from the harm that can arise from the use of these technologies, too.
If you have found this blog interesting, we encourage you to inform yourself about Romani rights, the powerful work of Romani organisations and grassroots movements across Europe, and the issue of biometric mass surveillance with the following resources. If you’re an EU citizen, you can also sign our official EU petition to add your voice to the call to ban biometric mass surveillance – either on our homepage or even at the bottom of this page!
A note on terminology
There is no single type of “Romani” person. Throughout this article we use the terms “Roma” and “Sinti” as nouns to refer to specific groups, although the term “Roma” can also be used more broadly to refer to all Romani groups. We use “Romani” as an adjective to describe association to all groups in the Romani community. There are many different Romani groups across Europe, with often distinct dialects and cultures. To educate yourself, check out our recommendations below.
Read and Learn:
Explore the blog and academy that Roxy has co-founded, Romblog and Romblog Academy:
People outside LGBTQ+ venues, religious buildings, doctor’s surgeries and lawyers’ offices in the German city of Cologne may have had their faces illegally captured. The Dutch cities of Roermond, Eindhoven, Utrecht and Enschede have been turned into experimental “Living Labs”, capturing vast amounts of data about residents and apparently using it to profile them. The Polish COVID-19 quarantine app not only captured people’s faces without good reason, but this information may also have been abused by police to visit the homes of people no longer subject to quarantine rules. These are just some examples from a new report documenting evidence of biometric mass surveillance.
Today, the two most important regulators for ensuring people’s rights to privacy and data protection across the EU announced their formal call to ban biometric mass surveillance practices! This is a significant development for our campaign and adds even more weight to the pressure that has been exerted by over 55,000 Reclaim Your Face supporters already.
The European Data Protection Supervisor (EDPS) is the watchdog for keeping EU institutions in check when it comes to the use of people’s personal data. Separately, the European Data Protection Board (EDPB) brings together representatives of each national data protection authority (DPA). DPAs are the authorities who keep watch over personal data in their own country, and are empowered to issue significant fines in the event of abuses. They are made up of data protection, technology and human rights experts – so their views on the topic of biometric surveillance are crucial and authoritative!
EDPB & EDPS call for ban on use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination
21 June 2021
The Reclaim Your Face coalition is especially excited to see the EDPS and EDPB’s reasoning for wanting to ban biometric mass surveillance – or in their words, to “ban [the] use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination.”
In particular, the groups highlight that there are “extremely high risks” posed by the use of these technologies in publicly accessible spaces due to the potential to obliterate people’s fundamental right to stay anonymous and to unduly restrict a very wide range of rights and freedoms. The EDPS and EDPB advocate that a ban should be the starting point when it comes to public biometric uses – in contrast to the European Commission’s recent proposal, which simply does not yet go far enough to protect people and communities.
If you’re feeling like having a bit of fun, you can also use absurd comedy to make the very serious point about how facial recognition and other biometric surveillance in public spaces suppresses our rights and freedoms by joining the #PaperBagSociety challenge now!
Stay tuned for the deeper analysis by EDRi of the full EDPS and EDPB opinion, coming soon.
The #PaperBagSociety is a social media challenge, part of the #ReclaimYourFace campaign, that invites everyone to share online the impact of living life with a paper bag on their head. With it, we aim to raise awareness of how ridiculous it is that people should have to evade facial recognition technologies in public spaces, and why we need to build an alternative future, free from biometric mass surveillance.
In the past months, we’ve raised awareness of the dangers of biometric mass surveillance. Part of the process was also understanding how complex the systems that rely on biometric data are. We tried to find different ways to trick them, looking at facial recognition surveillance technologies deployed in our public spaces. The results are clear: as an individual, it is terribly difficult to trick biometric mass surveillance.
This is the reason why, at some point, one of the campaign organisers joked:
We ask: what would it be like to go about our daily lives with a paper bag on our heads? Do we need paper bags to protect our faces from creepy recognition technologies? Is this the society we want to live in? In a world that remains dominated by ableism, it could become challenging to love, to cross the street, or merely to interact.
Collectively, a #PaperBagSociety becomes a dystopian reality, a metaphor for the way biometric mass surveillance suppresses our choices, our speech and our freedoms. We realised this could be a great imagination exercise for anyone wanting to understand better why we need a world free from intrusive technologies that track our bodies and behaviour.
This is how the #PaperBagSociety challenge was born.
Using absurd comedy, this action aims to draw attention to why the heavy burden of avoiding creepy biometric surveillance technologies in public spaces should not fall on us, the people.
Instead, the action emphasises that an alternative future is possible. There are solutions to prevent a paper bag society: we must ban biometric mass surveillance across the EU and beyond.
Be part of the #PaperBagSociety challenge!
1. Go for a stroll in a publicly accessible space (a public square, a street, a train station, a supermarket, a cafe, a stadium, a shopping mall, etc.).
2. Put a paper bag on and try to live in the public space.
3. Take a video or a photo of the experience and share it on social media.
4. Make sure to tag #ReclaimYourFace & #PaperBagSociety and explain to your friends why we must ban biometric mass surveillance.
P.S. First and above all: make sure you don’t put yourself or others in danger. Keep it cool.
P.S.2: Are you lucky enough to be a citizen of an EU country? VOTE to BAN biometric mass surveillance!