How he reclaimed his face from Clearview AI
The Hamburg Data Protection Authority deemed Clearview AI’s biometric photo database illegal in the EU, following a complaint filed by Matthias Marx, a member of the Chaos Computer Club (an EDRi member).
By ReclaimYourFace campaign lead organisation Chaos Computer Club (CCC)
Originally published by noyb here.
In January 2020, two days after the New York Times revealed the existence of the face search engine Clearview AI, Matthias Marx, a member of the Chaos Computer Club (an EDRi member), sent a data subject access request to Clearview AI. He was surprised to learn that the company was processing his biometric data and lodged a complaint with the Hamburg data protection authority (DPA). As a result, the Hamburg DPA has now deemed Clearview AI’s biometric photo database illegal in the EU.
To facilitate the data subject access request, Matthias shared a photo of his face. To confirm Matthias’ identity and guard against fraudulent access requests, Clearview AI additionally requested a government-issued ID. Although Matthias ignored that request, Clearview AI sent him search results based on the photo he had provided and confirmed deletion of the search photo in February 2020.
Matthias then electronically submitted a complaint to the Hamburg DPA that Clearview AI was processing his biometric data without consent. The DPA first rejected the complaint, arguing that the GDPR was not applicable. After further submissions, the DPA eventually launched preliminary investigations. At the same time, noyb offered its support.
In May 2020, Clearview AI sent another, unsolicited, response to Matthias’ request and included new search results. Apparently, Clearview AI had not deleted the search photo as promised. While Clearview AI’s first answer only showed two photos of Matthias, this time it also contained eight photos of other people.
In August 2020, the Hamburg DPA ordered Clearview AI to answer a set of 17 questions under threat of penalties. Clearview AI replied in September 2020. In January 2021, the Hamburg DPA initiated administrative proceedings against Clearview AI.
The decision acknowledges the territorial scope of the GDPR, which is triggered for entities outside the EU if they monitor the behaviour of data subjects in the EU. Clearview AI had argued against the applicability of the GDPR, saying that they do not monitor the behaviour of individuals but provide only a “snapshot of some photos available on the internet”.
The Hamburg DPA rejected this argument for two reasons. First, Clearview AI’s results include information that stretches over a period of time. Second, Clearview AI’s database links photos with their source and associated data. As such, it records information in a targeted manner – the definition of monitoring. Moreover, the Hamburg DPA noted that the subsequent use of collected personal data for profiling purposes, as happens with Clearview AI’s results, can be seen as a strong indicator of the existence of monitoring.
Taking into account subsequent use is important because it underlines that downstream processing by other entities can, to a certain extent, be used to classify the nature of the upstream processing. In other words, entities cannot fully launder their data processing by handing off the dirty work downstream.
Despite clearly stating that Clearview AI lacked a legal basis for its biometric profiling, the Hamburg DPA unfortunately only ordered the deletion of the complainant’s biometric profile – it neither ordered the deletion of the complainant’s photos already collected, nor did it issue an EU-wide ban on Clearview AI’s processing. noyb had submitted arguments on why the Hamburg DPA could issue an EU-wide ban against Clearview.
In conclusion, this decision is only a first step, and further litigation is necessary. While Europeans now have a precedent to rely on, we need decisions that also declare illegal the harvesting of photos for purposes wholly incompatible with their initial publication.