As facial recognition technology (FRT) becomes more ubiquitous, fears of its unregulated use increase. The concern is justified. If companies and law enforcement are left to use FRT without rules, civil liberties are more likely to be violated. Some feel we are already in the midst of that dilemma.
Clearview AI is under the gun once again as privacy watchdogs in Europe file legal complaints against its data collection practices. On Thursday, Privacy International (PI), Noyb, and others asked regulators in the UK, France, Austria, Italy, and Greece to halt Clearview's scraping of facial recognition data from social media sites like Instagram and Facebook. They say these practices "have no place in Europe."
The beleaguered company has faced pushback from virtually every major social media platform. This is the second time regulators have sought action in the UK: last year, authorities in the UK and Australia opened a joint investigation into Clearview's data collection practices.
"Extracting our unique facial features or even sharing them with the police and other companies goes far beyond what we could ever expect as online users," said PI Legal Officer Ioannis Kouvakas.
Since being exposed by The New York Times, Clearview AI has been transparent about what it does. Co-founder and CEO Hoan Ton-That has maintained from the start that all the images the software scrapes are publicly available and that he is protected under the First Amendment. That protection, even if applicable, does not extend beyond US borders, but the company claims it does not do business in Europe.
"[Clearview AI] has helped thousands of law enforcement agencies across America save children from sexual predators, protect the elderly from financial criminals, and keep communities safe," the company said in a statement. "[Clearview] has never had any contracts with any EU customer and is not currently available to EU customers."
However, Bloomberg notes that Sweden's data regulator fined the country's law enforcement agency for using Clearview's FRT. "[Police] unlawfully processed biometric data for facial recognition [and failed to do] a data protection impact assessment." So while Clearview maintains it has no overseas contracts, its software is apparently still being used in and around Europe.
Image credit: Clearview AI Dark by Ascannio, Clearview App by The New York Times