The faces of millions of people have been collected and widely disseminated, feeding the identification databases of AI systems.
In 2020, Matthias Marx, a security researcher at Security Research Labs, stumbled upon Clearview AI, a company that scrapes and analyzes billions of images from the internet to build a database of faces. Clearview lets its business clients use facial recognition technology to find images containing any specified person’s face.
A Database Surpassing All Social Media
Curious if his own face was included in the company’s database, Marx emailed Clearview for clarification. A month later, he received a response from the company along with two screenshots.
One of the images was a photo of Marx taken at a Google competition nearly a decade earlier. The photo had been uploaded to the stock-photo site Alamy by an unknown photographer. The problem was that Marx had never agreed to sell this image to anyone.
“I had no idea what others would do with user data,” Marx said. He believes Clearview violated the General Data Protection Regulation (GDPR) of the European Union (EU) by using his biometric information without his consent.
Consequently, in 2020, he filed a lawsuit against Clearview, but to date he has not received a final ruling. “It has been two and a half years, and this lawsuit is still ongoing. That pace is far too slow,” the security researcher complained.
Clearview has processed more than 100 billion images, enriching its vast database. (Photo: AP).
Marx then discovered that images of his face were circulating even more widely. When he searched for his face on another recognition platform, PimEyes, the results were even more abundant than Clearview’s, including images from 2014 and from private, politically themed events.
As of March this year, Marx found four images of himself on the facial recognition tool Public Mirror. This site even attached links to personal information about Marx and events he attended in the photos.
According to the security researcher, each platform revealed a different slice of his private life. Together they expose a frightening industry that holds data far exceeding that of all social media combined.
“Monetizing” User Faces
According to Wired, millions of European faces have appeared unlawfully in the search engines of tech companies like Clearview. The region has the strictest privacy protection laws in the world, yet many legal “blind spots” remain that regulators, such as the data protection authority in Hamburg, Germany, have yet to close.
Matthias Marx is not alone: many other users have filed complaints over such data abuses. In October, the French data protection authority (CNIL) fined Clearview AI 20 million euros (19.6 million USD) for illegally processing personal data and ordered the company to stop. After Clearview, however, a slew of companies that “monetize” user faces have sprung up like mushrooms.
As a security expert, Marx believes that Clearview will never be able to permanently delete the facial data it has collected. He argues that the core technology of Clearview continuously hunts for faces that have appeared on the internet, meaning that his images will inevitably be used again. “This will happen again if my face appears somewhere on the internet. Clearview’s algorithm will never stop,” Marx stated.
Speaking to investors, Clearview claimed that it had processed over 100 billion images this year, enough for roughly 14 images of every person on Earth.
Exploiting Legal “Blind Spots”
According to Wired, Clearview’s methods of operation are highly sophisticated. CEO Hoan Ton-That said the company uses automated bots to crawl the internet for faces and stores them in its own database, so the collected data is not publicly accessible on any platform.
“From images alone, no one can know where the owner of the face comes from. Clearview commits to collecting only publicly available information on the internet, through sources such as Google or Bing,” he said.
However, searching by a person’s name and searching by facial images, as Clearview does, are vastly different.
“Names are not unique identifiers and users can easily conceal them. But with faces, it’s different; users cannot hide their identity once their face is exposed,” Lucie Audibert, a lawyer at Privacy International, commented.
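The distinction Audibert draws can be illustrated schematically: face search engines typically reduce each photo to a numerical embedding and match faces by vector distance, so the same person is found across photos regardless of the name or caption attached. Below is a minimal, hypothetical sketch of this idea; the tiny 3-D vectors, filenames, and similarity threshold are invented for illustration, and Clearview’s actual pipeline is proprietary.

```python
import math

# Hypothetical face embeddings: each photo is reduced to a vector.
# Real systems use deep networks producing 128-512 dimensional vectors;
# these 3-D vectors are invented purely for illustration.
database = {
    "photo_google_event_2014.jpg": [0.91, 0.12, 0.40],
    "photo_conference_2020.jpg":   [0.89, 0.15, 0.42],  # same person, different photo
    "photo_other_person.jpg":      [0.10, 0.95, 0.30],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def search(query_embedding, threshold=0.99):
    """Return all stored photos whose embedding is close enough to the query."""
    return [name for name, emb in database.items()
            if cosine_similarity(query_embedding, emb) >= threshold]

# A query embedding of the first person's face matches both of their
# photos -- no name is needed, which is why a face, unlike a name,
# cannot be concealed once it is in such a database.
matches = search([0.90, 0.13, 0.41])
```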
Clearview is a tool that allows searching for a person’s face on the internet based on any given image. (Photo: Shutterstock).
As a result, concerns about facial recognition search tools have erupted across Europe, prompting lawmakers to move to ban such practices. Disagreeing, CEO Ton-That argued that Clearview need not comply with the GDPR, since it has no customers or offices in the EU.
“It is very difficult to enforce EU law against a U.S. company that has no ties to Europe,” said Felix Mikolasch, a lawyer at NOYB who previously represented Marx in his case against Clearview. This is precisely the legal “blind spot”: the law cannot deter such technologies or force them to stop illegal data collection, because the companies can simply “cut ties” with Europe when pressured.