Clearview AI has scraped 10 billion photos from the web

Clearview appears to have ignored demands from social media companies to stop taking images from their platforms

Clearview AI has scraped over 10 billion photographs from people's social media profiles on Facebook, Instagram, Twitter and other platforms for its controversial facial-recognition tool.

Clearview CEO Hoan Ton-That told Wired that a larger database of photos means its customers, often law enforcement agencies, are more likely to find people of interest. The bigger data set also makes the firm's tool more accurate.

The statement suggests that the firm has completely ignored requests from Facebook, Twitter, and Google last year to stop downloading photographs.

Twitter, Facebook and YouTube have all threatened legal action against Clearview if it doesn't stop the practice. The company also faces multiple privacy lawsuits in the US and elsewhere.

Ton-That said the firm only operates in the US, and does not want its tool to be abused.

"We're focusing on the United States, because we want to get it right here. We never want this to be abused in any way."

Clearview has come under intense scrutiny in several countries over the privacy implications of its software in recent years.

The company's AI tool enables customers to run facial recognition searches and identify persons of interest. A customer submits a person's photo, and the system uses facial recognition to search for that person in its database. If successful, it returns details such as the individual's name and social media handles.
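In general terms (this is an illustrative sketch, not Clearview's actual code), search tools of this kind work by converting each face photo into a numeric "embedding" vector and then finding the database entry whose embedding is closest to the query's, typically by cosine similarity:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_embedding, database, threshold=0.9):
    """Return the identity whose stored embedding best matches the
    query, or None if no match clears the similarity threshold."""
    best_id, best_score = None, threshold
    for identity, embedding in database.items():
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Toy database of pre-computed face embeddings (hypothetical values;
# real embeddings have hundreds of dimensions and come from a trained
# neural network, not hand-written numbers).
db = {
    "alice": [0.9, 0.1, 0.2],
    "bob":   [0.1, 0.8, 0.5],
}

print(search([0.88, 0.12, 0.21], db))  # closest to "alice"
```

The similarity threshold is the key tuning knob here: set it too low and the system returns false matches, which is one reason critics insist on human review of results.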

Ton-That told Wired that his engineers are working to add a pair of new features that will make blurry images sharper and depict the full faces of people wearing masks.

MIT Professor Aleksander Madry, who specialises in machine learning technology, told Wired that he suspects the accuracy of Clearview AI's new tool will be poor, and will introduce unintended bias.

Unmasking people's images also raises ethical issues, according to Madry.

"Think of people who masked themselves to take part in a peaceful protest or were blurred to protect their privacy," he said.

Ton-That claims, however, that tests have shown the new features improve the accuracy of Clearview's technology.

"Any enhanced images should be noted as such, and extra care taken when evaluating results that may result from an enhanced image," he said.

"My intention with this technology is always to have it under human control. When AI gets it wrong it is checked by a person."

The Clearview boss added that nearly 3,100 government customers worldwide now use the company's face-matching service.

In August, BuzzFeed News said that it reviewed Clearview's internal data and found that 88 government agencies in 24 countries (not including the USA) had used the firm's facial recognition system, as of February 2020.

The agencies included police departments, interior ministries, prosecutors' offices and universities.

The report said that Clearview offered its technology on a try-before-you-buy basis in Australia, Brazil, Saudi Arabia, the UK and elsewhere.

Earlier this year, Canada's data privacy commissioner concluded that Clearview had 'violated federal and provincial privacy laws' by collecting images of Canadians. It told the firm to stop offering its services to Canadian clients, stop collecting images of Canadian people, and to delete all related images and biometrics.

European Union authorities are also assessing whether the use of Clearview software violates GDPR rules.

Last week, the European Parliament voted to support a total ban on law enforcement agencies' use of AI and facial recognition systems for mass public surveillance. Members of the European Parliament pointed to the risk of algorithmic bias in AI applications, and emphasised that human supervision is needed to prevent AI discrimination, especially in a law enforcement or border control context.