Clearview AI faces possible £17m fine for violating Britain's privacy laws

The ICO has issued a provisional notice to the company to stop further processing of the personal data of people in the UK

Controversial facial recognition firm Clearview AI faces a potential fine of about £17 million from the UK regulator for failing to comply with the nation's data protection laws.

On Monday, the Information Commissioner's Office (ICO) announced its provisional intent to impose the fine on the firm, stating that the company might have collected data on "a substantial number" of British people from publicly available information online, including social media platforms, without their knowledge.

The ICO said it has also handed the firm a provisional notice to stop further processing of UK citizens' personal data and to delete any data it already holds on them.

The ICO's announcement on Clearview follows a joint probe by the British regulator and the Office of the Australian Information Commissioner (OAIC), which focused on the company's use of images from the internet and the use of biometrics for facial recognition.

Earlier this month, the OAIC ordered the firm to delete data after concluding that it violated Australian data protection laws.

Clearview AI is said to have scraped over 10 billion photographs from people's social media profiles on Facebook, Instagram, Twitter and other platforms for its facial-recognition tool.

Last month, Clearview CEO Hoan Ton-That told Wired that a larger database of images means its customers, often law enforcement agencies, are more likely to find people of interest. The bigger data set also makes the firm's tool more accurate.

The company's AI tool enables customers to run facial recognition searches to identify persons of interest. Customers submit a person's picture and the system attempts to match it against the database using facial recognition; if successful, it returns details such as the individual's name and social media handles.
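For context, the matching step described here, comparing a submitted photo against a large gallery of scraped images, is typically done by comparing face embeddings. The sketch below is a generic illustration, not Clearview's implementation: the 128-dimensional embeddings, the gallery size, the metadata fields and the similarity threshold are all illustrative assumptions, and a real system would derive the query vector from a face-embedding model and search billions of records with an approximate nearest-neighbour index rather than a brute-force scan.

```python
# Illustrative sketch of embedding-based face search (not Clearview's code).
import numpy as np

# Hypothetical gallery: one pre-computed embedding per indexed photo,
# stored alongside the public-profile details collected with it.
gallery_embeddings = np.random.rand(1000, 128).astype(np.float32)
gallery_metadata = [
    {"name": f"person_{i}", "profile_url": f"https://example.com/user/{i}"}
    for i in range(1000)
]

def search(query_embedding, top_k=5, threshold=0.9):
    """Return metadata for the closest gallery faces above a similarity threshold."""
    # Normalise so a dot product equals cosine similarity.
    gallery = gallery_embeddings / np.linalg.norm(gallery_embeddings, axis=1, keepdims=True)
    query = query_embedding / np.linalg.norm(query_embedding)
    similarities = gallery @ query
    best = np.argsort(similarities)[::-1][:top_k]
    return [
        {**gallery_metadata[i], "score": float(similarities[i])}
        for i in best
        if similarities[i] >= threshold
    ]

# Usage: in practice the query embedding would come from a face-embedding model
# applied to the submitted photo; here a known gallery vector is perturbed
# slightly so the search has something to find.
query = (gallery_embeddings[42] + np.random.normal(0, 0.01, 128)).astype(np.float32)
print(search(query))
```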

The ICO said it understands that the facial recognition service provided by Clearview was used by several law enforcement agencies in the UK, on a free trial basis, but that the "trial was discontinued and Clearview AI Inc's services are no longer being offered in the UK".

The regulator believes that Clearview does not have "a lawful reason for collecting the information" and has also failed to meet the higher data protection standards required for biometric data, which is classified as 'special category data' under the GDPR and UK GDPR.

The ICO said Clearview will have the opportunity to make representations in respect of the alleged breaches, which will be carefully considered before any final decision is made by the regulator.

The UK Information Commissioner, Elizabeth Denham, said she was concerned that Clearview processed the personal data of British people in a way that "nobody in the UK will have expected".

"It is therefore only right that the ICO alerts people to the scale of this potential breach and the proposed action we ' re taking."

Clearview is facing scrutiny in other countries as well over the privacy implications of its software.

Earlier this year, Canada's data privacy commissioner concluded that the firm had 'violated federal and provincial privacy laws' by collecting images of Canadians.

It told the firm to stop offering its services to Canadian clients, to stop collecting images of Canadian people, and to delete all related images and biometrics.

European Union authorities are also assessing whether the use of Clearview software violates GDPR rules.