3 November 2021

Australian Information Commissioner and Privacy Commissioner Angelene Falk has found that Clearview AI, Inc. breached Australians’ privacy by scraping their biometric information from the web and disclosing it through a facial recognition tool.

The determination follows a joint investigation by the Office of the Australian Information Commissioner (OAIC) and the UK’s Information Commissioner’s Office (ICO).

Commissioner Falk found that Clearview AI breached the Australian Privacy Act 1988 by:

  • collecting Australians’ sensitive information without consent
  • collecting personal information by unfair means
  • not taking reasonable steps to notify individuals of the collection of personal information
  • not taking reasonable steps to ensure that personal information it disclosed was accurate, having regard to the purpose of disclosure
  • not taking reasonable steps to implement practices, procedures and systems to ensure compliance with the Australian Privacy Principles.

The determination orders Clearview AI to cease collecting facial images and biometric templates from individuals in Australia, and to destroy existing images and templates collected from Australia.

Clearview AI’s facial recognition tool includes a database of more than three billion images taken from social media platforms and other publicly available websites. The tool allows users to upload a photo of an individual’s face and find other facial images of that person collected from the internet. It then links each match to the webpage where the photo appeared, for identification purposes.
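For readers unfamiliar with how tools of this kind typically operate, the sketch below illustrates a generic embedding-based face search: each scraped photo is reduced to a fixed-length numeric vector (a "biometric template"), and an uploaded query photo is matched against those templates by similarity, returning the source URLs of the closest matches. This is a minimal, assumed illustration using random placeholder vectors and example.com URLs; it is not a description of Clearview AI's actual system, whose design has not been published in this determination.

    import numpy as np

    # Placeholder "database" of biometric templates for scraped photos.
    # Each row is a 128-dimensional embedding; the data here is random,
    # and the URLs are illustrative stand-ins, not real sources.
    rng = np.random.default_rng(0)
    database_embeddings = rng.normal(size=(1000, 128))
    database_urls = [f"https://example.com/photo/{i}" for i in range(1000)]

    def normalise(vectors: np.ndarray) -> np.ndarray:
        """Scale embeddings to unit length so dot products equal cosine similarity."""
        return vectors / np.linalg.norm(vectors, axis=-1, keepdims=True)

    def search(query_embedding: np.ndarray, top_k: int = 5) -> list[tuple[str, float]]:
        """Return the source URLs of the top_k most similar stored templates."""
        db = normalise(database_embeddings)
        q = normalise(query_embedding)
        scores = db @ q                          # cosine similarity against every template
        best = np.argsort(scores)[::-1][:top_k]  # indices of the highest-scoring matches
        return [(database_urls[i], float(scores[i])) for i in best]

    # In practice the query would be the embedding of an uploaded photo;
    # here it is a random vector for demonstration only.
    print(search(rng.normal(size=128)))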

The OAIC determination highlights the lack of transparency around Clearview AI’s collection practices, the monetisation of individuals’ data for a purpose entirely outside reasonable expectations, and the risk of harm to people whose images are included in its database.

“The covert collection of this kind of sensitive information is unreasonably intrusive and unfair,” Commissioner Falk said.

“It carries significant risk of harm to individuals, including vulnerable groups such as children and victims of crime, whose images can be searched on Clearview AI’s database.

“By its nature, this biometric identity information cannot be reissued or cancelled and may also be replicated and used for identity theft. Individuals featured in the database may also be at risk of misidentification.

“These practices fall well short of Australians’ expectations for the protection of their personal information.”

Commissioner Falk found the privacy impacts of Clearview AI’s biometric system were not necessary, legitimate and proportionate, having regard to any public interest benefits.

“When Australians use social media or professional networking sites, they don’t expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes,” she said.

“The indiscriminate scraping of people’s facial images, only a fraction of whom would ever be connected with law enforcement investigations, may adversely impact the personal freedoms of all Australians who perceive themselves to be under surveillance.”

Between October 2019 and March 2020, Clearview AI provided trials of the facial recognition tool to some Australian police forces, which conducted searches using facial images of individuals located in Australia.

The OAIC is currently finalising an investigation into the Australian Federal Police’s trial use of the technology and whether it complied with requirements under the Australian Government Agencies Privacy Code to assess and mitigate privacy risks.

Clearview AI argued that the information it handled was not personal information and that, as a company based in the US, it was not within the Privacy Act’s jurisdiction. Clearview also claimed it stopped offering its services to Australian law enforcement shortly after the OAIC’s investigation began.

However, Commissioner Falk said she was satisfied Clearview AI was required to comply with Australian privacy law and that the information it handled was personal information covered by the Privacy Act.

“Clearview AI’s activities in Australia involve the automated and repetitious collection of sensitive biometric information from Australians on a large scale, for profit. These transactions are fundamental to their commercial enterprise,” she said.

“The company’s patent application also demonstrates the capability of the technology to be used for other purposes such as dating, retail, dispensing social benefits, and granting or denying access to a facility, venue or device.

“This case reinforces the need to strengthen protections through the current review of the Privacy Act, including restricting or prohibiting practices such as data scraping personal information from online platforms.

“It also raises questions about whether online platforms are doing enough to prevent and detect scraping of personal information.”

The OAIC has released its findings today, while the ICO is considering its next steps and any formal regulatory action that may be appropriate under the UK’s data protection laws.

The full determination can be found on the OAIC website.