New York-based Clearview AI’s practice of vacuuming up millions of images of Canadians and offering facial recognition services for customers constituted illegal “mass surveillance,” according to a probe by four privacy commissioners.
The joint federal, B.C., Alberta and Quebec investigation released Wednesday found that Clearview broke federal and provincial privacy laws when it gathered the faces of Canadians — including children — catalogued their biometrics and then offered them as a searchable database, all without the knowledge or consent of the people being searched.
The exact number of images of Canadians that were sucked into Clearview’s database is unknown, the commissioners said, because the company did not retain that information when collecting images. But they said it was a “relative certainty” that Clearview held “millions of images of individuals in Canada” in its pile of more than three billion faces worldwide.
The company marketed its service to law enforcement agencies in Canada. Media reports last year said the company had shared its technology with police in Toronto and Ottawa, among other cities. The RCMP was a “paying customer” at one point, and in total, 48 accounts were created for law enforcement and other organizations across Canada, the commissioners said.
“It is completely unacceptable for millions of people who will never be implicated in any crime to find themselves continually in a police lineup,” said federal privacy commissioner Daniel Therrien.
Michael McEvoy, information and privacy commissioner for British Columbia, said it was “unacceptable and deeply troubling that a company would create a giant database of our biometric data and sell it for profit without recognizing its invasive nature.”
Clearview’s technology extracts images of faces from publicly available sources, such as social media, and associates each image with biometric measurements. Users then upload an image of their own to see whether its biometrics match any of the faces in the database. Matches are linked back to the original source of each image.
The company stopped offering its services in Canada after the joint probe was launched, and subsequently dropped its Canadian clients. In July, Therrien’s office confirmed the RCMP’s contract with Clearview was being suspended indefinitely.
The commissioners want the company to go further and commit to ceasing the collection or use of images of Canadians, and to delete those images already in its possession.
Clearview said it would be willing to “take steps ... to try to limit the collection and distribution of the images that it is able to identify as Canadian.” But overall, the commissioners said the company “expressly disagreed with our conclusions.”
Clearview’s position is that because it uses publicly available information, it should be exempt from requirements surrounding the gaining of consent.
The company said it only collects images from web pages that are viewable by members of the public, and avoids those that are behind social media privacy settings, or from websites that instruct search engines not to scan their pages.
Clearview AI lawyer Doug Mitchell reiterated in a statement that the technology was “not available in Canada” and the company does not operate in Canada.
“Clearview AI only collects public information from the Internet which is explicitly permitted under PIPEDA (Canada’s Personal Information Protection and Electronic Documents Act),” he told the media.
The commissioners — Therrien, McEvoy, Alberta privacy commissioner Jill Clayton, and Diane Poitras, president of the Commission d'accès à l'information du Québec — disagreed with that assessment.
They said information that is collected from public websites, like social media profiles, “and then used for an unrelated purpose, does not fall under the 'publicly available' exception of PIPEDA,” nor does it fall under similar exemptions in Alberta, B.C. or Quebec law.
“The company essentially claims that individuals who placed or permitted their images to be placed on the Internet lacked a reasonable expectation of privacy in such images, that the information was publicly available, and that the company’s appropriate business interests and freedom of expression should prevail,” said Therrien.
“My colleagues and I think these arguments must be rejected.”
Clearview also argued that it was providing a benefit to public safety by offering a new tool to police to identify victims, witnesses and suspects.
The company touts success stories related to murders or child sexual exploitation. It pointed out that its terms of service only allow for “legitimate law enforcement” use.
But the commissioners said Clearview’s actions would create a “risk of significant harm” to individuals, including through misidentification or exposure to data breaches.
Carl Meyer / Local Journalism Initiative / Canada’s National Observer