The biometric face recognition service Clearview AI is illegal in Canada. This is the conclusion of the country's data protection authorities. The New York-based company Clearview AI has collected more than three billion facial images online and used them to train its face recognition algorithm, which it now offers as a commercial service. The company never even attempted to obtain consent from those affected.
“The vast majority of these people have never been and will never be implicated in any crime. What Clearview does is mass surveillance, and it is illegal,” Canadian Privacy Commissioner Daniel Therrien said last week. “It is an affront to individuals’ privacy rights and inflicts broad-based harm on all members of society, who find themselves continually in a police lineup. This is completely unacceptable.”
In cooperation with the data protection authorities of the Canadian provinces of Alberta, British Columbia and Quebec, Therrien investigated the matter. According to the final report, Clearview AI has violated Canadian federal law in three ways:
- Clearview AI collected and processed personal data without the consent of the individuals concerned;
- Clearview AI collects, processes and discloses personal data to third parties for an inappropriate purpose; and
- Clearview AI failed to report the creation of a database of biometric characteristics and measurements to the Canadian authorities.
- Additionally, Clearview AI failed to obtain the consent for biometric processing specifically required under Quebec law.
Since the purpose of the data processing itself is unacceptable, Clearview AI would not be permitted to pursue its business model even with the consent of the data subjects. Even the indiscriminate compilation of images by scraping public websites is impermissible. The authorities also have doubts about the efficiency and accuracy of facial recognition in general and Clearview’s in particular. Incorrect facial recognition results can have dire consequences for those affected. Moreover, the database could be compromised, exposing all of the data to third parties.
Clearview demands new guidelines so it can carry on
As part of the investigation, the four authorities asked Clearview to withdraw its facial recognition service from Canada, to stop collecting and processing images and facial biometric data of people in Canada, and to delete this data. Clearview suspended its service during the investigation, but did not delete the Canadian data.
The company denies both the risk to those affected from incorrect results or database breaches and any harm to the people registered in its database. Instead, Clearview is asking the Canadian data protection authorities to establish guidelines within two years that would enable the service to operate legally.
In addition, the American company attempted some horse trading: if the Canadian authorities kept their investigation report secret, Clearview would try to take steps to restrict the collection and distribution of images – but only for images it identifies as “Canadian”, and only on a best-effort basis, without legal obligation and without admitting any violation of the law. The Canadian authorities did not take the bait and released the report last week.
Clearview’s arguments go nowhere
Clearview, for its part, admits no wrongdoing. In its view, the Canadian authorities have no jurisdiction at all and the laws do not apply, because Clearview is not “in Canada” or “in Quebec”. Besides, all the images are public, and it supposedly does not matter that people posted their pictures online for entirely different purposes. On the whole, the company argues, its facial recognition service benefits the community.
With the publication of the investigation report, the four Canadian data protection authorities are formally calling on Clearview AI to stop collecting, processing and disclosing facial biometrics and patterns of people in Canada and to delete this data. Additionally, Clearview AI must permanently discontinue its facial recognition service in Canada. Should it refuse, the data protection authorities threaten further legal steps to compel lawful conduct.