The ACLU said in its complaint that it filed the lawsuit “to stop the surreptitious capture and storage of millions of Illinoisans’ sensitive biometric identifiers.” Several other nonprofit organizations, including the Chicago Alliance Against Sexual Exploitation and the Chicago Sex Worker Outreach Project, have also joined the suit as plaintiffs.
“Clearview AI is a search engine that only uses publicly available images that can be accessed on the internet,” Clearview AI’s lawyer, Tor Ekeland, told CNN Business in an emailed statement. “It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment prohibits this.”
The tool scrapes billions of publicly available images from social media sites and elsewhere on the internet, then uses facial recognition software to make the database searchable. Given a single photo of a person, the Clearview database can surface other photos of that person from across the internet and link back to their original sources, which can help identify unknown people in images.
Clearview emphasizes that the service is not for public use but is an investigative tool sold to law enforcement, used to help identify suspects and solve crimes. The company has said its database is used by more than 600 law enforcement agencies in the United States and Canada.
Many people may not realize when posting photos of themselves – even when they post them publicly – that the images can be swept into a massive database and used by law enforcement.
“Clearview AI is one of the most innovative, effective and accurate law enforcement tools on the market,” Ekeland, the company’s lawyer, said in an email. “Not only does it protect victims by helping law enforcement arrest child rapists, murderers and thieves, its accuracy protects the innocent from false accusations – all by only using public images that are available to everyone on the public internet.”
Ekeland added that Clearview operates “in strict accordance with the US Constitution and American law.” He said Clearview works similarly to other search engines, and he claims it collects less data than some other online companies.
He said that “Clearview AI only collects public images and their web addresses. That’s all.”
“We would welcome the opportunity to work collaboratively with the State of Vermont – outside the hostile environment of the courtroom – to further refine our proven technology and solve crimes for the benefit of everyone,” Ekeland said.
The ACLU lawsuit accuses the company of violating the privacy rights of Illinois residents. The lawsuit takes issue not only with Clearview’s image collection but also with the company’s alleged use of those images to extract biometric data, which the lawsuit refers to as “faceprints.”
“A faceprint, like a thumbprint or DNA profile, is a biometric identifier that is used to distinguish or verify an individual’s identity,” the complaint said. “Like other biometrics, faceprints rely on individual biological characteristics that cannot be changed – from the distance between one’s eyes and the shape of one’s cheekbones to the pattern of spots on one’s forehead – to capture a biometric signature.”
Because people cannot change their faces or hide them in public, the ACLU alleges that capturing and storing faceprints leaves people vulnerable to risks such as identity theft, data breaches, and surveillance “by making it possible to instantly identify everyone at a protest or political rally, a synagogue, a domestic violence shelter, an Alcoholics Anonymous meeting, and more,” according to the complaint.
“In capturing billions of these faceprints and continuing to store them in a massive database, Clearview has failed, and continues to fail, to take the basic steps needed to ensure that its conduct is lawful,” the ACLU alleged in the complaint.
The lawsuit seeks to have Clearview “destroy all biometric identifiers” in its possession that are alleged to violate the state’s Biometric Information Privacy Act, and to take steps to comply with the law, in addition to unspecified further relief.
“We want to work with the government to create something safe,” said Clearview CEO Hoan Ton-That, adding that the company had met with legislators, though he declined to say which ones.
“We don’t want that at all,” Ton-That said. “The way it is currently used by all law enforcement agencies in the US, we make sure it is only an investigative lead.”
– Donie O’Sullivan contributed to this report