ACLU is suing Clearview AI, calling the tool an “unprecedented violation” of privacy rights

The ACLU alleges that Clearview’s technology violates the 2008 Illinois Biometric Information Privacy Act, according to the complaint, filed Thursday in the Circuit Court of Cook County, Illinois. The ACLU alleged in a statement that the company was engaged in “surveillance activities that violate the law and destroy privacy.”

The ACLU said in its complaint that it filed the lawsuit “to put a stop to the covert capture and storage of millions of Illinoisans’ sensitive biometric identifiers.” Several other nonprofit organizations, including the Chicago Alliance Against Sexual Exploitation and the Chicago Sex Workers Outreach Project, have also signed on to the lawsuit.

Clearview rejected the ACLU’s complaint as “unreasonable” when asked for comment. According to its website, Clearview’s service “has been independently tested for accuracy and evaluated for legal compliance by nationally recognized authorities.”

“Clearview AI is a search engine that only uses publicly available images that can be accessed on the internet,” Clearview AI’s lawyer, Tor Ekeland, told CNN Business in an emailed statement. “It is absurd that the ACLU wants to censor which search engines people can use to access public information on the internet. The First Amendment forbids this.”

Clearview AI’s founder, Hoan Ton-That, has described the technology as “basically a search engine for faces.”

The tool scrapes billions of publicly available images from social media sites and other places on the internet, and uses facial recognition software to make the database searchable. Starting from a single photo of a person, the Clearview database can identify other photos of that person from across the internet and link back to their original sources, which can help identify unknown people in images.

Clearview emphasizes that its service is not for public use, but is an investigative tool sold to law enforcement agencies to help identify suspects and solve crimes. The company said its database is used by more than 600 law enforcement agencies in the United States and Canada.

But the company has come under fire in recent months following a front-page investigation by The New York Times in January.

Many people may not realize, when posting photos of themselves, even publicly, that those photos can be swept into a massive database and used by law enforcement.

If someone posts an image to a public Instagram page, for example, Clearview’s technology can capture it, and even if that person later makes the page private or deletes the photo altogether, the image will still remain in Clearview’s database. The tool can also scrape photos of someone that were posted by others, without that person’s knowledge.
Google, Facebook, and other technology companies have sent cease-and-desist letters to Clearview, saying the tool violates their terms of service. Clearview has said it will address the tech companies’ concerns, but it has also pushed back, arguing that it has a First Amendment right to public information.
In February, Clearview said that a hacker had gained access to its entire client list, which includes police forces, law enforcement agencies, and banks.
In January, New Jersey imposed a statewide restriction on law enforcement use of Clearview while it looks into the software. The Vermont Attorney General has also filed a lawsuit against Clearview over alleged data privacy violations.

“Clearview AI is one of the most innovative, effective and accurate law enforcement tools on the market,” Ekeland, the company’s lawyer, said in an email. “Not only does it protect victims by helping law enforcement arrest child rapists, murderers and thieves, its accuracy protects the innocent from false accusations – all by only using public images that are available to everyone on the public internet.”

Ekeland added that Clearview operates “in strict accordance with the US Constitution and American law.” He said Clearview works much like other search engines, and claimed it collects less data than some other online companies.

He said that “Clearview AI only collects public images and their web addresses. That’s all.”

“We will welcome the opportunity to work collaboratively with the State of Vermont – outside the hostile environment of the courtroom – to further refine our proven technology and solve crimes for the benefit of everyone,” Ekeland said.

The ACLU lawsuit accuses the tool of violating the privacy rights of Illinois residents. The lawsuit takes issue not only with Clearview’s image collection but also with the company’s alleged use of those images to extract biometric data, which the lawsuit refers to as “faceprints.”

“A faceprint, like a thumbprint or DNA profile, is a biometric identifier that is used to distinguish or verify an individual’s identity,” the complaint said. “Like other biometrics, faceprints rely on immutable individual biological characteristics – from the distance between one’s eyes and the shape of one’s cheekbones to the pattern of spots on one’s forehead – to capture their biometric signatures.”

Because people cannot change their faces or hide them in public, the ACLU alleges that capturing and storing images of individuals’ faces with AI leaves people vulnerable to risks such as identity theft, data breaches, and surveillance “by making it possible to instantly identify everyone at a protest or political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more,” according to the complaint.

The Illinois Biometric Information Privacy Act, cited in the lawsuit, states that private entities may not collect an individual’s biometric identifiers (including scans of “face geometry”) unless they first inform the individual that the identifiers are being collected, why, and for how long, and obtain a “written release” from the individual in response. The law also stipulates that private entities may not “sell, lease, trade, or otherwise profit from” an individual’s biometric identifiers, among other requirements.

“In capturing billions of these faceprints and continuing to store them in a massive database, Clearview has failed, and continues to fail, to take the basic steps needed to ensure that its conduct is lawful,” the ACLU alleged in the complaint.

The lawsuit seeks to have Clearview “destroy all biometric identifiers” in its possession that were allegedly collected in violation of the state’s Biometric Information Privacy Act, and to take steps to comply with the law, in addition to unspecified further relief.

Clearview’s Ton-That told CNN in a February interview that he is not opposed to rules governing the technology.

“We want to work with the government to create something safe,” said Ton-That, adding that the company had met with legislators, although he declined to say with whom.

He also said the company had taken steps to ensure its technology is not used to wrongly identify someone as a criminal, a concern given research findings that some other artificial intelligence systems can exhibit racial bias.

“We don’t want that at all,” said Ton-That. “The way it is currently used in law enforcement agencies in the US is to ensure it serves only as a lead.”

– Donie O’Sullivan contributed to this report
