Law Street Media

Facebook, YouTube Request Clearview AI to Stop Scraping Their Platforms


Facebook and YouTube have demanded that Clearview AI stop scraping images from their platforms, which the company uses to find potential matches for law enforcement and other clients. Clearview has scraped more than three billion images to build “a massive international universal face recognition database,” which it markets to law enforcement agencies; it currently works with about 600 such agencies.

Facebook and YouTube have asked Clearview AI not to use their platforms to source photographs for the database, claiming the practice violates their policies and Terms of Service. “YouTube’s Terms of Service explicitly forbid collecting data that can be used to identify a person. Clearview has publicly admitted to doing exactly that, and in response we sent them a cease and desist letter,” said Alex Joseph, a spokesperson for YouTube. The requests come after Twitter asked Clearview to stop, claiming the company violated Twitter’s policies, and demanded that it delete the collected data. Other platforms have followed suit: Facebook asked Clearview to stop scraping Facebook and Instagram, Microsoft stated it must stop scraping LinkedIn, and Venmo told Clearview to stop scraping its platform.

After Clearview began advertising to and partnering with law enforcement in 2017 and 2018, it garnered significant attention and investment. Law enforcement agencies have used Clearview to solve “shoplifting, identity theft, credit card fraud, murder and child sexual exploitation cases.” Clearview has heavily marketed the New York Police Department’s use of its platform, but the NYPD has vehemently denied using it to solve cases.

Images collected by Clearview, especially those of children, could be highly sensitive, and the company’s security policies are unclear. By default, images are retained indefinitely, though users can adjust their settings to limit retention to 30 days. Sen. Ed Markey (D-Mass.) sent the company a list of questions about its platform and service, stating, “[a]ny technology with the ability to collect and analyze individuals’ biometric information has alarming potential to impinge on the public’s civil liberties and privacy. Clearview’s product appears to pose particularly chilling privacy risks, and I am deeply concerned that it is capable of fundamentally dismantling Americans’ expectation that they can move, assemble, or simply appear in public without being identified.” Sen. Ron Wyden (D-Ore.) also found the company troubling and tweeted, “This story reads like one of the more disturbing episodes of Black Mirror… Americans have a right to know whether their personal photos are secretly being sucked into a private facial recognition database.”

In response, Hoan Ton-That, Clearview’s founder and CEO, stated, “[t]he way we have built our system is to only take publicly available information and index it that way… Google can pull in information from all different websites… So if it’s public and it’s out there and could be inside Google search engine, it can be inside ours as well.”

Clearview AI was previously sued in a class action complaint by an individual alleging violations of his rights under Illinois’ Biometric Information Privacy Act (BIPA). Another suit was filed last week in the Eastern District of Virginia. New Jersey’s Attorney General has issued a cease and desist prohibiting Clearview from advertising or implying a relationship with state agencies. The Electronic Privacy Information Center (EPIC) is leading 40 other privacy and civil rights groups in asking Congress to get involved.
