Parents Sue TikTok Over Biometrics-Based Targeting

Parents sued social media app TikTok in the Northern District of Illinois Friday in a class action complaint alleging the app surreptitiously collected their minor children's data and biometrics without notifying them or obtaining their consent; it is one of many suits against the company alleging similar violations. The plaintiffs sued TikTok and its parent company, ByteDance, claiming that TikTok has violated the Illinois Biometric Information Privacy Act (BIPA).

TikTok offers users the ability to add audio and visual features to a user-created video, including “popular song clips, moments from TV shows, stickers, animations, face filters, and face trackers.” The plaintiffs noted that “in connection with certain visual features and effects, Defendants surreptitiously collect, use, and store users’ facial geometry, which is private, legally protected biometric information.” Moreover, “[d]efendants fail to disclose or obtain consent for this collection, use, or storage. And they further fail to disclose why they collect, use, and store [this] biometric data, who has access to the data, or how long the data will be retained – all of which is required by law.”

Biometrics can include fingerprints, faceprints, retina or iris scans, and voiceprints, as well as other information. TikTok uses biometrics in its app; for example, it “trained its AI technology to, among other things, engage in ‘facial recognition for the filters.’” TikTok also allows users “to select an individual’s face in a video and subsequently use TikTok’s facial recognition technology to identify other videos in which that person appears.” To obtain biometrics, BIPA requires companies “to provide notice that they are collecting biometric information, obtain written consent, and make certain disclosures.” They must also have a written retention policy and guidelines for destroying this information.

The plaintiffs’ children use TikTok “to record personal videos, some of which were intended to be kept private and others which were to be uploaded to the platform.” They state that neither they nor their children “recall seeing or reviewing the terms of service [or] privacy policy for younger users upon creating their TikTok accounts.” They also do not believe they were notified of any terms or policy updates. The plaintiffs assert that they were not notified that the defendants would “collect, store or use their biometric identifiers or biometric information,” that this information would be retained, or why it would be retained. Further, the plaintiffs state that they did not consent to this collection and conduct.

The plaintiffs said they were concerned that TikTok uses the personal information and biometrics that it collects to target content, features, and effects to users. For example, the “For You” homepage “is an algorithmic feed that recommends targeted videos for each user, even if the user never posted anything, followed anyone, or liked a video.” However, “[o]nce active on and engaging with the app, TikTok’s AI Algorithm continuously records each action users take on the app, including what videos each user watches, how long each user watched a particular category of video, and which advertisement a user engages with, as well as their current location.”

TikTok also uses biometrics to identify and categorize users by race, age, and gender, according to the complaint. The app allegedly recommends videos based on these characteristics, suggesting that users follow accounts with characteristics similar to those of accounts they already follow. For example, if a user followed an Asian man, they were provided with suggestions of other Asian men to follow; if a user followed a white man with a beard, they were suggested other white men with beards.

The plaintiffs claimed that “TikTok attracts children to its platform and collects their private and legally protected data, but leaves them vulnerable and exploited.” They claim that a large percentage of users are young: 28 percent are under 18, and 60 percent are between 16 and 24 years old. Additionally, “70% of 10-year-old girls with smartphones in the U.S. downloaded and used TikTok in 2019.” TikTok was previously sued for violating the Children’s Online Privacy Protection Act (COPPA).

TikTok also allows users to purchase “coins,” which are used to “reward” the “individuals who create the videos. TikTok content creators are then able to cash out their coins into real currency.” The complaint raises concerns that TikTok targets unknowing children and entices them to purchase these coins: “TikTok has designed its ‘coin’ packages to attract children to purchase them, with or without parental consent, by making them easy to purchase and including playful images such as pandas and rainbows.” The complaint suggests that children could be taken advantage of as a result.

The plaintiffs have sought certification as a class action; injunctive relief to prohibit the defendants’ allegedly unlawful conduct; an award for compensatory, consequential, general, and nominal damages; an award for statutory damages, trebled, and punitive or exemplary damages; declaratory relief; an award for costs and fees; pre- and post-judgment interest; and other relief as determined by the court.

The plaintiffs are represented by Fegan Scott LLC.

TikTok has faced criticism for its “inadequate data protection practices.” It has been alleged that not only does TikTok collect biometrics and other data without consent, it also fails to adequately protect this information. For example, the U.S. Navy and Army banned TikTok over safety and security concerns, as has the TSA. The app has faced numerous suits from users over its data policies and for surreptitiously collecting their data and information, including biometrics.