ByteDance Inc., Meta Platforms Inc., Snap Inc. and TikTok Inc. moved on Monday to dismiss a suit filed by a parent over her daughter’s death by suicide, allegedly driven by mental health issues the girl developed through her use of Instagram, TikTok, and Snapchat.
The defendants’ filing asserted that Section 230 of the Communications Decency Act shields them from liability and, secondarily, that the First Amendment bars the plaintiff’s claims because they impermissibly challenge the platforms’ content moderation policies.
The operative complaint, filed in May, alleges that the minor’s use of the platforms resulted in adult male users soliciting her for sexual exploitation. It avers that the defendants’ refusal to verify the identity and age of new users “encourages” such sexual exploitation and abuse of minors.
The daughter purportedly developed “numerous mental health conditions including an eating disorder, self-harm, and physically and mentally abusive behaviors toward her mother and sibling and underwent multiple inpatient psychiatric admissions,” the complaint says.
The lawsuit is one of many recently lodged against Meta, TikTok, and Snap accusing them of fomenting a mental health crisis among adolescents and teens. Like those suits, this one asserts claims for strict liability design defect, failure to warn, negligence, statutory unfair business practices, and sex trafficking under federal law.
In this week’s dismissal bid, the defendants first expressed sympathy for the parent’s loss, but said that they cannot be held accountable for material created and posted by third parties. Pointing to Section 230’s goal of promoting free expression on the internet, the defendants reiterated that “Section 230 forecloses any claim that seeks to impose liability on interactive computer service providers like Defendants for the alleged effects of third-party content—including, as in this case, third-party content neither condoned nor permitted by the provider.”
Next, the motion argued that even apart from Section 230, the claims fail to pass constitutional muster because the plaintiff cannot plead around the fact that the defendants publish third-party content. “The First Amendment prohibits forcing a communications service to adopt or enforce particular content policies or practices, because such policies are themselves an exercise of a platform’s protected ‘editorial control and judgment,’” the platforms explained.
Lastly, the filing asserted that the plaintiff fails to plead essential elements of her claims, warranting dismissal of the entire complaint. A hearing on the motion is set for mid-October in San Francisco, Calif.
The plaintiff is represented by Social Media Victims Law Center PLLC. Meta is represented by Gibson Dunn & Crutcher LLP and Covington & Burling LLP, and TikTok by Munger, Tolles & Olson LLP and King & Spalding LLP.