Meta and Google have been held liable in a landmark legal case that found their social media platforms were designed to addict children, a verdict that exposes the tech giants to penalties in thousands of similar claims filed across the US.
A jury in the Los Angeles trial returned the verdict on Wednesday after nine days of deliberations, finding that Meta platforms such as Instagram, as well as Google’s YouTube, were harmful to children and teenagers, and that the companies failed to warn users of the dangers.
The jury awarded $3 million in damages to the 20-year-old plaintiff, who claimed that her social media addiction during childhood caused anxiety, depression and physical disfigurement.
The case is expected to set a precedent for a flood of similar lawsuits and has been compared to the crackdown on Big Tobacco in the 1990s. Thousands of individuals, school districts and state attorneys general have filed similar claims against social media platforms, seeking damages and design changes.
The jury awarded total damages of $6 million, apportioning 70 percent ($4.2 million) to Meta and 30 percent ($1.8 million) to Google.
Meta said: “We respectfully disagree with the decision and are evaluating our legal options.”
Google spokesman Jose Castañeda said: “We disagree with the decision and plan to appeal. This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”
Snap and TikTok settled for an undisclosed amount before trial.
The verdict is another blow to Meta after a jury in New Mexico on Tuesday found it liable for failing to protect children from sexually explicit material, solicitation and human trafficking. The company was ordered to pay $375 million in civil penalties, but said it would appeal the decision.
The wave of US lawsuits is part of a global backlash against Big Tech, with Spain and Australia banning social media access for people under 16. Britain and France are also considering similar measures.
The EU is investigating whether social media harms users’ physical and mental wellbeing through addictive design features, such as endlessly scrolling feeds, that are intended to increase engagement and thereby show more ads.
Meta boss Mark Zuckerberg was the highest-profile executive to testify in the Los Angeles case. He admitted that he rejected a proposed ban on Instagram beauty filters, despite experts advising that they promote body dysmorphia, because he was more concerned about “free expression”.
He also said that Meta no longer sets internal targets for the time users spend on its platforms. However, the jury was shown internal documents from 2013 and 2022 in which he and other employees explicitly described increasing time spent as a goal or milestone, including for teenage users.
Other emails showed employees acknowledging the platforms’ addictive potential. “IG (Instagram) is a drug…we are basically pushers,” one researcher wrote. He said Instagram chief Adam Mosseri “flipped out” when he raised the topic of dopamine hits from social media use.
Meta and Google had hoped to be shielded by Section 230 of the US Communications Decency Act, the legal provision that holds platforms are not liable for user-generated content.
However, the plaintiffs’ lawyers successfully argued that the case was not about content, but rather about how the platforms are designed, with addictive features such as “likes” that encourage social comparison, infinite scrolling and push notifications.
Meta said that “significant free speech implications were at stake”, adding that such cases “threaten to destroy Section 230 and the First Amendment protections for free expression online”.