California Jury Holds Meta and YouTube Liable for Harm to Young Women
A California jury has found Meta and YouTube liable for harm their platforms caused to young women, finding that the platforms' designs contributed to addiction. The court ordered the companies to pay $3 million, a figure that could grow significantly once punitive damages are determined.
The verdict strengthens the position of plaintiffs in more than 1,000 similar cases currently pending, signaling a shift in how the tech industry may be held accountable for the mental health consequences of its design choices. Jurors unanimously found that both Meta, the parent company of Facebook and Instagram, and YouTube were negligent in operating their platforms, significantly harming the plaintiffs' well-being.
The jury also determined that both companies knew the risks their services posed to minors yet failed to provide adequate warnings, a lapse in responsibility that, the jury found, a reasonable platform operator would not have made.
A lawyer for the plaintiffs declared that accountability had been established. A Meta spokesperson said the company disagreed with the verdict and would consider its options going forward.
Significant Implications for Social Media Giants
The jury assigned 70 percent of the liability for the plaintiffs' damages, or $2.1 million, to Meta, and the remaining 30 percent, or $900,000, to YouTube. With two additional cases on the docket in the same Los Angeles court, the outcomes may shape how social media companies proceed: contesting the rulings or pursuing comprehensive settlements that involve redesigning their platforms.
Industry analyst Jasmine Enberg of Scalable noted that while $3 million may seem substantial, the prospect of having to redesign their products could pose an existential threat to these advertising giants' business models. The jury also found bad faith, coercion, or fraud on the companies' part, setting the stage for punitive damages to be determined in subsequent proceedings.
YouTube's lawyer, Lewis Lee, apologized in court to a plaintiff identified in court documents as KGM, acknowledging the pain she has experienced. He cautioned jurors, however, that punitive damages should be tied to specific incidents rather than driven by broader societal movements. KGM, who began using YouTube at age six and Instagram at nine, testified that social media use damaged her self-esteem and made it difficult to form friendships.
In his closing arguments, plaintiffs' attorney Mark Lanier cast the case as one of corporate greed, arguing that features such as infinite scrolling and notifications are deliberately designed to foster compulsive use among young people. Meta and YouTube have consistently denied any link between KGM's mental health struggles and their platforms, maintaining that her problems stem from other sources.
Further Developments in New Mexico
In a related case, a New Mexico jury found Meta liable for compromising children's safety on its platform, leaving them vulnerable to predators and other dangers. Although the state sought $2.2 billion in damages, the jury awarded $375 million. The ruling adds to the mounting legal pressure on social media companies, particularly over child safety.
Paul Schmidt, the lawyer representing the New Mexico plaintiff, emphasized family dynamics and played recordings for the jurors to illustrate the plaintiff's struggles. YouTube disputed how much time the plaintiff spent on its platform, claiming records showed she used certain features for just over a minute a day.
Despite these defenses, the jury unanimously rejected the companies' arguments, signaling a firm stance against social media giants in cases involving the mental health and safety risks of their platforms. These ongoing legal battles could redefine the responsibilities of technology companies in safeguarding their users.
