Dawn Chmielewski and Courtney Rozen
Los Angeles: Meta and YouTube must pay millions in damages to a 20-year-old woman after a jury ruled the tech giants designed their systems to hook young users without regard for their well-being.
A California jury’s decision in the first case of its kind could affect the outcome of thousands of similar lawsuits accusing social media companies of intentionally causing harm.
The plaintiff, known by her initials KGM, testified at trial that she started using social media as a child and that her addiction to it exacerbated her mental health problems. After 40 hours of deliberation, the jury agreed and awarded her $3 million ($4.3 million) in compensation.
Jurors later recommended an additional $3 million in punitive damages after determining the companies acted with malice, oppression or fraud in harming children with their platforms. The judge has the final say on those damages.
It is the second verdict against Meta this week, after a New Mexico jury found that the company harmed the mental health and safety of children in violation of state law.
Meta and Google-owned YouTube issued statements of disagreement with the decision and vowed to explore their legal options, which include an appeal.
“We will continue to vigorously defend ourselves as every case is different,” Meta spokesman Andy Stone said. “Youth mental health is very complex and cannot be summed up in one app.”
Google spokesman Jose Castañeda said the decision misrepresented YouTube, “which is a streaming platform created responsibly, not a social media site”.
The jury found that Meta and YouTube knew the design or operation of their systems was dangerous, or could be dangerous, when used by a child. Jurors also agreed that the platforms failed to adequately warn of the risk, further contributing to the plaintiff’s harm.
“Today’s decision is a referendum — from the jury, to the entire industry — that accountability has arrived,” the plaintiff’s attorney said in a statement.
Meta shares rose 1 percent and Alphabet shares 0.2 percent, little changed after the decision.
The Los Angeles case focused on the platforms’ design rather than their content, making it harder for the companies to avoid liability.
Jurors listened to nearly a month of attorneys’ arguments, testimony and evidence, and heard from KGM, or Kaley as her attorneys called her during the trial, as well as Meta leaders Mark Zuckerberg and Adam Mosseri. YouTube’s chief executive, Neal Mohan, was not called to testify.
Kaley said she started using YouTube when she was 6 and Instagram when she was 9, and told the jury she was on the platforms “all day” as a child.
‘Infinite scroll’ under scrutiny
The attorneys representing Kaley, led by Mark Lanier, were tasked with proving that the defendants’ negligence was a substantial cause of Kaley’s injuries. They pointed to specific design features they said were built to “hook” younger users, such as “infinite scroll” feeds offering an endless supply of content, autoplay and notifications.
Jurors were instructed not to consider the content of the posts and videos Kaley saw on the platforms. That is because technology companies are shielded from legal liability for content posted on their sites by Section 230 of the Communications Decency Act of 1996.
Meta has repeatedly argued that Kaley had issues with her mental health separate from her use of social media, often pointing to her turbulent home life.
The company also said in a statement following closing arguments that “none of her therapists identified social media as a cause” of her mental health problems. But the plaintiffs did not have to prove that social media caused Kaley’s struggles — only that it was a “significant factor” in her harm.
YouTube focused less on Kaley’s medical records and mental health history and more on her use of YouTube and the nature of the platform.
Its attorneys argued that YouTube is not a form of social media but a video platform akin to television, and pointed to Kaley’s declining use of YouTube as she got older.
According to their data, she spent about one minute a day on average watching YouTube Shorts since its launch. Shorts, introduced in 2020, is the platform’s short-form vertical video feature with “infinite scroll,” which the plaintiffs argued is addictive.
Attorneys representing both companies frequently pointed to the safety features and controls each platform offers for people to monitor and manage their use.
Snap and TikTok were also defendants in the case; both settled with the plaintiff before the trial began. Terms of the settlements were not disclosed.
States are strengthening social media laws
Major US technology companies have faced mounting criticism over the past decade regarding the safety of children and young people. With Congress declining to pass comprehensive legislation regulating social media, the debate has shifted to the courts and state governments.
At least 20 states passed laws last year regarding social media use by children, according to the nonpartisan National Conference of State Legislatures, an organization that monitors state laws.
Those laws include measures regulating phone use in schools and requiring users to verify their age to open social media accounts. NetChoice, a trade association backed by tech companies including Meta and Google, is seeking to overturn age-verification requirements in court.
A separate social media addiction lawsuit brought by several states and school districts against technology companies is expected to go to trial this summer in federal court in Oakland, California.
Another federal trial is expected to begin in Los Angeles in July, said Matthew Bergman, one of the attorneys leading the plaintiffs’ case. It will involve Instagram, YouTube, TikTok and Snapchat.
Separately, a New Mexico jury on Tuesday found that Meta violated state law in a lawsuit brought by the state’s attorney general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and of facilitating the sexual abuse of children on those platforms.