Meta, TikTok and YouTube heading to trial to defend against youth addiction, mental health harm claims

Los Angeles —
For years, social media giants have argued against claims that their platforms harm young people’s mental health. Starting Tuesday, they will for the first time have to defend against those claims before a jury in a court of law.
A 19-year-old identified as KGM and her mother, Karen Glenn, are suing TikTok, Meta and Google’s YouTube, alleging that the companies knowingly created addictive features that harmed her mental health and led to self-harm and suicidal thoughts. (Snap, also a defendant, settled last week on undisclosed terms.)
Parents, advocates, health experts, tech whistleblowers and teens themselves have for years worried that social media platforms can get young people hooked on scrolling, enable bullying, disrupt their sleep and send them down harmful content rabbit holes. Tech executives have repeatedly been hauled before Congress, at one point even apologizing to parents who say their children died or were harmed because of social media. But the companies have nonetheless faced few consequences or regulations in the United States.
KGM’s case seeks unspecified monetary damages. The outcome could influence how more than 1,000 similar personal injury cases against Meta, Snap, TikTok and YouTube are resolved.
Top executives from Meta, TikTok and YouTube are expected to take the witness stand during the trial, which takes place in Los Angeles and is set to last several weeks.
In recent years, TikTok, Meta, YouTube and Snap have rolled out safety features and policies, as well as parental control tools, that they say protect young users.
The four social media companies are involved in other cases this year as well, including some brought by school districts and state attorneys general. Losses could put the tech companies on the hook for billions of dollars in damages and force them to change their platforms.
“For parents whose children have been exploited, groomed, or died because of big tech platforms, the next six weeks are the first step toward accountability after years of being ignored by these companies,” said Sarah Gardner, CEO of the non-profit Heat Initiative, which advocates for child safety online. “These are the tobacco trials of our generation, and for the first time, families across the country will hear directly from big tech CEOs about how they intentionally designed their products to addict our kids.”
KGM’s lawsuit alleges that the social media giants intentionally designed their platforms to be addictive, despite knowing the risks to young people.
KGM, a California teen, started using social media at age 10, despite her mom’s attempts to use third-party software to block access to the platforms, according to court documents. “Defendants design their products in a manner that enables children to evade parental consent,” the complaint states.
The “addictive design” of Instagram, TikTok and Snapchat and frequent notifications led her to use the platforms compulsively, the suit alleges, which corresponded with a decline in her mental health.
Features that recommend other users to connect with on Snapchat and Instagram “facilitated and created connections between minor Plaintiff K.G.M. and complete strangers, including predatory adults and others she did not know in real life,” the complaint states. Instagram and TikTok also allegedly “targeted” KGM with “depressive” and “harmful social comparison and body image” content.
On Instagram, KGM alleges she was bullied and sextorted — a scam where a bad actor threatens to share explicit photos of a person if they don’t send money or more photos. It took two weeks and “K.G.M.’s friends and family spamming and asking other Instagram users to report the persons targeting” her for Meta to address the problem, according to the complaint.
“Defendants’ knowing and deliberate product design, marketing, distribution, programming and operational decision and conduct caused serious emotional and mental harms to K.G.M. and her family,” the complaint states. “Those harms include, but are not limited to, dangerous dependency on their products, anxiety, depression, self-harm, and body dysmorphia.”
KGM’s is one of several bellwether cases in a larger multi-district litigation consolidating around 1,500 personal injury cases alleging similar harms because of TikTok, YouTube, Meta and Snap.
In 2024, then-US Surgeon General Vivek Murthy called on Congress to mandate a tobacco-style warning label on social media platforms in light of the “mental health crisis” among young people, something state attorneys general have also advocated for. And a Pew Research Center study published last year indicated that nearly half of US teens believe social media has “mostly negative” effects on people their age.
But tech leaders have for years rejected the idea that social media harms young people’s mental health. They point to a lack of conclusive research on the subject and argue that their platforms provide benefits such as entertainment and connection to friends.
Tech giants have also repeatedly relied on Section 230, a federal law that shields them from liability over content that their users post, as a defense against safety claims. Los Angeles Superior Court Judge Carolyn Kuhl, who is overseeing the KGM and related cases, said last year that jurors should consider whether design features implemented by the companies, like endlessly scrolling feeds, have contributed to mental health harms, rather than content alone.
Snap has previously said that Snapchat was “designed differently from traditional social media — it opens to the camera, not a feed, and has no public likes or social comparison metrics.”
Snapchat’s youth safety measures include parental control tools, message warnings designed to prevent sextortion and mechanisms for removing age-inappropriate content.
Asked for comment, a Meta spokesperson pointed CNN to a website dedicated to its response to the youth mental health lawsuits, where the company claims the suits “misportray our company and the work we do every day to provide young people with safe, valuable experiences online.”
“We have listened to parents, researched the issues that matter most, and made real changes to protect teens online,” Meta states. “Despite the snippets of conversations or cherry-picked quotes that plaintiffs’ counsel may use to paint an intentionally misleading picture of the company, we’re proud of the progress we’ve made, we stand by our record of putting teen safety first, and we’ll keep making improvements.”
Meta’s teen safety features include “teen accounts,” which launched in 2024 to provide default privacy protections and content limits for teen users on Instagram. It also provides parental supervision tools and uses AI to try to identify minor users regardless of the age they provide when they sign up for Meta’s platforms.
In a statement to CNN, YouTube spokesperson José Castañeda said the allegations in the youth mental health lawsuits are “simply not true.”
“Providing young people with a safer, healthier experience has always been core to our work. In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls,” he said in the statement.
YouTube’s youth safety measures include restrictions on certain kinds of sensitive content, such as violent or sexually suggestive videos, as well as AI identification of minor users. It also offers parental control tools and last week rolled out an option for parents to limit or block their kids from scrolling through its short-form video feed, among other new offerings.
TikTok did not respond to a request for comment on this story.
The youth safety and parental control features TikTok has rolled out in recent years include adding default privacy settings and disabling late-night notifications. Last year, it introduced a “guided meditation” feature purportedly aimed at getting teens to cut back on scrolling.
Despite those efforts, many parents and advocates say social media platforms have still failed to protect young users. Soon, a jury will have a chance to decide if they agree.



