A Los Angeles jury has, for the first time, concluded that Meta’s platforms—Instagram, Facebook, and WhatsApp—along with Google (YouTube), bear responsibility for causing psychological harm to a child.
Markets reacted immediately: Meta’s shares fell 8.5%, while Reddit and Snap recorded even sharper declines, of more than 10% and 11% respectively, during Thursday trading.
The plaintiff, Kaylee, now 20, began using YouTube at the age of six and Instagram at nine. By the age of ten, she had developed anxiety and depression, later confirmed by a psychotherapist. She was subsequently diagnosed with body dysmorphic disorder—a condition affecting the perception of one’s own appearance—which she links to prolonged use of Instagram filters.
“I stopped communicating with my family because I spent all my time on social media,” Kaylee said.
According to her lawyer, Mark Lanier, core features of the platforms, including infinite scroll, push notifications, and other engagement mechanisms, were deliberately engineered to retain younger users. This, he argued, translates directly into advertising revenue while imposing the cost on users’ mental health. Among the evidence presented to the court was internal correspondence from 2015 in which Meta’s chief executive, Mark Zuckerberg, demanded a 12% increase in the time users spend on social media.
The court ordered the companies to pay $6m in damages; they intend to appeal the ruling.
The verdict could set a precedent for thousands of similar lawsuits and, for the first time, establishes the principle that harm may arise not only from content, but from platform design itself—engineered to maximise user engagement.