Antisocial media: Meta, Google liable in landmark case for mental health harm

Infinite scroll, likes, autoplay and other features that exploit users' dopamine cravings replicate neurological processes that fuel gambling addiction

I've yet to meet a parent who wished their child had more screen time.

But just how far the problem can go was laid bare in the recent case of KGM against Meta and Google in a US court.

The 20-year-old plaintiff, KGM, sued the owners of Instagram and YouTube, claiming that her extensive use of those platforms was driven by a deliberate use of addictive design features and damaged her mental health.


KGM said she had been using YouTube since she was six and Instagram since she was nine, spending up to 16 hours a day on these platforms.

The California jury agreed. They found Meta and Google liable, awarding KGM US$6 million in damages. The owners of TikTok and Snapchat had reached settlements shortly before the case went to trial.

Perhaps the result shouldn't come as much of a surprise. Social media platforms generate profits through advertising, so retaining eyeballs on screens for as long as possible is a commercial imperative.

Still, the science behind all this does seem rather creepy. KGM's legal team adduced evidence from various experts and whistle-blowers to the effect that social media platforms are designed to hijack dopamine reward systems in the brain.

Features such as infinite scrolling, algorithmic recommendations, autoplay and push notifications were created and optimised to drive frequent and sustained engagement. (Keep reading for more exciting revelations below!)

The foundations of these systems have been tried and tested in the gambling industry. Slot machines - or "pokies" in Australia, where 0.3 per cent of the world's population plays host to 18 per cent of its poker machines - wouldn't be very profitable if players lost every time.

Frequent small "wins", coupled with the promise of a large - but highly improbable - jackpot, keep players spinning away to statistically certain net losses.

If that seems an unsavoury foundation for the design of platforms that can be used by children, consider, for a moment, the potential implications of all this for freedom of thought and choice.

The Cambridge Analytica scandal in 2018 revealed mass collection of Facebook users' data to build deep psychological profiles of individuals. These profiles were then deployed in political campaigns (including, reportedly, during the 2016 US presidential election), enabling highly targeted, personalised political advertising.

The objective was to selectively present information that would influence how people cast their vote.

While children can't vote (yet), KGM's lawyers argued that Meta and Google had set growth targets for young users in particular. Child users, the data showed, were more likely to remain loyal over the longer term.

Meanwhile, in another case brought by the New Mexico Attorney General, Meta was found to have breached consumer protection laws by misleading users of Facebook, Instagram and WhatsApp into believing that these platforms were safe for children, when they in fact enabled child sex abuse and human trafficking. Meta was fined US$375 million in that case.

The California and New Mexico results have been hailed as landmark cases. While the sums involved are probably little more than a rounding error for Meta - whose market capitalisation sits at around US$1.45 trillion at the time of writing - they could set a significant precedent.

Both appear to have been test cases, with many hundreds or thousands of similar cases expected to follow.

Meta said of the New Mexico case that "we respectfully disagree with the verdict" and will appeal. (This is a practised form of words lawyers often use when they have no respect at all for a verdict but do not wish to be seen to denigrate the court system more generally.)

Australia's answer to all of this has been emphatic: to ban social media use for all children under 16. Since the ban took effect on December 10 last year, some 4.7 million social media accounts have been deactivated by the numerous platforms to which the ban applies.

While it is too early to draw firm conclusions about Australia's grand social experiment, and the many detailed studies commissioned on the impact of the ban are not expected to produce robust, verifiable results for some years, there has been much positive reporting so far.

While the age verification mechanism is reportedly easy to circumvent, such as by using a VPN, the whole proposition of social media may be a little less enticing if one's peers are out in the park kicking a ball, not posting content to be liked, followed and emulated - or perhaps mocked, derided and bullied.

Has social media had its "big tobacco moment"? I'm not sure, but my phone just pinged so I'll rush off and see who's "liked" my latest post.

Tim Parker SC is a former council member of the Hong Kong Bar Association and a civil litigator. He specialises in public and international law, commercial disputes and competition, and is also qualified in England and Wales.

Legal Tales is a weekly column by senior members of the Hong Kong Bar Association presenting their perspectives on current affairs.


This article originally appeared on the South China Morning Post (www.scmp.com), the leading news media reporting on China and Asia.

Copyright (c) 2026. South China Morning Post Publishers Ltd. All rights reserved.
