
A seismic shift – Group litigation against Big Tech in the UK is only a matter of time


26th Mar 2026

The current litigation in the US

In California, Kaley, a 20-year-old woman, alleged that she became addicted to YouTube from age 6 and Instagram from age 9, leading to depression and suicidal thoughts. TikTok and Snapchat settled before trial. Meta and YouTube fought on and, after many days of deliberation, the jury found that Meta (which owns Instagram) and Google (which owns YouTube) intentionally built addictive social media platforms that harmed her mental health[1].

The jury awarded Kaley damages of $6m (£4.5m), with Meta to pay 70% and YouTube the remaining 30%. Meta and YouTube have both said they intend to appeal the verdict.

Kaley’s case is what is known in the US as a bellwether trial: the first such case to test the law on this issue, with thousands of plaintiffs bringing similar actions to follow. Whilst not directly comparable, the US process is similar to group litigation in the UK, with the claims of lead claimants tried first.

The LA verdict came a day after a jury in New Mexico found Meta liable for the way in which its platforms endangered children and exposed them to sexually explicit material and contact with sexual predators[2]. These two decisions in two days mark a seismic shift in the litigation against Big Tech.

In addition, more than 2,200 cases brought by individuals, families, school districts, states and Native American tribes against Meta, TikTok, YouTube and Snapchat over their alleged harms to children have been consolidated in a federal trial in California set to begin in June 2026.

The current landscape in the UK

Against a background of Australia implementing a social media ban for children under the age of 16 from 10 December 2025, and the European Commission’s preliminary finding on 6 February 2026 that TikTok was in breach of the Digital Services Act because of its ‘addictive design’[3], it is widely recognised that the current regulation of social media and other online platforms for under 16s in the UK is insufficient.

The Online Safety Act 2023[4] protects both children and adults online. It imposes a range of legal duties on social media companies and search services to protect their users from illegal content and content harmful to children. As of 25 July 2025, platforms have a legal duty to protect children online, and are required to use highly effective age assurance to prevent children from accessing pornography, or content which encourages self-harm, suicide or eating disorders.

Critics would argue that the scope of the Online Safety Act is not wide enough: it does not currently cover AI chatbots, prompting the government to announce on 16 February 2026 that it intended to close this loophole. Critics would also argue that it focusses too heavily on ‘harmful content’, whilst not properly addressing ‘safety by design’, most notably ‘addictive design features’ for young people.

On 21 January 2026, Lord Nash proposed an amendment to the Children’s Wellbeing and Schools Bill for a social media ban for U16s. The House of Lords voted in favour of the amendment, which was then defeated in the House of Commons, leading to the current Parliamentary ‘ping pong’. Yesterday (25 March 2026), the House of Lords backed a social media ban for U16s for a second time[5].

On 18 March 2026, the House of Lords also voted in favour of Baroness Beeban Kidron’s amendment to the Crime and Policing Bill, which would make it a criminal offence to create an AI chatbot that produces specified content, to fail to risk assess such products, or to fail to take steps to mitigate the risks (especially to children) from content and design-based harms.

In addition to social media, the risks to children from AI chatbots and AI companions are often less well understood. Megan Garcia, whose 14-year-old son Sewell Setzer III died by suicide in February 2024, sued Character AI for the wrongful death of her son following his interactions with its AI companion. That litigation was settled, and from 25 November 2025 Character AI sought to raise the age limit for using its AI companion to 18.

In the UK, the government announced a national consultation, ‘Growing up in the online world’, on 2 March 2026[6], which is due to run until 26 May 2026. As well as considering whether there should be a ban or minimum age for children to access social media, the government is also looking at whether to restrict ‘addictive design features’ that encourage excessive use in children, and how age verification and age assurance technologies might support effective implementation.

As is often the case where there is growing concern about the impact of social media and online platforms on children’s wellbeing, and where it appears to be universally recognised that the current ‘status quo’ of social media platform design cannot remain, litigation tends to arise from a retrospective assessment of whether the platforms ought to have done more to protect children.

Given the raft of litigation in the US, and the UK’s current focus on how best to improve the regulation of social media for under 16s, group litigation against Big Tech in the UK is only a matter of time. As in the US, that litigation is likely to focus on the design of social media platforms, in particular their ‘addictive features’ and the ‘algorithmic harms’ said to have been suffered.

Likely group litigation against Big Tech in the UK

Many cases that start in the US are then followed by similar claims in the UK. Mark Lanier, the US lawyer representing Kaley in the social media addiction trials against Meta and YouTube, also represented the plaintiffs in the artificial hip litigation in the US, which then came to the UK (Gee v DePuy[7]), and in the Johnson & Johnson talc litigation, currently before the King’s Bench Division.

There are already a number of high-profile cases of harm suffered by children in the UK. Ellen Roome has been campaigning for changes to social media since the death of her 14-year-old son Jools Sweeney in 2022, which she believes was caused by a TikTok challenge gone wrong. Ian Russell has also been campaigning for changes to social media since the death of his 14-year-old daughter Molly Russell, who had viewed suicide material online. Coroner Andrew Walker found at her inquest that Molly had died from an act of self-harm whilst suffering from depression and the negative effects of online content[8].

On a much broader scale, there is an increasing view that high levels of technology use can interfere with children’s ability to concentrate and engage effectively in school, disrupt healthy sleep patterns, and contribute to worsening mental health and wellbeing. In its consultation, the government is specifically looking at ‘addictive design features’ and whether they should be restricted for young users.

What might UK litigation look like, and what are the wider implications?

It should be recognised that ‘social media addiction’ is not a term currently medically recognised in the UK. Instead, the term ‘problematic’ smartphone/social media use is appearing in the literature[9]. While the published studies have recognised parallels between ‘problematic’ use and addictive behaviours, an ‘addiction’ has a specific medical meaning.

One key issue likely to be fought out in these cases is whether the Claimants’ behaviours are indeed ‘addictions’ (such that the ‘addiction’ itself is the harm suffered), or whether ‘addictive design’ is alleged to lead directly to a conventional psychiatric injury (such as depression or anxiety). If the latter, the hurdle of proving ‘addiction’ as an injury is avoided, albeit the task of proving ‘addictive design’ may be more difficult.

What may be meant by ‘addictive design features’? Ofcom research into ‘persuasive design’[10] suggests that these may include:

  • Infinite scrolling: new content keeps loading automatically as you scroll down, so the page never really ends.
  • Autoplay features: This is a system feature that automatically starts the next piece of content without requiring a user action, once the current content finishes or when it appears on screen.
  • Affirmation functions: These are designed to give users positive feedback, validation, or encouragement. These include likes and reactions, comments and replies, shares and reposts, follower counts and subscribers, stories and reactions, streaks, achievements and personalised feedback.
  • Alerts and push notifications: An alert is a message that pops up inside an app or website to get your attention right away. A push notification is a message sent to your phone or device by an app, even when you’re not using it, to update you or bring you back to the app.

In my view, any group litigation involving social media is likely to focus on the following:

  • Multiple general claims for alleged harm suffered as a result of social media platforms having ‘addictive design features’;
  • More specific claims alleging harm, and even wrongful death, as a result of certain content, e.g. dangerous challenges;
  • More specific claims alleging harm, and even wrongful death, as a result of personalised algorithms recommending certain harmful content to children.

Social media is just the beginning. Similar allegations are being raised against gaming platforms, and AI chatbots and companions accused of emotional manipulation. In my view, any litigation against ‘Big Tech’ may in fact be broader than simply ‘social media’ and ‘addictive design’, and may also include:

  • Whether gaming platforms and not just social media platforms are ‘addictive by design’ for young people;
  • Claims relating to harms suffered as a result of children’s interactions with AI chatbots and AI companions.

Meta and YouTube have confirmed that they intend to appeal the decision in the US, and it does not necessarily follow that litigation which is ultimately successful in the US will also succeed in the UK. Key battlegrounds in the UK would be likely to include:

(i) Whether the social media platforms as designed were ‘addictive’ or led to ‘addiction’ (and what is meant by that);

(ii) Whether social media platforms had a duty to warn about the potential risks of their products, and particularly excessive use by children;

(iii) Whether social media platforms took reasonable steps to ensure that age verification / assurance was effective;

(iv) Whether the Claimants can prove a general causal link between social media use and a recognised personal injury;

(v) Whether any individual Claimant could prove a causative link between their social media use and personal injury.

In addition to any product liability litigation, there are also likely to be wider implications for the insurance industry. Whilst insurance liability was previously limited mainly to data breaches or harmful content, harms associated with social media use as a result of its ‘design’, especially among young people, are likely to bring a new wave of digital exposure to the insurance industry. Interesting issues are likely to arise if UK litigation also frames such harms as arising from intentional conduct on the part of the platforms, as was alleged in the US.

The recent decisions in the US mark a pivotal moment in product liability and litigation accusing platforms of ‘addictive design’, and it is only a matter of time before similar litigation follows in the UK.

Article written by Elizabeth Boon.

You can read the Chambers and Partners Product Liability Safety 2025 Guide here.


Footnotes:

[1] https://www.bbc.co.uk/news/articles/c747x7gz249o

[2] https://www.bbc.co.uk/news/articles/cql75dn07n2o

[3] https://ec.europa.eu/commission/presscorner/detail/en/ip_26_312

[4] https://www.legislation.gov.uk/ukpga/2023/50

[5] https://commonslibrary.parliament.uk/research-briefings/cbp-10468/

[6] https://www.gov.uk/government/consultations/growing-up-in-the-online-world-a-national-consultation

[7] https://www.judiciary.uk/wp-content/uploads/2018/05/pinnacle-mom-final-approved.pdf

[8] https://www.judiciary.uk/wp-content/uploads/2022/10/Molly-Russell-Prevention-of-future-deaths-report-2022-0315_Published.pdf

[9] https://www.kcl.ac.uk/news/teens-with-problematic-smartphone-use-are-twice-as-likely-to-have-anxiety-and-many-are-eager-to-cut-down

[10] https://www.ofcom.org.uk/online-safety/safety-technology/research-into-persuasive-design-features-and-potential-child-financial-harms
