In January, Seattle Public Schools filed a lawsuit against Meta, YouTube, TikTok and Snapchat in a bid to hold these companies accountable for social media’s impact on children. Schools in California, Florida, New Jersey and Pennsylvania have since followed, opening up the conversation around how accountable social media companies should be for users’ safety and wellbeing.
One of the key issues in the lawsuit is children’s mental health: an official US survey on youth risk behaviour showed the proportion of young people experiencing “persistent feelings of sadness or hopelessness” rose from 28% in 2011 to 42% in 2021. President Joe Biden also addressed the issue in his State of the Union address in February. “We must finally hold social media companies accountable for the experiment they are running on our children for profit,” he said.
The Seattle schools’ lawsuit pointed out that social media platforms “exploit the same neural circuitry as gambling and recreational drugs to keep consumers using their products as much as possible”. It also referenced studies into social media usage among teens – one showed users check Snapchat 30 times per day and nearly 20% of teens use YouTube “almost constantly”.
While the platforms cannot be held directly responsible for content posted by users (and they do make efforts to monitor and remove harmful material), the schools claim the platforms’ algorithms maximise time spent on the apps to the point of surfacing dangerous content to users – including children.
Former Facebook employee and Meta whistleblower Frances Haugen revealed several flaws in the company’s strategy and routines through leaked documents in 2021. Among them was the discovery that Instagram knew about the toxic effect it was having on body image, particularly for teenage girls, and that Meta was working to attract preteens to its platforms. Facebook’s “angry” reaction was considered five times more valuable than a “like” and was prioritised by the algorithm, which has helped spread hate, controversial posts and divisive attitudes.
The US schools are not the first to sue social media companies over insufficient child protection. In 2022, the Social Media Victims Law Center in California filed a lawsuit against Roblox, Discord, Snapchat and Meta after a child was encouraged to engage in harmful behaviour by an adult. The girl was nine years old when she met the man on Roblox; he then continued to communicate with her over Discord, Snapchat and Instagram.
In early June, Meta set up a taskforce to counter child exploitation after it was revealed Instagram was leading users to child sexual abuse content. This comes alongside pressure from the EU on Instagram to fix its child protection issues. The US schools’ lawsuits – which now include a total of 40 school districts across ten states – are ongoing, and it remains to be seen whether more details will be uncovered about the social media platforms’ algorithms and how they perpetuate harmful content.
By Dina Zubi, CORQ news and features writer.