Court papers claim Meta designed platforms to addict children

Meta intentionally designed its platforms to addict children and knowingly permitted underage users to maintain accounts, newly unsealed court documents claim.

The documents are part of a lawsuit filed by attorneys general from 33 states in October.

They claim Meta was aware of millions of complaints about underage users on Instagram.

However, the filing adds that the Instagram owner disabled only a fraction of those accounts.

The papers claim the company has treated the large number of underage users as an “open secret.” 

The complaint cites internal company documents and details instances in which employees allegedly ignored requests to take down underage users’ accounts, calling the company’s age verification capabilities into question.

It also accuses Meta of violating the Children’s Online Privacy Protection Act by failing to provide notice and obtain parental consent before collecting data from children.

In 2021 alone, Meta reportedly received over 402,000 reports of users under 13 on Instagram.

However, the company disabled only around 164,000 of those accounts for potentially being underage, leaving a significant backlog awaiting action.

The lawsuit also delves into longstanding allegations Meta deliberately created addictive and harmful products for children. 

The complaint cites whistleblower Frances Haugen, who exposed internal studies showing platforms like Instagram led children to anorexia-related content. 

Company documents cited in the complaint reportedly show Meta officials acknowledging that its products were designed to exploit vulnerabilities in young users’ psychology.

Age verification is a “complex industry challenge”

Meta said the complaint misrepresents its decade-long efforts to enhance online safety for teens.

The company says it has “over 30 tools to support them and their parents”.

Meta argues age verification is a “complex industry challenge” and proposes shifting responsibility to app stores and parents.

The Facebook owner has advocated for federal legislation requiring app stores to obtain parental approval for users under 16.

The documents also reveal internal emails indicating a Facebook safety executive’s concerns about the potential impact on the company’s business if it cracked down on younger users. 

However, the same executive expressed frustration that while Facebook studied underage users’ activity for business purposes, it showed little enthusiasm for identifying and removing younger children from its platforms.
