At Meta, millions of underage users were an 'open secret,' court filing shows


Meta has received more than 1.1 million reports of users younger than 13 on its Instagram platform since early 2019, yet it “disabled only a fraction” of those accounts, according to a newly unsealed legal complaint against the company brought by the attorneys general of 33 states.

Instead, the social media giant “routinely continued to collect” children’s personal information, including their locations and email addresses, without parental permission, in violation of a federal children’s privacy law, according to the court filing. Meta could face hundreds of millions of dollars, or more, in civil penalties should the states prove the allegations.

“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed,” the complaint said, “and zealously protected from disclosure to the public.”

The privacy charges are part of a larger federal lawsuit, filed last month by California, Colorado and 31 other states in US District Court for the Northern District of California. The lawsuit accuses Meta of unfairly ensnaring young people on its Instagram and Facebook platforms while concealing internal studies showing user harms. And it seeks to force Meta to stop using certain features that the states say have harmed young users.

But much of the evidence cited by the states was blacked out by redactions in the initial filing.

Now the unsealed complaint, filed Wednesday evening, provides new details from the states’ lawsuit. Using snippets from internal emails, employee chats and company presentations, the complaint contends that Instagram for years “coveted and pursued” underage users even as the company “failed” to comply with the children’s privacy law.

The unsealed filing said that Meta “continually failed” to make effective age-checking systems a priority and instead used approaches that enabled users under 13 to lie about their age to set up Instagram accounts. It also accused Meta executives of publicly stating in congressional testimony that the company’s age-checking process was effective and that the company removed underage accounts when it learned of them – even as the executives knew there were millions of underage users on Instagram.

“Tweens want access to Instagram, and they lie about their age to get it now,” Adam Mosseri, the head of Instagram, said in an internal company chat in November 2021, according to the court filing.

In Senate testimony the following month, Mosseri said: “If a child is under the age of 13, they are not permitted on Instagram.”

In a statement Saturday, Meta said that it had spent a decade working to make online experiences safe and age-appropriate for teenagers and that the states’ complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”

The statement also noted that Instagram’s terms of use prohibit users younger than 13 in the United States. And it said that the company had “measures in place to remove these accounts when we identify them.”

The company added that verifying people’s ages was a “complex” challenge for online services, especially with younger users who may not have school IDs or driver’s licenses. Meta said it would like to see federal legislation that would require “app stores to get parents’ approval whenever their teens younger than 16 download apps” rather than having young people or their parents supply personal details such as birth dates to many different apps.

The privacy charges in the case center on a 1998 federal law, the Children’s Online Privacy Protection Act. That law requires that online services with content aimed at children obtain verifiable permission from a parent before collecting personal details – like names, email addresses or selfies – from users younger than 13. Fines for violating the law can run to more than $50,000 per violation.

The lawsuit argues that Meta elected not to build systems to effectively detect and exclude such underage users because it viewed children as a crucial demographic – the next generation of users – that the company needed to capture to assure continued growth.

Meta had many indicators of underage users, according to the Wednesday filing. An internal company chart displayed in the unsealed material, for example, showed how Meta tracked the percentage of 11- and 12-year-olds who used Instagram daily, the complaint said.

Meta also knew about accounts belonging to specific underage Instagram users through company reporting channels. But it “automatically” ignored certain reports of users younger than 13 and allowed them to continue using their accounts, the complaint said, as long as the accounts did not contain a user biography or photos.

In one case in 2019, Meta employees discussed in emails why the company had not deleted four accounts belonging to a 12-year-old, despite requests and “complaints from the girl’s mother stating her daughter was 12,” according to the complaint. The employees concluded that the accounts were “ignored” partly because Meta representatives “couldn’t tell for sure the user was underage,” the legal filing said.

This is not the first time the social media giant has faced allegations of privacy violations. In 2019, the company agreed to pay a record $5 billion, and to alter its data practices, to settle charges from the Federal Trade Commission of deceiving users about their ability to control their privacy.

It may be easier for the states to pursue Meta for children’s privacy violations than to prove that the company encouraged compulsive social media use – a relatively new phenomenon – among young people. Since 2019, the FTC has successfully brought similar children’s privacy complaints against tech giants including Google and its YouTube platform, Amazon, Microsoft and Epic Games, the creator of Fortnite.
