Meta’s products may “exploit the weaknesses and inexperience of minors” to create behavioral addictions that endanger their mental health, the European Commission said in a statement. EU regulators could ultimately fine Meta up to 6% of its global revenue, which was $135 billion last year, and compel changes to its products.
The probes are part of a growing global push by governments to limit the use of services like Instagram and TikTok in order to safeguard children. Meta has long been accused of designing its products and recommendation algorithms to appeal to minors. In October, three dozen US states sued Meta for using “psychologically manipulative product features” to lure young users, in violation of consumer protection laws.
EU Probes Meta’s Compliance: Digital Services Act and Child Protection Measures
European Union authorities said they had spoken with their US counterparts about the probes launched on Thursday. According to the regulators, Meta may have violated the Digital Services Act, a law passed in 2022 that requires large internet services to more actively police their platforms for illegal material and to have systems in place to limit risks to minors. Meta bars people under the age of 13 from signing up for an account, but EU officials said they would examine the company’s age-verification measures as part of their investigation.
“We will now investigate in-depth the potential addictive and ‘rabbit hole’ effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems,” Thierry Breton, the EU’s internal market commissioner, said in a statement. “We are sparing no effort to protect our children.”
On Thursday, Meta said that its social media platforms were safe for children, pointing to features that let parents and children limit how much time they spend on Instagram or Facebook. Teenagers are also automatically placed in more restrictive content and recommendation settings, and advertisers are barred from targeting minors based on their activity in Meta’s apps.
“We have spent a decade developing more than 50 tools and policies designed to protect young people,” Meta said in a statement. “We want young people to have safe, age-appropriate experiences online. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
EU officials did not set a deadline for the inquiry. The formal investigation opened on Thursday, however, grants authorities broad powers to obtain information from Meta, including legally compelling documents, interviewing company executives, and inspecting corporate facilities. Instagram and Facebook will be investigated separately.
Several companies have come under scrutiny from EU regulators since the Digital Services Act took effect. Last month, TikTok suspended a version of its app in the European Union after regulators raised concerns about an “addictive” feature that let users earn rewards such as gift cards by watching videos, liking content, and following particular creators.
Meta also faces a separate investigation related to political advertising, while Elon Musk’s social network, X, is under investigation over content moderation, risk management, and advertising transparency.