Brussels has opened an in-depth probe into Meta over concerns it is failing to do enough to protect children from becoming addicted to social media platforms such as Instagram.
The European Commission, the EU’s executive arm, announced on Thursday it would look into whether the Silicon Valley giant’s apps were reinforcing “rabbit hole” effects, where users get drawn ever deeper into online feeds and topics.
EU investigators will also look into whether Meta, which owns Facebook and Instagram, is complying with legal obligations to provide appropriate age-verification tools to prevent children from accessing inappropriate content.
The probe is the second into the company under the EU’s Digital Services Act. The landmark legislation is designed to police content online, with sweeping new rules on the protection of minors.
It also has mechanisms to force internet platforms to reveal how they are tackling misinformation and propaganda.
The DSA, which was approved in 2022, imposes new obligations on very large online platforms with more than 45mn monthly active users in the EU. If Meta is found to have broken the law, Brussels can impose fines of up to 6 per cent of a company’s global annual turnover.
Repeat offenders can even face bans in the single market as an extreme measure to enforce the rules.
Thierry Breton, commissioner for internal market, said the EU was “not convinced” that Meta “has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.
“We are sparing no effort to protect our children,” Breton added.
Meta did not immediately reply to a request for comment.
In the investigation, the commission said it would focus on whether Meta’s platforms were putting in place “appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors”. It added that it was placing special emphasis on default privacy settings for children.
Last month, the EU opened the first probe into Meta under the DSA over worries the social media giant is not properly curbing disinformation from Russia and other countries.
Brussels is especially concerned about whether the social media company’s platforms are properly moderating content from Russian sources that may seek to destabilise upcoming elections across Europe.
Meta defended its moderation practices and said it had appropriate systems in place to stop the spread of disinformation on its platforms.