EU investigates Meta over Facebook and Instagram child safety
European Union regulators have opened a formal investigation into Meta over potential breaches of online content rules relating to child safety on its Facebook and Instagram platforms.
The European Commission said on Thursday it was concerned the algorithmic systems used by the popular social media platforms to recommend videos and posts could “exploit the weaknesses and inexperience” of children and stimulate “addictive behaviour”.
Its investigators will also examine whether these systems reinforce the so-called "rabbit hole" effect that leads users toward increasingly disturbing content.
“In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta,” the bloc’s executive arm said in a statement.
The investigation is being conducted under the Digital Services Act (DSA), a law that compels the world's largest tech firms to do more to protect European users online.
The DSA has strict rules to protect children and ensure their privacy and security online.
Thierry Breton, the EU’s internal market commissioner, said on X the regulators were “not convinced that Meta has done enough to comply with the DSA obligations – to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.