The European Union has opened a formal investigation into Facebook and Instagram over child protection concerns.
The investigation is being conducted under the Digital Services Act (DSA), the bloc's online governance regime, which began applying to the largest platforms last August.
The opening of formal proceedings could have significant implications for Meta, the parent company of Facebook and Instagram, as it grants EU enforcers additional powers, such as carrying out office inspections and imposing interim measures.
Any confirmed violations of the DSA could result in fines of up to 6% of Meta's global annual turnover.
Commission officials suspect that Meta has not adequately assessed and mitigated risks affecting children on its platforms. They have highlighted concerns about the addictive design of both social networks, particularly the "rabbit hole effect," in which algorithmic recommendations continuously expose children to more of the same kind of content.
Concerns have also been raised about the effectiveness of Meta's age assurance methods, which may be easily bypassed by children. The EU is specifically investigating whether Meta has violated Articles 28, 34, and 35 of the DSA, which cover the protection of minors, including age verification measures, and the assessment and mitigation of systemic risks.
Meta has been contacted for a response to the investigation. The move follows a probe launched last month into TikTok over similar concerns about addictive design. The Commission has also opened two other DSA investigations into Facebook and Instagram related to election integrity.
This investigation marks a significant step in the EU's efforts to enforce stricter online regulations and protect children from potential harm on social media platforms.