The European Commission has begun closely monitoring the algorithms of YouTube, Snapchat, and TikTok to curb harmful content.
The Commission has requested information from the platforms about their algorithms and the role they play in amplifying systemic risks, including those affecting mental health and the protection of minors.
According to the Commission, the requests were made under the Digital Services Act (DSA), which aims to hold big tech companies accountable for failing to tackle illegal and harmful content on their platforms.
The Commission also sought to understand the measures the platforms have taken to mitigate the potential influence of their recommender systems on the spread of illegal content, including the promotion of illegal drugs and hate speech.
TikTok, in particular, has been asked to provide additional information on the measures it has adopted to prevent bad actors from manipulating the application and to reduce risks related to elections and civic discourse.
The Commission gave the tech firms a November 15 deadline to provide the requested information, after which it will decide on next steps, which could include fines.
The EU had previously opened non-compliance proceedings under the DSA against other big tech companies, including Meta’s Facebook and Instagram, AliExpress, and TikTok.
In a statement, a TikTok spokesperson said, “This morning, we received a request for information from the European Commission, which we will now review. We will cooperate with the Commission throughout the RFI process”.
According to the EU, "The move is part of its efforts to ensure that online platforms are held accountable for their role in shaping public discourse and protecting users from harm."