In a recent development, the European Commission has set its sights on social media juggernauts Meta and TikTok, demanding they step up their efforts to tackle the spread of terrorist, violent, and hateful content on their platforms.
This move comes just a week after a similar directive was issued to Elon Musk’s X, as the European Union aims to combat the proliferation of misinformation following Hamas’ recent attack on Israel.
A Growing Issue of Disinformation
The European Union’s executive body, the European Commission, wasted no time in making its intentions clear.
Recognizing the surge in disinformation after the recent Hamas attack, it has officially requested that Meta and TikTok provide details of the measures they have taken to combat the proliferation of terrorist, violent, and hate speech content on their platforms.
Potential Investigations Loom
Should the European Commission find the responses from these social media giants unsatisfactory, it holds the power to initiate investigations into their operations.
The Commission is not taking these matters lightly and is determined to ensure that online platforms adhere to new rules set forth in the Digital Services Act (DSA), which came into force recently.
New Rules Under the Digital Services Act
Under the recently enacted Digital Services Act (DSA), major online platforms are now obliged to be more proactive in removing illegal and harmful content from their websites.
Failure to do so could result in hefty fines, amounting to as much as 6% of their global turnover.
This stringent approach aims to safeguard the digital landscape and protect users from harmful content, whether it be terrorist propaganda or hate speech.
Deadline for Compliance
The European Commission has provided specific deadlines for both Meta and TikTok to furnish the requested information. Meta has been given until the 25th of October, 2023, to respond to inquiries regarding their crisis response measures.
They must also provide a comprehensive account of their actions aimed at protecting the integrity of elections by the 8th of November, 2023.
TikTok’s Compliance Schedule
TikTok is expected to follow the same schedule as Meta. It has until the 25th of October, 2023, to address queries related to its crisis response strategies.
In addition, it must present its measures for safeguarding the integrity of elections and protecting minors online by the 8th of November, 2023.
As the European Commission intensifies its efforts to combat the spread of harmful and illegal content online, Meta and TikTok find themselves in the spotlight.
The coming weeks will reveal whether these social media giants can sufficiently address the concerns raised by the European Commission and adhere to the strict regulations outlined in the Digital Services Act.
For now, the focus is on ensuring the digital realm remains a safe and reliable space for users across Europe.