LONDON — The European Union on Thursday demanded Meta and TikTok detail their efforts to curb illegal content and disinformation during the Israel-Hamas war, flexing the power of a new law that threatens billions in fines if tech giants fail to do enough to protect users.
The European Commission, the 27-nation bloc’s executive branch, formally requested that the social media companies provide information on how they’re complying with pioneering digital rules aimed at cleaning up online platforms.
The commission asked Meta and TikTok to explain the measures they have taken to reduce the risk of spreading and amplifying terrorist and violent content, hate speech and disinformation.
It’s the prelude to a possible crackdown under the new digital rules, which took effect in August and have made the EU a global leader in reining in Big Tech. The biggest platforms face extra obligations to stop a wide range of illegal content from flourishing or face the threat of fines of up to 6% of annual global revenue.
The new rules, known as the Digital Services Act, are being put to the test by the Israel-Hamas war. Photos and videos have flooded social media of the carnage alongside posts from users pushing false claims and misrepresenting videos from other events.
Brussels issued its first formal request under the DSA last week to Elon Musk’s social media platform X, formerly known as Twitter.
European Commissioner Thierry Breton, the bloc’s digital enforcer, had previously sent warning letters to the three platforms, as well as YouTube, highlighting the risks that the war poses.
“In our exchanges with the platforms, we have specifically asked them to prepare for the risk of live broadcasts of executions by Hamas — an imminent risk from which we must protect our citizens — and we are seeking assurances that the platforms are well prepared for such possibilities,” Breton said in a speech Wednesday.
Meta, which owns Facebook and Instagram, said it has a “well-established process for identifying and mitigating risks during a crisis while also protecting expression.”
After Hamas militants attacked Israeli communities, “we quickly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” the company said.
Meta said it has teams working around the clock to keep its platforms safe, take action on content that violates its policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation.
TikTok didn’t respond to a request for comment.
The companies have until Wednesday to respond to the commission’s questions related to their crisis response. They also face a second deadline of Nov. 8 for responses on protecting election integrity and, in TikTok’s case, child safety.
Depending on their responses, Brussels could decide to open formal proceedings against Meta or TikTok and impose fines for “incorrect, incomplete, or misleading information,” the commission said.