Online networks delete millions of pieces of content, according to EU database

Child pornography, hate speech or terrorist propaganda: Amazon, Facebook, YouTube, Instagram, Pinterest, TikTok and X (formerly Twitter) have deleted or restricted more than 960 million pieces of questionable content in the past six months, according to a database set up by the EU Commission.

(Symbolic image: Unsplash.com)

In April, the database contained more than 16 billion entries from 16 major platforms. In addition to large social networks, providers such as Zalando, Booking.com and various Google services also have to report deleted or restricted content.

The background to this is an EU law on digital services - the Digital Services Act (DSA) - which requires large online platforms and search engines to delete such content more quickly and make the reasons for doing so transparent. The data can be filtered and analyzed according to categories such as hate speech, violence or pornography.

Platforms report very differently

The reporting behavior of the individual platforms varies greatly. With more than 14 billion reports, Google Shopping accounts for by far the largest share: almost 94 percent of all posts reported since the end of September.

Since the reporting obligation was introduced, almost 508 million posts have been reported on the social media platform TikTok and more than 348 million of them deleted. That is almost 70 percent of all content reported on the platform.

Instagram reported around 19 million posts in the first six months of the year. Of these, just over 6.6 million, or just under 35 percent, were deleted completely. More than 148 million posts have been reported on Facebook so far. Of these, just under 54 million - just over a third - were deleted.

According to the database, only just over 832,000 items of content on the platform X were reported during the same period. As of the beginning of April, only 24 posts had been deleted.
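The deletion rates quoted above follow directly from the reported figures; a minimal sketch in Python that recomputes them (the numbers are taken from this article, rounded as stated):

```python
# Reported vs. deleted content per platform, figures as cited in the article.
figures = {
    "TikTok":    (508_000_000, 348_000_000),
    "Instagram": (19_000_000, 6_600_000),
    "Facebook":  (148_000_000, 54_000_000),
    "X":         (832_000, 24),
}

for platform, (reported, deleted) in figures.items():
    rate = deleted / reported * 100  # share of reported content that was deleted
    print(f"{platform}: {rate:.1f}% of reported content deleted")
```

Running this confirms the article's ratios: roughly 69 percent for TikTok, about 35 percent for Instagram, just over a third for Facebook, and a vanishingly small fraction for X.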

Platforms have leeway in reporting

For communications researcher Jakob Ohme, who conducts extensive research into disinformation, there are several reasons for the networks' differing reporting behavior: "Firstly, platforms differ in how actively they moderate content. Secondly, platforms take the reporting obligation more or less seriously."

Platform X, for example, is currently undergoing a transformation and is "not known for adhering to regulations that are not strictly enforced". TikTok, on the other hand, is trying to "present itself in a good light".

Digitalization expert Julia Kloiber also sees room for improvement: the data is based solely on self-reporting by the platforms. "Companies can claim a lot in their transparency reports. It is important that the information is also thoroughly reviewed."

EU Commission investigates compliance

In the event of suspected violations, the EU Commission can request access to data in order to check compliance with the transparency rules, the Brussels authority said in response to an inquiry. Fluctuations in the reported content are due to the platforms' different strategies and different content.

Investigations under the DSA are already underway to determine whether X and TikTok, among others, are complying with its rules and doing enough to prevent the spread of illegal content. However, no decision has yet been made, and thus no penalties have been imposed. Should the Commission ultimately conclude that providers are in breach of the DSA, it could impose fines of up to six percent of annual worldwide turnover. (SDA)
