DSA reversed almost 50 million moderation decisions in two years

In its first two years of application, the Digital Services Act has enabled EU users to get almost 50 million moderation decisions reversed by platforms. The reversal total comes from 165 million appeals filed since the DSA began to apply, with roughly 30 percent leading to restored content or accounts.
Before the DSA took effect, platform moderation ran largely on platforms' own terms, with limited transparency and few enforceable rights for users to challenge decisions. Big platforms such as Facebook, Instagram, and TikTok handled appeals opaquely and prioritised speed and scale over review accuracy. The DSA changed the rules, introducing user challenge rights, broader researcher access, stricter marketplace obligations, and a ban on targeted ads to minors that has been in force since 2024.
Reversals at scale
The headline numbers matter for operations. Platforms faced 165 million user challenges and reversed almost 50 million moderation decisions in two years, a reversal rate near 30 percent. In the first half of 2025, platforms reported that 99 percent of enforcement actions were based on their own terms of service rather than on the DSA's mandatory notice-and-action procedures. Platforms are still setting the substantive rules, but they now must document and justify enforcement in a way that invites successful challenges.
For engineering and ops teams this means appeal traffic is no longer a rare exception. It is a predictable volume that needs dedicated tooling, audit logs, and retention policies to support reversals. Platform compliance teams must now build in auditability, human review capacity, and clear documentation of why content was removed in the first place. That will raise moderation costs for everyone from global platforms to EU-based marketplaces.
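As a minimal sketch of what "auditability" can mean in practice, the snippet below records each moderation decision as an append-only JSON line, capturing the legal basis, the policy clause cited, and who (or what) decided. All field names and values here are illustrative assumptions, not taken from any DSA text or platform API.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical audit record for one moderation decision.
# Field names are illustrative, not from any regulation or platform schema.
@dataclass(frozen=True)
class ModerationAuditRecord:
    content_id: str
    action: str          # e.g. "remove", "restrict", "demote"
    legal_basis: str     # e.g. "terms_of_service" or "dsa_notice_and_action"
    rule_cited: str      # the specific policy clause shown to the user
    decided_by: str      # "automated" or a reviewer identifier
    decided_at: str      # ISO 8601 timestamp, UTC

def append_audit_record(log_path: str, record: ModerationAuditRecord) -> None:
    """Append one JSON line per decision so a later appeal can replay the rationale."""
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

# Example record for a single automated removal.
record = ModerationAuditRecord(
    content_id="post-123",
    action="remove",
    legal_basis="terms_of_service",
    rule_cited="spam-policy-4.2",
    decided_by="automated",
    decided_at=datetime.now(timezone.utc).isoformat(),
)
```

An append-only, line-per-decision log keeps the original rationale immutable, which matters when a reversal later contradicts it.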
Out of court bodies, researcher access, and incentives
The DSA also routed disputes outside courts more often. Over 1,800 disputes were reviewed by out-of-court bodies in the first half of 2025, and 52 percent of closed cases resulted in overturning the platform's decision. Wider researcher access and marketplace obligations increased external scrutiny and provided data for civil society to challenge enforcement patterns.
Those mechanisms change incentives. Platforms are now more exposed to reputational and regulatory risk if their automated removal tools produce false positives at scale. Vendors selling moderation automation will need to offer clearer audit trails. Companies that rely on platform moderation for content policy enforcement must assume a non-trivial fraction of takedowns will be reversed and plan for reinstatements, user notifications, and potential legal follow-up.
Why this matters
For publishers and social teams that publish to Facebook or TikTok, the DSA means appeals will routinely restore removed items, so workflows must include rapid reinstatement and dispute tracking. For platform engineers the implication is technical: preserve detailed logs, add human review for edge cases, and instrument appeals as a core feature. Any organisation relying solely on automated takedowns should treat the compliance gap as a legal and operational liability and implement an appeals pipeline with audit logging by the end of 2026.
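One way to instrument appeals as a core feature is to model a takedown's lifecycle as an explicit state machine, so every reinstatement or upheld decision is a recorded transition rather than an ad-hoc database edit. The states and transitions below are illustrative assumptions, not a prescribed DSA workflow.

```python
# Hypothetical appeal lifecycle. State names and allowed transitions are
# illustrative; "escalated" stands in for referral to an out-of-court body.
ALLOWED_TRANSITIONS = {
    "removed":   {"appealed"},
    "appealed":  {"reinstated", "upheld", "escalated"},
    "escalated": {"reinstated", "upheld"},
    # "reinstated" and "upheld" are terminal states.
}

def transition(state: str, new_state: str) -> str:
    """Validate and apply one step of the appeal lifecycle."""
    if new_state not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"invalid transition: {state} -> {new_state}")
    return new_state

# Walk one item through appeal and reinstatement, keeping a dispute trail.
history = ["removed"]
for step in ("appealed", "reinstated"):
    history.append(transition(history[-1], step))
```

Because invalid transitions raise instead of silently succeeding, the history list doubles as the dispute-tracking record the workflow needs.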
Sources
European Commission - Digital news on the Digital Services Act
The Verge coverage of platform accountability and internal priorities
EFF Deeplinks on expanded surveillance and platform data risks