The Digital Services Act’s Top Enforcement Priority: Tackling CSAM First

Introduction: An Uncontroversial Crusade

The EU’s Digital Services Act (DSA) is a sweeping new rulebook aiming to civilize the online Wild West. It tasks platforms with everything from reining in disinformation to removing illegal content. But with so many ills plaguing the internet, where should enforcers begin? One target rises to the top by moral consensus and urgent necessity: child sexual abuse material (CSAM). Unlike murkier content disputes, CSAM is unequivocally illegal and universally reviled – even the staunchest free speech defenders draw the line at protecting child abuse imagery. If European regulators want a quick win that delivers tangible harm reduction without kicking a political hornet’s nest, cracking down on CSAM is the obvious low-hanging fruit.

DSA Obligations: Illegal Content in the Crosshairs

Under the DSA, online intermediaries have clear duties to act against illegal content. All platforms must remove or disable access to illegal material “expeditiously” once they become aware of it, and failure to act promptly on content like CSAM can invite fines as high as 6% of a company’s global annual turnover. The law also empowers national authorities to issue orders to act against illegal content across borders – a French regulator can demand removal of CSAM from a platform based in Ireland, for example, and the platform is legally obliged to comply.

To facilitate swift action, the DSA requires user-friendly “notice and action” systems (Article 16) so that users or trusted bodies can flag illegal posts for removal. It also elevates “trusted flaggers” – typically expert organizations such as child protection hotlines – whose reports platforms must handle with priority. This means that when INHOPE member hotlines or similar trusted flaggers report CSAM, companies are bound to react faster than they would to an ordinary user flag. In short, the DSA builds a framework in which a CSAM video spotted by a hotline can be swiftly notified and taken down, with the full backing of EU law.

Crucially, the DSA also breaks new ground by requiring proactive cooperation with law enforcement. Article 18 obliges hosting services to inform police when they become aware of information indicating a serious crime threatening someone’s life or safety. It’s hard to imagine a clearer case for this than CSAM – identifying a user sharing child abuse material obviously “gives rise to suspicions” of a grave crime. In practice, this means a platform that finds an apparent child abuse video must not only erase it but should also alert the authorities to help locate and prosecute the perpetrator. The DSA thus pushes platforms beyond mere content removal towards actively aiding criminal justice in egregious cases.

(For the legal eagles: the DSA’s tough stance on CSAM doesn’t exist in a vacuum. It complements earlier laws like the 2011 EU Directive on Child Sexual Abuse, which made distribution of child pornography a crime in all Member States and even called on countries to remove or block CSAM websites. In essence, EU law has long outlawed CSAM; the DSA now compels Big Tech to police it — or be policed.)

CSAM Online: Prevalent and Persistent on Big Platforms

If CSAM were a rare, vanishing scourge, regulators might be tempted to focus elsewhere. Unfortunately, child sexual abuse material remains shockingly prevalent online, especially on the largest platforms. Year after year, the numbers have climbed to sobering heights. Reports of online child sexual exploitation worldwide leapt from around 1 million in 2014 to 29.3 million in 2021, a record high that included nearly 85 million individual images and videos of child abuse. (Yes, you read that right – tens of millions of horrific files in a single year.) The pandemic period saw an especially grim spike: 2021’s total was 35% higher than the year before. And the trajectory hasn’t slowed. By 2023, global reports to authorities surged past 36 million, prompting Europol and the European Commission to voice alarm at the “growing phenomenon”.

What’s even more striking is where this content is being found. The overwhelming majority of CSAM reports trace back to a handful of major tech platforms – the same household-name social networks many of us use daily. In 2021, for example, Facebook alone was responsible for 22 million reports of child abuse imagery, roughly three-quarters of all the worldwide reports that year. Its sister platform Instagram generated another 3.3 million reports, and even WhatsApp (despite encrypted messages) managed 1.3 million. Other big players like Google and Snapchat contributed hundreds of thousands of reports each. These jaw-dropping figures reflect both the massive scale of these platforms and, one hopes, their detection efforts – but they also undercut any notion that CSAM online is a tiny fringe issue. On the contrary, child abuse content has proliferated on mainstream services, piggybacking on their global reach.

Europe has a particular responsibility in this fight. A huge proportion of CSAM is hosted on European servers or uploaded by users in Europe, making it very much an EU problem. One study found that in 2022, over half of the known CSAM hosted in Europe was traced to servers in the Netherlands (a long-standing hub for image hosting), with significant amounts also in Slovakia and elsewhere. Moreover, the U.S. clearinghouse for such reports, the National Center for Missing & Exploited Children (NCMEC), says that over 90% of the CSAM uploads it sees come from users outside the United States. In other words, the rest of the world – and Europe in particular – is heavily implicated in the spread of this material. The internet knows no borders, but EU regulators enforcing the DSA have a strong mandate to act locally against a globally shared threat.

Despite tech companies’ efforts, disturbing content still slips through, and too often it stays available far too long. EU-based hotlines coordinated by INHOPE still process hundreds of thousands of CSAM reports and issue takedown notices. They note that about 67% of illegal content URLs get removed within three days of a hotline notice – meaning one-third linger longer. Every hour that such content stays online is an hour of continued trauma for abuse survivors (who know images of their suffering are circulating) and an opportunity for predators to consume or share it. The persistence of CSAM online underscores that current measures haven’t eradicated the problem. This is precisely why the DSA’s beefed-up enforcement tools are so vital – and why using them assertively on CSAM should be priority number one.

Low-Hanging Fruit: Why CSAM Enforcement is the Smart Start

Given this backdrop, zeroing in on CSAM enforcement is both morally right and strategically smart for the DSA’s new regulators. Consider the advantages:

  1. Universal Agreement: No one will defend the “right” to post child abuse images. Unlike, say, hate speech or extremist propaganda, there’s no opposing camp arguing that perhaps CSAM should be permitted under free expression. Acting on this issue doesn’t risk ideological controversy – it’s one of the rare internet battles where all sides (governments, NGOs, industry, left, right, and center) agree on the goal. An EU crackdown here won’t spawn culture-war headlines or partisan pushback; it will be cheered on unanimously. (It’s hard to imagine civil liberties groups penning op-eds titled “Let the Child Abuse Videos Stay Online.”)

  2. Tangible Harm Reduction: Every successful removal of CSAM, every network of abusers disrupted, has a direct and tangible benefit in the real world. We’re talking about rescuing actual children from ongoing abuse and stopping the re-victimization of survivors whose images would otherwise be repeatedly viewed and traded. The impact is concrete: fewer vile images in circulation means fewer opportunities for abusers to normalize or profit from child exploitation, and perhaps even fewer victims in the long run. By prioritizing CSAM, DSA enforcers can immediately start making the internet safer for children, scoring a clear humanitarian win that can be measured in arrests made and victims protected.

  3. Proving the DSA’s Teeth: The DSA is shiny and new – and untested. Both supporters and skeptics are watching closely to see if it’s a paper tiger or a real game-changer. Early enforcement against an obvious evil like CSAM is a chance to demonstrate the law’s muscle without ambiguity. If the EU can show that, thanks to the DSA, major platforms are now catching and purging illegal child abuse material faster than ever (and facing consequences if they don’t), it sends a powerful message: the DSA is working. This builds credibility for the regulators and makes it easier to tackle harder issues next. It’s the equivalent of flexing on the easiest opponent first – not to be complacent, but to build momentum and public confidence. As a bonus, such enforcement helps blunt the narrative (popular in some Silicon Valley circles) that the DSA is just about stifling legitimate speech. Showing success in wiping out unequivocally illegal content is the best advertisement for Europe’s new digital rulebook.

Avoiding the Quagmires: CSAM vs. Other DSA Headaches

Focusing on CSAM first is also a savvy way to avoid getting bogged down in the DSA’s more contentious areas right out of the gate. The Act does cover other problems – hate speech, terrorist propaganda, misinformation, and more – but some of those are proving fertile ground for disagreement, notably as the Trump administration recasts DSA enforcement as a “free speech” absolutism issue, overlooking both its own domestic crackdown on expression and the inconvenient fact that US laws do not apply in Europe.

CSAM, by contrast, is refreshingly straightforward to act on. The line of illegality is bright and clear: a naked child in a sexual pose is not political speech – it’s a crime scene. Regulators don’t need to fear cries of “censorship” when they swoop in on a platform failing to curb child abuse content. In fact, a platform manager would be extremely ill-advised to protest a DSA penalty by saying “we were protecting users’ freedom to trade toddler rape videos” – that’s not exactly a winning argument anywhere. By starting with CSAM enforcement, the EU’s Digital Services Coordinators and the European Commission (which oversees very large platforms) can project unity and moral authority. They sidestep the messy debates and show that the DSA isn’t about Brussels imposing some ideology – it’s about common-sense law enforcement against the internet’s worst crimes. This strategic sequencing doesn’t mean other issues get ignored; it just means the first enforcement actions set a solid, unassailable precedent.

Conclusion: Set the Bar by Clearing the Lowest Bar

The fight against online child sexual abuse material is often described as trying to empty the ocean with a bucket. The internet’s vastness means we’ll never catch every illicit image – but the DSA gives Europe a much bigger bucket and stronger arms to wield it. By making CSAM enforcement the top priority, EU regulators can achieve immediate wins for children’s safety, earn public applause, and build momentum for tackling the trickier terrain of online harm next. It’s a rare policy decision that is at once morally self-evident, politically safe, and pragmatically effective.

The Digital Services Act was born out of the conviction that what is illegal offline should be illegal (and enforced) online. Few things are as unambiguously illegal offline as child abuse. The same should hold on our platforms – and now the law says so unequivocally. Going after CSAM with full force will show that this landmark regulation has a soul and a spine, and prove that Europe is serious about enforcing its digital rules where it matters most: protecting the most vulnerable among us. And if the EU can’t succeed in purging the one type of content that everyone agrees has to go, what hope would there be for everything else? Fortunately, by most accounts, this is one battle regulators can win. In the grand narrative of the DSA, cleaning up CSAM online is the obvious opening chapter – and one we’ll all be better off seeing written.
