“But you can’t make people listen. They have to come round in their own time, wondering what happened and why the world blew up around them. It can’t last.”
— Fahrenheit 451, Ray Bradbury
Safety First, Privacy Later—That’s a Good Subject
What if your own phone were a tattletale? Every message you have sent, every photo you have shared, every joke you have ever made: scanned, analysed, and stored by the one-eyed state. Not because you are a criminal. Because you might very well be. Or someone you know might be. This is not the fever dream of a conspiracy theorist. Nor is it the premise of a dystopian novel…yet. This is the reality toward which Europe is heading.
Officially titled the “Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse”, and often referred to as the Child Sexual Abuse Regulation (CSAR), the CSA Regulation or the CSAM proposal, the measure is a European Union regulation proposed by the European Commissioner for Home Affairs, Ylva Johansson, in 2022. Its goal is to combat online child sexual abuse by introducing a framework that would legally oblige digital platforms across the EU to detect and report Child Sexual Abuse Material (CSAM).
The law is marketed as a noble crusade to protect children online—a goal no sane person could ever oppose. Yet beneath its benevolent language hides a measure effectively legalising pervasive surveillance of private communications.
The EU calls it CSAR, but it’s really a smokescreen for digital hypocrisy.
Protecting the Innocent—or Perfecting Control?
In an era of rampant child exploitation, the EU argues, new detection tools are essential. It’s an easy argument to sell. After all, who would say no to saving children? But as history has shown, the most dangerous policies often arrive draped in moral virtue. The road to digital authoritarianism is rarely paved with malicious intent; it is built on good intentions weaponised by those who wield power.
The question is not whether Europe should protect its children. The question is whether it should sacrifice its entire digital autonomy to do so. The whole “privacy versus child protection” debate is a false dilemma, a convenient fiction that pits two sides against each other when, in truth, they are playing for the same team. Some like to claim that “there is no other way” to protect children online. But there is, and it doesn’t require turning Europe into a surveillance state. The first line of defence isn’t an algorithm. Parents must engage with their children’s digital lives and teach them about risks, manipulation, and the predators that lurk behind anonymous profiles. Outsourcing moral vigilance to machines doesn’t make children safer; it only absolves adults of responsibility.
Inside “The Grid”
For starters, CSAR should not be confused with the ‘Directive on combating the sexual abuse and sexual exploitation of children and child sexual abuse material and replacing Council Framework Decision 2004/68/JHA’. Although both measures share similar names and were discussed in parallel in 2024 and 2025, their aims diverge. The directive updates national criminal-justice and education frameworks on child sexual abuse, whereas the regulation introduces far broader, surveillance-driven systems for detecting and reporting content.
Initially, the concept behind the proposal, popularly dubbed “Chat Control”, was to impose scanning obligations directly on large technology companies. However, its scope soon expanded into a far more intrusive framework, one that would encompass private messaging services that rely on end-to-end encryption, such as WhatsApp and Apple’s iMessage.
One detail matters: the regulation has not yet passed. It’s still a proposal, still negotiable, still vulnerable to outside pressure. It remains under discussion in the Council of the EU, where, despite repeated efforts, member states have not reached a consensus. Since mid-2025, Denmark has held the rotating Council Presidency, the seventh member state to attempt to forge an agreement. On 5 November 2025, Denmark introduced revisions that would make the detection of sensitive material in private communications voluntary, rather than imposing blanket surveillance on all users. But don’t be fooled by the language of compromise. “Voluntary” is just Brussels’ new word for “eventually mandatory”. And Denmark still has to strike a deal in the Council on this watered-down version.
No algorithm, no matter how advanced, can fully distinguish context from crime. False positives will be inevitable. Parents could find their names stained, their lives upended, all for a photo that once meant love, not guilt. In 2022, the Irish Council for Civil Liberties (ICCL) revealed that Ireland’s national police and security service, An Garda Síochána, had determined that over 11% of the reported cases (471 referrals) contained no CSAM. The individuals concerned were entirely innocent; the flagged content consisted of ordinary images, such as children playing on a beach. Yet, despite clearing these people of wrongdoing, An Garda Síochána has failed to erase their personal data. The ICCL maintains that there is no legal basis for this retention: under Section 71 of the Irish Data Protection Act 2018, personal data collected for the purposes of criminal investigation may only be stored for as long as necessary to fulfil that purpose.
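To see why even a “highly accurate” scanner guarantees innocent casualties, consider a back-of-the-envelope sketch. The figures below, the daily message volume and the error rate, are illustrative assumptions, not numbers from the proposal or from Garda data; the point is only that, at the scale of everyone’s private communications, a tiny error rate becomes a flood of false flags.

```python
# Rough illustration of false positives at EU scale.
# Both inputs are assumptions for the sake of argument, not official figures.

messages_scanned_per_day = 5_000_000_000  # assumed volume of private messages sent in the EU
false_positive_rate = 0.001               # assumed: the scanner errs on 1 in 1,000 messages

false_flags_per_day = messages_scanned_per_day * false_positive_rate
print(f"Innocent messages flagged per day: {false_flags_per_day:,.0f}")
# Even a "99.9% accurate" scanner would flag roughly 5,000,000 innocent messages every day.
```

Under these assumptions, millions of ordinary photos and jokes would land in front of reviewers and police daily; the Irish figures suggest the problem is not hypothetical.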
Europe’s Fall from Grace
It’s a tragicomedy: a policy from the same EU that built the General Data Protection Regulation, the gold standard of data privacy. The same institution that enshrined privacy in Articles 7 and 8 of the EU Charter of Fundamental Rights now proposes legislation that would institutionalise suspicion.
The European Digital Rights (EDRi) group has already sounded the alarm, asserting that the public attention surrounding the proposal is entirely warranted. In two open letters publicised through EDRi’s “Stop Scanning Me” campaign, civil society groups, security experts and academics have called on the EU to withdraw CSAR. The campaign has also put forward a list of structural solutions that address the root causes of the crime itself: education, awareness and prevention; empowerment of survivors; increased investment in social services and hotlines; reform of police and institutions; enforcement of existing rules and directives; and, most importantly, unity and collaboration.
If adopted, the regulation will create a legal contradiction so stark that the EU will have its own “Janus face” moment: a guardian and a violator of privacy at once. In the long run, it would erode the credibility of European human-rights leadership, the bloc’s moral capital morphing into everything it supposedly hates and so openly condemns.
The Truman Show Society
In the end, the most dangerous surveillance is not the one we resist, but the one we accept.
The one we are told is for our own safety. The one that whispers, persuasively, that if we have nothing to hide, we have nothing to fear. Bradbury’s firemen lit fires to erase thought. Ours write code. The EU doesn’t need to burn books; it only needs to read your messages. The future won’t smell of smoke; it will hum with the sound of an algorithm that knows you better than you know yourself. The difference is that this time, the fire is digital, and we will be the ones feeding it.
Written by Valerie Schicke, Edited by Konstantin Philipp
Photo Credit: TheDigitalWay (Uploaded August 16, 2016), on Pixabay