TL;DR

Problem with Chat Control Revival:
- Introduces backdoors that undermine end-to-end encryption and can be abused for surveillance
- Scanning tools generate false positives, leading to over-censorship and erosion of free expression
- Varying rules across 27 member states create legal and technical complexity
- Offenders will move to unregulated channels, making illicit content harder to trace
- It’s a really bad idea

PrivID’s Better Approach:
- Zero-knowledge scanning: client-side checks produce proofs, not raw messages
- Selective disclosure: only flagged content triggers minimal metadata sharing under strict legal review
- Auditability: every scan and access request is recorded in tamper-proof logs
- No performance impact: millisecond-scale checks, easy integration with existing apps
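The "proofs, not raw messages" and "tamper-proof logs" ideas above can be sketched in a few lines. This is a simplified illustration, not PrivID's actual implementation: the `KNOWN_HASHES` list, the HMAC commitment, and the hash-chained audit log are all assumptions chosen for clarity, and real zero-knowledge scanning would use a cryptographic proof system rather than a bare HMAC.

```python
import hashlib
import hmac
import json
from typing import Optional

# Hypothetical hash list of known illegal material (illustration only;
# real deployments use vetted, perceptual-hash databases).
KNOWN_HASHES = {hashlib.sha256(b"known-bad-sample").hexdigest()}

AUDIT_KEY = b"audit-hmac-key"  # placeholder key for the commitment

def client_side_check(message: bytes) -> Optional[dict]:
    """Runs on the device. Returns a minimal flag record on a match,
    or None -- the message body itself never leaves the client."""
    digest = hashlib.sha256(message).hexdigest()
    if digest not in KNOWN_HASHES:
        return None
    # Only a commitment to the digest is disclosed, never the content.
    return {"commitment": hmac.new(AUDIT_KEY, digest.encode(), hashlib.sha256).hexdigest()}

class AuditLog:
    """Hash-chained append-only log: altering any entry breaks every later link."""
    def __init__(self) -> None:
        self.entries: list[tuple[str, str]] = []
        self._head = "0" * 64

    def append(self, event: dict) -> None:
        payload = json.dumps(event, sort_keys=True)
        self._head = hashlib.sha256((self._head + payload).encode()).hexdigest()
        self.entries.append((payload, self._head))

    def verify(self) -> bool:
        head = "0" * 64
        for payload, stored in self.entries:
            head = hashlib.sha256((head + payload).encode()).hexdigest()
            if head != stored:
                return False
        return True

log = AuditLog()
for msg in [b"hello", b"known-bad-sample"]:
    flag = client_side_check(msg)
    log.append({"flagged": flag is not None})

print(log.verify())  # True -- and rewriting any past entry makes this False
```

The point of the chain is that every record commits to all records before it, so a scan or access request cannot be silently deleted or edited after the fact.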
Denmark’s push to bring back the “Chat Control” bill would force messaging apps to scan end-to-end encrypted content for child sexual abuse material (CSAM) from October 2025. On paper it is framed as child protection, the argument invariably used to justify such measures. The reality? It creates technical and legal problems that will undermine encryption, weaken user trust and do little to stop determined offenders.
1. Encryption Is Only As Strong As Its Weakest Link
To comply, providers would have to introduce either a backdoor or a client-side filter. Either approach breaks the mathematical guarantees that make end-to-end encryption secure. Once a backdoor exists, anyone can figure out how to exploit it. History shows that any additional access point gets used not just for its intended purpose but also for surveillance of political dissidents, journalists and minorities.
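Why a client-side filter is an access point despite "end-to-end" encryption is easiest to see in the send path: the filter necessarily runs before encryption, so it sees plaintext no matter how strong the cipher is. A minimal sketch, where the XOR "cipher" and the hook names are placeholders rather than any real app's code:

```python
import hashlib

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Stand-in for a real E2EE cipher (toy XOR keystream, illustration only).
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(plaintext))

intercepted = []

def scan_hook(plaintext: bytes) -> None:
    # A mandated client-side filter runs here, BEFORE encryption.
    # Whoever controls this hook (vendor, government, or an attacker who
    # compromises it) controls a pre-encryption access point.
    intercepted.append(plaintext)

def send(plaintext: bytes, key: bytes) -> bytes:
    scan_hook(plaintext)              # the new weakest link
    return encrypt(plaintext, key)    # the cipher protects only what follows

ciphertext = send(b"private message", b"shared-key")
print(intercepted)  # the hook saw the plaintext; only the wire carries ciphertext
```

The cipher here is irrelevant to the argument: swap in a real one and the hook still sees every message in the clear.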
2. False Positives and Overreach
No scanning system is perfect; Meta’s moderation systems demonstrate this at scale. Image-hash databases and machine-learning classifiers will flag benign content as CSAM. Inaccurate blocks and take-down requests will frustrate users and platform operators, and companies will over-censor rather than risk legal liability. This will practically kill free expression and undermine the very safety goals the policy claims to pursue.
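The scale of the false-positive problem follows from base rates alone. With illustrative numbers (assumed for the sketch, not measured rates): even a scanner that is 99% accurate in both directions, applied to traffic where only 1 in 100,000 items is actually illegal, produces flags that are almost entirely wrong.

```python
# Base-rate arithmetic: why an accurate scanner still flags mostly innocent content.
# All numbers below are illustrative assumptions, not measured rates.
prevalence = 1e-5           # 1 in 100,000 scanned items is actually illegal
sensitivity = 0.99          # scanner catches 99% of genuinely illegal items
false_positive_rate = 0.01  # scanner wrongly flags 1% of benign items

true_flags = sensitivity * prevalence
false_flags = false_positive_rate * (1 - prevalence)
precision = true_flags / (true_flags + false_flags)

print(f"Share of flags that are correct: {precision:.2%}")  # ≈ 0.10%
```

Under these assumptions, roughly 999 out of every 1,000 flags point at innocent users, and every one of those flags is a human being whose private message gets reviewed.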
3. Cross-Border Enforcement Is Complex
The EU consists of 27 member states, each with its own legal traditions and procedures. A unified scanning mandate will require common standards for what content counts as illegal, how to handle appeals, and which authority can request access. The resulting operational complexity will slow down all requests, burden small providers and create inconsistent enforcement across countries.
4. Criminals Will Adapt
Offenders already use private networks, darknet forums and peer-to-peer tools that lie outside the scope of this regulation. Scanning mainstream apps will push harmful material further underground, making it harder to detect and more dangerous to investigate.
5. It’s Just a Really Bad Idea
Period.