On the Fallacy of Client-Side Scanning and Chatcontrol Measures
TL;DR: Client-side scanning (CSS) breaks end-to-end encryption, fails basic error-rate math at population scale, and creates powerful new abuse surfaces. That is why standards bodies, data protection authorities, and most security researchers oppose it, even as the EU's CSA Regulation ("Chatcontrol"), the proposal this post is about, continues to be debated.
What “client-side scanning” actually does
CSS moves content inspection onto user devices before messages are encrypted and sent. In practice, it means continuously hashing or classifying your photos, texts, voice notes, and files on-device, comparing them against databases or ML models, and auto-reporting "hits." That is surveillance by design on every endpoint, including those that use end-to-end encryption. Have you heard of "privacy by design"? It is actually required by the GDPR: see Article 25 ("Data protection by design and by default"). But if you are the lawmaker, that may not trouble you much. What was established once can, in similar fashion, be repealed.
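The mechanism can be sketched in a few lines. This is a deliberately simplified, hypothetical pipeline (the blocklist, the `reports` channel, and the function names are all invented for illustration); it only shows where in the message flow the inspection sits:

```python
import hashlib

# Minimal, hypothetical sketch of an on-device CSS pipeline; every name here
# is invented. Real deployments would use perceptual hashes or ML
# classifiers, not a plain cryptographic hash.
BLOCKLIST = {hashlib.sha256(b"known-bad-sample").hexdigest()}

reports = []  # stands in for an auto-report channel to a central authority

def scan_before_encrypt(attachment: bytes) -> bool:
    """Runs on the device *before* E2EE is applied; True means 'hit reported'."""
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in BLOCKLIST:
        reports.append(digest)  # the user's own endpoint informs on the user
        return True
    return False  # only non-matching content proceeds to encryption unscanned

print(scan_before_encrypt(b"known-bad-sample"))  # True
print(scan_before_encrypt(b"holiday-photo"))     # False
```

The key point the sketch makes visible: the scan happens before encryption, so the cryptography is intact while its purpose is defeated.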
Proponents argue you can “scan without breaking E2EE.” In reality, once the client is compelled to read messages for authorities, E2EE’s core guarantees are gone — the endpoint is deputized as an informant. This is why the Internet Architecture Board (IAB) explicitly warns against mandatory client-side scanning.
But whatever…
What technical experts actually say
- Standards bodies: IAB/IETF have long rejected standardizing backdoors and mass surveillance because they reduce the security of the Internet as a whole and don’t stop malicious actors. Their 2023 statement explicitly cautions against mandatory client-side scanning. (IETF Datatracker)
- Data protection authorities: The EU’s EDPB/EDPS joint opinion flags serious risks to fundamental rights and stresses the importance of strong E2EE; it also explains that “client-side scanning” can be circumvented and would lead to substantial, untargeted access to unencrypted content—i.e., generalised monitoring enabled by detection orders. (EDPB)
- Academic consensus: The multi-author study Bugs in Our Pockets concludes CSS introduces unmitigable risks, is vulnerable to abuse, and cannot be safely limited to one purpose. (arXiv)
- Industry & practitioners: After intense criticism, Apple abandoned its on-device CSAM-scanning plans in December 2022; the episode exposed real-world fragility (e.g., NeuralHash reverse-engineering and hash collisions). E2EE providers like Signal warn CSS (and rebrands like “upload moderation”) fundamentally undermines encryption. (WIRED)
- Scientific community: An open letter signed by more than 700 scientists opposes the proposal.
Why the “we’ll make the models accurate” argument fails
Mass screening runs into basic statistics. Even tools with seemingly impressive accuracy generate huge numbers of false positives when run against billions of private messages, and tools for detecting unknown CSAM or "grooming" are substantially less accurate than hash-matching of known content. The EU's own materials and independent analyses acknowledge this.
You can turn the thresholds up or down, but you cannot escape the trade-off: lower false positives ⇒ more abuse missed; higher recall ⇒ more innocents flagged. This is the same dilemma that sank mass polygraph screening and applies here, too.
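A back-of-the-envelope Bayes calculation makes the trade-off concrete. All numbers below are illustrative assumptions, not figures from any real deployment:

```python
# Base-rate arithmetic for population-scale scanning (all numbers assumed).
daily_messages = 10_000_000_000  # rough scale of a large E2EE platform
prevalence     = 1e-6            # assume 1 in a million messages is abusive
tpr            = 0.90            # classifier recall (true-positive rate)
fpr            = 0.001           # false-positive rate: "99.9% specific"

true_hits  = daily_messages * prevalence * tpr
false_hits = daily_messages * (1 - prevalence) * fpr
precision  = true_hits / (true_hits + false_hits)  # P(abuse | flagged)

print(f"{false_hits:,.0f} innocent messages flagged per day")
print(f"precision of a flag: {precision:.2%}")
```

Under these assumed numbers, roughly ten million innocent messages are flagged every day and fewer than one flag in a thousand is real abuse: the classifier's headline accuracy is irrelevant next to the base rate.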
Security regressions and abuse surfaces
- Model/database coercion & function creep: Once a scanning pipeline exists on every handset, adding new categories is a policy toggle away. History suggests scope only expands.
- Adversarial evasion: Offenders adapt. Perceptual hashes can be dodged or deliberately collided; classifiers can be reverse-engineered or adversarially fooled. Both failure modes already surfaced during Apple's NeuralHash trial balloon.
- Supply-chain risk: Mandating privileged scanning code on all devices increases attack surface for criminals and hostile states — a point repeatedly made by IAB/IETF and DPAs.
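The evasion/collision bullet is the perceptual-hash threshold trade-off in action. A toy model with invented 16-bit hashes (real systems use much longer, proprietary ones) exhibits both failure modes:

```python
# Toy model of perceptual-hash matching; hashes and thresholds are invented.
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two equal-length hash values."""
    return bin(a ^ b).count("1")

KNOWN_BAD = 0b1011_0110_0101_1001  # toy 16-bit perceptual hash of known content

def matches(candidate: int, threshold: int) -> bool:
    return hamming(candidate, KNOWN_BAD) <= threshold

edited_bad = KNOWN_BAD ^ 0b0000_0000_0000_0111  # re-crop/filter flips 3 bits
lookalike  = KNOWN_BAD ^ 0b0101_0001_0001_0001  # unrelated image, 5 bits away

print(matches(edited_bad, 2))  # False: strict threshold, evasion succeeds
print(matches(edited_bad, 6))  # True:  loose threshold catches the edit...
print(matches(lookalike, 6))   # True:  ...but now flags the innocent image
```

Tightening the threshold to spare the innocent lookalike reopens the evasion gap; loosening it to catch edited content flags lookalikes. No setting gives both at once.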
“But this is about child safety…”
It is — and child safety work deserves effective measures: targeted investigations, more resources for specialized units, cross-border cooperation, and faster takedown of known material without universal device surveillance. Even Parliament/EPRS briefings and civil-society analyses stress the legal and technical fragility of mass scanning, while the Council has struggled to reach a position that doesn’t break encryption.
Meanwhile, political pressure continues, media narratives fluctuate, and Member States remain split on mandatory scanning. Recent reporting shows growing opposition among key countries and broad expert pushback. The debate is unresolved — but the technical direction of travel is clear.
Bottom line
Client-side scanning is not a harmless add-on; it’s a structural change that degrades E2EE, scales poorly in the face of adversaries and error-rate math, and invites function creep. The dominant expert view — from standards bodies to DPAs to independent researchers — is to reject CSS mandates and invest in targeted, rights-preserving child-protection strategies instead.