A divisive project
In 2025, the draft European regulation known as ChatControl (or CSAR, the Child Sexual Abuse Regulation) is sparking debate and concern. Presented as a response to online child abuse, it aims to force messaging and collaboration platforms to implement automatic monitoring of private communications.
Protecting the most vulnerable is an unquestionable priority. But should this mean accepting the generalised surveillance of all digital exchanges?
What is ChatControl?
The ChatControl regulation would require digital service providers to:
- automatically scan all messages, images and files exchanged,
- detect content deemed illicit,
- alert the competent authorities in case of suspicion.
This would affect all digital channels:
- instant messaging apps (WhatsApp, Signal, Telegram…),
- emails,
- collaborative platforms,
- cloud services.
⚠️ In practice, this means the systematic filtering of private communications.
The latest developments: where does the project stand in September 2025?
On 12 September 2025, EU Member States are due to finalise their positions. The Danish presidency of the Council is pushing for a vote on 14 October 2025.
- Several countries (such as Germany) have shifted from outright opposition to a more undecided stance, creating uncertainty.
- The new version of the text reintroduces client-side scanning: analysing messages on the user's device before they are encrypted, which undermines the integrity of end-to-end encryption.
- More than 500 cryptographers and researchers have signed open letters denouncing this as a major risk to cybersecurity.
- Voices warn that this regulation could drive users towards decentralised solutions if trust in traditional tools is broken.
Explainer – Client-side scanning
A system that analyses your messages directly on your device, before they are encrypted.
The problem: it weakens the promise of end-to-end encryption and creates a new point of vulnerability.
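To make this concrete, here is a deliberately simplified sketch in Python. It is an illustration only: the functions scan_for_flagged_content, encrypt_for_recipient and report_to_authority are hypothetical placeholders, not the API of any real messaging app. The point is structural: with end-to-end encryption alone, only ciphertext leaves the device; with client-side scanning, a detection step reads the plaintext before encryption, so confidentiality also depends on that extra component.

```python
# Simplified, hypothetical illustration of client-side scanning.
# None of these functions correspond to a real messaging API.

def encrypt_for_recipient(plaintext: str, recipient_key: str) -> str:
    """Stand-in for end-to-end encryption (e.g. a Signal-style protocol)."""
    return f"<ciphertext of {len(plaintext)} chars for {recipient_key}>"

def scan_for_flagged_content(plaintext: str) -> bool:
    """Stand-in for a mandated detection step (hash matching, classifier...)."""
    return "flagged-term" in plaintext  # placeholder heuristic

def report_to_authority(plaintext: str) -> None:
    """Stand-in for forwarding a flagged message to a reporting centre."""
    print("Report sent:", plaintext)

def send_e2ee(plaintext: str, recipient_key: str) -> str:
    # End-to-end encryption as it works today:
    # the plaintext never leaves the device unencrypted.
    return encrypt_for_recipient(plaintext, recipient_key)

def send_with_client_side_scanning(plaintext: str, recipient_key: str) -> str:
    # The same flow with client-side scanning added:
    # the detection step reads the plaintext *before* encryption,
    # so the confidentiality guarantee now also depends on this component.
    if scan_for_flagged_content(plaintext):
        report_to_authority(plaintext)
    return encrypt_for_recipient(plaintext, recipient_key)
```

Whatever detection technology is plugged into scan_for_flagged_content, it has to run on unencrypted content: that is exactly the access that end-to-end encryption was designed to rule out.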
👉 The debate has therefore reached a critical point: either Europe confirms its intention to impose generalised scanning, or it returns to an approach more respectful of fundamental rights.
What are the risks for citizens?
Did you know?
In 2024, 19% of cyberattacks in the EU directly targeted citizens’ and administrations’ personal data.
- Invasion of privacy: Every message, photo or document could be scanned, including personal, family or professional exchanges.
- False security: Organised criminals will find ways around such systems. Ordinary citizens will mainly see their privacy weakened.
- Risk of political abuse: Such a tool could be misused tomorrow to monitor political opponents, journalists or whistle-blowers.
- A dangerous precedent: the normalisation of mass surveillance.
Key figures that speak for themselves
To understand why ChatControl is causing so much concern, one needs to look at the reality of threats and practices in Europe:
- In 2024, the CNIL was notified of 5,629 personal data breaches, a 20% increase compared to 2023. A clear sign: cybersecurity incidents are multiplying.
- That same year, the CNIL recorded a record 17,772 complaints, up 8%: proof that citizens are increasingly aware of breaches of their digital rights.
- According to the ENISA Threat Landscape 2024 report, 11,079 incidents were recorded in the EU between June 2023 and July 2024, including 322 transnational incidents. The most frequent threats: ransomware, availability attacks, data theft.
- Over a single year, 10,000 cyberattacks were identified in the EU, and nearly 19% directly targeted personal data of citizens or administrations.
👉 These figures confirm one reality: cybersecurity is already a huge challenge. Adding generalised surveillance of private communications via ChatControl would risk amplifying the threats (leaks, flaws, abuse) rather than reducing them.
What are the impacts for organisations?
- Weakened confidentiality: internal exchanges, sensitive data, contracts… everything could be scanned.
- Legal risk: incompatibility with GDPR and European requirements on proportionality.
- Cybersecurity: introducing scanners before encryption means creating new exploitable flaws.
- Digital sovereignty: such constraints would benefit US tech giants, which can adapt quickly, to the detriment of independent European solutions.
ChatControl undermines confidentiality without tackling organised cybercrime.
The opponents' arguments
Lawyers, associations, researchers and providers of sovereign digital solutions warn of:
- Disproportion: treating all citizens as suspects.
- Inefficiency: criminal content will simply circulate elsewhere.
- Technical risk: flaws created by client-side scanning.
- Domino effect: after child abuse, what will be targeted next? Opinions, health, religion…
What can citizens do?
- Get informed and raise awareness among those around them.
- Review the privacy settings of the tools they use.
- Choose ethical and sovereign solutions (like Whaller).
- Support initiatives that challenge European institutions.
And for organisations?
Executives, CIOs and CISOs must:
- Assess their exposure: which services would be affected?
- Anticipate the impacts on their data flows, archives and legal obligations.
- Raise their teams’ awareness about the importance of confidentiality.
- Turn to sovereign alternatives such as Whaller, which guarantee compartmentalisation, security and compliance.
Whaller’s position
At Whaller, we believe it is possible to fight crime without tipping into generalised surveillance.
Our commitments:
- No exploitation of user data.
- Compartmentalised architecture by spheres.
- Hosting in France, with the option of Whaller DONJON (SecNumCloud) for critical environments.
- Total transparency: no backdoors.
In short: an ethical, sovereign and secure digital world.
Vigilance and informed choices
The ChatControl project illustrates a major dilemma: how to protect the most vulnerable without tipping into a surveillance society? The answer does not lie in mass control, but in prevention, education and the choice of solutions that respect freedoms. Whaller will continue to defend a model of digital collaboration that is fluid, sovereign and secure, faithful to its mission: to protect organisations and citizens, without compromising on trust.
FAQ: ChatControl in 10 questions
1. What exactly is ChatControl?
It is a draft European regulation aiming to force digital platforms to automatically scan private communications (messages, images, documents) to detect content related to child abuse.
2. Who would be concerned?
All messaging services, email providers, collaborative platforms and cloud services used in Europe: WhatsApp, Gmail, Teams, Slack, but also more specialised solutions such as those used by schools, associations or administrations.
3. Does this mean my private conversations would be read?
Yes, potentially. Even if algorithms analyse the content, this remains a direct breach of confidentiality. And no one can guarantee these analyses will not be reused for other purposes.
4. What are the main risks?
- Massive leaks of sensitive data
- Misuse by states or private actors
- False positives, potentially criminalising citizens or organisations by mistake
- Creation of a dangerous precedent for other forms of surveillance
5. Will this really stop criminals?
Probably not. Criminal networks already know how to use bespoke encrypted tools, off the radar of mainstream platforms. The first victims of such a measure would therefore be… ordinary users.
6. How is this contrary to the GDPR?
The GDPR requires that personal data be collected and processed only for precise, proportionate and legitimate purposes. Here, the massive and systematic processing of the entire population’s communications would be disproportionate.
7. Why do cybersecurity experts oppose it?
Because introducing backdoors to scan data creates new flaws. It is a basic principle of security: the more access points, the more exploitable vulnerabilities.
8. What are the arguments of ChatControl supporters?
They stress the absolute necessity of protecting children and argue that current means (police cooperation, targeted investigations) are insufficient.
9. What alternatives exist?
- Strengthening teams specialised in cybercrime
- Improving cross-border cooperation
- Investing in prevention and digital education
- Developing targeted, proportionate tools that respect fundamental rights
10. What can I do, as a citizen or organisation?
- Inform myself and others
- Choose ethical and sovereign digital solutions
- Raise awareness among my relatives, colleagues and students about data protection
- Support initiatives that defend a balanced approach to cybersecurity in Europe