By Michel Portier, Lecturer at Hogeschool Arnhem and Nijmegen
On 14 October, the Council of the EU will vote on what is perhaps the most controversial proposal in the EU’s digital agenda: Chat Control. Under the guise of protecting children, it would monitor your private messages: a spy on your phone that can flag any undesirable message and pass it on to the authorities, with all the chilling consequences that entails. In this article, I explain what Chat Control is, why it is a problem, and what we can do about it.
What is Chat Control?
Most of our messaging services, such as WhatsApp and Signal, are end-to-end encrypted. This means that only the device sending a message and the device receiving it have access to its content. Law enforcement agencies see this as a problem, because they cannot monitor these communications. And because strong encryption cannot be weakened without creating security risks, they propose instead that every messaging service scan each message before it is sent. If a message contains images of child abuse, the authorities are notified. This happens without you noticing, using a technique called client-side scanning (CSS). In practice, it means that every photo you send is inspected on your own device. There are two ways to achieve this.
1) Scanning for known material: every phone receives a database with fingerprints of CSAM (Child Sexual Abuse Material). Every photo you send is compared against that database, and if there is a match, it is reported (a simplified, hypothetical sketch of this matching step follows below).
2) Scanning for unknown material: an algorithm (AI) scans your photos to see whether they contain CSAM. Such an algorithm cannot reliably distinguish between innocent material (a photo of your child in the bath) and actual abuse, so every flagged photo must be reviewed by a human who makes the final call.
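To make this concrete, here is a minimal, hypothetical sketch of what such fingerprint matching could look like, using a simple ‘average hash’. Real systems (PhotoDNA, NeuralHash and the like) are far more sophisticated, and the database entry below is an invented value; this is only meant to show the principle, not how any actual deployment works.

```python
# Illustrative sketch of method 1: matching a photo against a database of
# fingerprints (perceptual hashes). The hash here is a simple "average hash";
# the entry in KNOWN_FINGERPRINTS is a made-up value for illustration.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image, grey-scale it, and turn each pixel into one bit:
    1 if the pixel is brighter than the average, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical fingerprint database pushed to every device.
KNOWN_FINGERPRINTS = {0x9F3B26C481D05E7A}

def is_flagged(path: str, threshold: int = 5) -> bool:
    """A photo is flagged when its hash is 'close enough' to a known fingerprint.
    The threshold is exactly where false positives and collisions creep in."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_FINGERPRINTS)
```

Note that the comparison is deliberately fuzzy (a distance threshold rather than exact equality), so that re-encoded or slightly edited copies still match. That same fuzziness is what opens the door to the false positives and crafted collisions discussed below.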
It doesn’t stop at scanning images: everyone who uses a messaging service will have to identify themselves, or at least undergo an age check.
Chat Control: Germany, Belgium, Italy, and Sweden shift their positions ahead of the October 14 meeting
Pressure within the EU Council, and domestically, has led to some nations changing their position on the controversial … https://t.co/cLkSBygzrh
— TechPulse Daily (@DailyTechpulse), October 1, 2025
Why is this a problem?
To begin with, Chat Control is disproportionate and ineffective. In addition, there are a number of technical and ethical objections.
Fundamental rights
Chat Control is not proportionate, because all messages from everyone are viewed all the time. In a constitutional state, you would expect to be innocent until proven guilty, with only actual suspects being monitored. Furthermore, it is not effective: it is like searching everyone all the time while criminals simply walk in through the back door. Once abusers know that all their messages are being scanned, they will look for other ways to distribute the material. And there are plenty of those.
Technical objections
There are also major technical objections. The fingerprint database is a black box. A trusted party is needed to create the fingerprints, but it is not difficult for a hacker or a state to slip in fingerprints of other photos. You could then be flagged simply for having, say, a photo of a certain location or person on your phone.
On top of that, there is a huge risk of false positives: photos that have nothing to do with abuse can still trigger a flag. Even if only 0.1% of scanned photos produce a false positive, that already means millions of unjustified flags, because billions of messages are sent every day. On GitHub, you can find tools that edit ordinary photos so that they match entries in the database, which makes it easy to frame someone else.
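To see what that scale means, here is a small back-of-the-envelope calculation. Both input numbers (the daily photo volume and the 0.1% error rate) are assumptions for illustration, not official figures, but they show why human reviewers would be flooded with innocent photos.

```python
# Back-of-the-envelope: what a "small" error rate means at EU scale.
# Both input numbers are assumptions for illustration, not official figures.
daily_photos_scanned = 3_000_000_000   # assumed photos/videos sent per day across the EU
false_positive_rate = 0.001            # assumed 0.1% of scans wrongly flagged

false_flags_per_day = daily_photos_scanned * false_positive_rate
print(f"Unjustified flags per day:  {false_flags_per_day:,.0f}")    # 3,000,000
print(f"Unjustified flags per year: {false_flags_per_day * 365:,.0f}")
```

Every one of those flags is a private photo that a stranger then has to look at.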
By far the biggest technical objection is the detection of new material. As I mentioned, this leads to many flags that have to be reviewed by humans. This has the following consequences:
1) You think you are privately sharing a photo of your child with your parents or partner, but if AI flags it, someone else will see it. Exactly what you didn’t want.
2) If you are wrongly accused of possessing abusive material, this has major consequences for your reputation or your career. Try explaining to your neighbours why your house is being searched.
3) For teenagers who voluntarily send racy photos of themselves to each other, this is a nightmare: strangers end up looking at those photos, and such images are generally considered illegal in themselves.
Ethical concerns
If others are watching your private messages, you will behave differently. Think of the silence that falls in a lift when strangers enter. Because grooming texts are flagged, flirting in a chat becomes a risky business. This leads to self-censorship, where you are not free to express yourself for fear of being watched.
There is also such a thing as mission creep. Once this system is in place, it can be expanded for other purposes. Adding extra keywords for terrorism or flagging other types of photos is not difficult.
Chat Control provides very valuable data because it scans your communications, and that data allows your behaviour to be predicted. It is not inconceivable that algorithms know you want to go to a protest before you know it yourself. Science fiction? No: years ago, Google was already able to determine whether someone was pregnant, or what illness someone had, based on their online behaviour. With a spy on your phone, this kind of behavioural prediction can be performed to perfection. Finally, your behaviour can be influenced based on what someone knows about you, for example through small nudges in your search results or recommended videos. This Search Engine Manipulation Effect has been well described and researched, and can influence the outcome of elections.
In short: Chat Control does not help children; it actually puts them at risk, because they no longer have any truly private communication. The legislation achieves the exact opposite of what it sets out to do, while the real perpetrators simply move to other platforms.
A telling detail: EU officials, politicians and military personnel are exempt from Chat Control due to ‘security risks’. Their messages will not be scanned.
BREAKING Swiss email provider @ProtonMail tells me the company will withdraw from the EU if #chatcontrol is made mandatory. @signalapp previously said it would leave the EU if it were to become mandatory.
— Eric van de Beek 🇳🇱 (@beek38) September 30, 2025
What can we do?
Last year, Chat Control was not put to a vote because a blocking minority of countries opposed it. Thanks to the AIVD, the Dutch intelligence service, which sees Chat Control as a security risk, the Netherlands is voting against. Germany’s position is uncertain, but there are signs it may support the proposal; if Germany backs it, Chat Control will go ahead. Public pressure on the German government is therefore desirable. Check out this website to see what you can do to persuade governments to vote differently.
There is a glimmer of hope in the European Parliament, which opposes this measure. If Chat Control passes the vote on 14 October, a trilogue with the EP will follow, and its outcome is uncertain: the worst parts of Chat Control may yet be scrapped, but that is by no means guaranteed. The EP, for example, also changed course quite late on the European Patient Record.
Finally, I believe that awareness is very important. Please share this column or my long read on Substack, where I discuss other EU digital measures in greater depth. The general public knows little about this subject and the media pays hardly any attention to it. Yet it affects us all, because our humanity is deeply rooted in intimacy, and therefore in our privacy.
Major EU Chat Control Changes
– 7 member states opposing
– 12 member states supporting
– 8 member states undecided
Germany and Belgium, previously in the "Opposing" group, are now in the "Undecided" group.
— Pirat_Nation 🔴 (@Pirat_Nation), September 25, 2025
Originally published in Dutch by NieuwRechts.nl