We have spent years discussing unjustified wiretaps: denouncing abuses, bugged phones, and conversations published without cause. We fought for a simple principle: no one may be listened to without a judge's warrant (at least under the rule of law, and at least in theory). Now Europe seems intent on taking us in exactly the opposite direction: a form of automatic, preventive interception that applies to everyone, always, in the name of security. The vote is on October 14th.
On paper it is a good idea: protect minors, prevent the spread of child sexual abuse material. This is how the European proposal known as Chat Control presents itself: a law that would allow the automatic scanning of private messages on platforms such as WhatsApp, Telegram or Signal. No one will actually read the conversations, they say; it will just be an algorithm flagging suspicious content (hmm).
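To make "an algorithm flagging suspicious content" concrete, here is a minimal sketch of the simplest form such scanning can take: comparing the cryptographic hash of each attachment against a blocklist of known material. All names and the sample payload are illustrative assumptions; real systems (such as Microsoft's PhotoDNA) use perceptual rather than exact hashes, but the principle is the same.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact cryptographic fingerprint of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist: in reality, authorities would distribute digests
# of known abusive material. Here we seed it with a dummy payload.
flagged_sample = b"known-bad-image-bytes"
KNOWN_BAD_HASHES = {sha256_hex(flagged_sample)}

def scan_attachment(data: bytes) -> bool:
    """Client-side check run on every attachment *before* encryption."""
    return sha256_hex(data) in KNOWN_BAD_HASHES

print(scan_attachment(flagged_sample))    # True: an exact copy is caught
print(scan_attachment(b"holiday-photo"))  # False: anything else passes
```

The crucial point for the debate is the last comment in the sketch: the check must run on the device, before end-to-end encryption is applied, which is exactly why critics call it scanning "before the envelope is sealed".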
However, many, far too many, see behind the word “protection” a much more disturbing principle: “preventive surveillance”, which sounds terrible. Every message, every photo, every conversation would be scanned in real time, that is, before encryption, which amounts to saying that nothing truly private exists anymore. By the way, do you remember Spielberg's film Minority Report? The system for preventing crime was corrupted, manipulated, and it ended badly. In Philip K. Dick's original story it ended even worse.
To be fair, the arguments in favor exist: online pedophilia is a real phenomenon, and no one can argue that the platforms should remain lawless zones. Furthermore, European authorities fear that end-to-end encryption, however legitimate, is making it impossible to investigate real abuses, and cooperation between states to identify the culprits would at least make practical sense. The point, though, is the method, because here the end that justifies the means sits on too steep a slope: the generalized presumption of guilt, the idea that to protect someone, everyone must be controlled.
Furthermore, security experts (from Signal to the EFF) have described it as “institutional malware”: the EU would, in effect, be installing surveillance software in citizens' private lives. Even if it were not operated by human beings, it would still be a permanent eye inside conversations, with enormous margins of error, which is no small detail.
Technology distinguishes neither context nor consent: an algorithm does not understand irony, does not know what art is, and does not know the line between a mistake and a crime. We would end up with a multiplication of false positives, absurd reports, lives ruined by automatic suspicion, trials for intentions that never existed. A system designed like this risks producing more victims than it catches perpetrators.
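The scale of the false-positive problem is simple base-rate arithmetic. The numbers below (traffic volume, prevalence, error rates) are assumptions chosen only for illustration, not official figures; the qualitative conclusion holds for any plausible values.

```python
# Back-of-the-envelope base-rate arithmetic, all figures assumed for illustration.
daily_messages = 10_000_000_000   # order of magnitude of EU daily message traffic
prevalence = 1e-7                 # fraction of messages that are actually criminal
false_positive_rate = 0.001       # a scanner that is "99.9% accurate" on innocent content
true_positive_rate = 0.99         # and catches 99% of the real cases

true_hits = daily_messages * prevalence * true_positive_rate
false_alarms = daily_messages * (1 - prevalence) * false_positive_rate

print(f"real cases flagged per day: {true_hits:,.0f}")
print(f"innocent messages flagged:  {false_alarms:,.0f}")
```

With these assumptions, roughly a thousand real hits per day would be buried under about ten million false alarms: the overwhelming majority of people flagged would be innocent, simply because innocent messages vastly outnumber criminal ones.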
If you are wondering, “but isn't it at least useful to stop them?”, the honest answer is: no. At least not in the way its supporters' propaganda suggests. Those who want to commit crimes already have the tools to evade automatic scanning: steganography to hide images inside harmless files, recompression and small transformations that break the hashes and fingerprints detectors rely on, encrypted archives sent via external links, dedicated apps or servers outside European jurisdiction. All of these techniques are well documented and within reach of experts and non-experts alike. Police operations (EncroChat, Ghost and the like) show that it is occasionally possible to take down a service, and yet the result is always the same: a costly, spectacular, temporary operation, while the ability of malicious actors to adapt and devise countermeasures remains intact. In plain language: a huge surveillance machine is built that catches the careless and lets the determined escape. So much so that none of the large messaging platforms accept this prospect.
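The fragility of exact-hash detection mentioned above can be shown in a few lines. Appending a single byte to a file (a far smaller change than recompressing an image) produces a completely different digest, so an exact-match blocklist no longer recognizes it. The payload here is a stand-in for an image file; perceptual hashes tolerate small changes better, but documented adversarial transformations defeat them too.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of the kind a hash blocklist would store."""
    return hashlib.sha256(data).hexdigest()

original = b"pretend-this-is-an-image-file"
modified = original + b"\x00"   # one appended byte, e.g. a metadata tweak

h1, h2 = fingerprint(original), fingerprint(modified)
print(h1 == h2)  # False: the fingerprint no longer matches the blocklist

# The avalanche effect: almost every hex digit of the digest changes.
diff = sum(a != b for a, b in zip(h1, h2))
print(f"{diff} of 64 hex digits differ")
```

This is precisely why "recompressions and small transformations" defeat the detectors: the scanner sees what is, cryptographically, an entirely new file.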
Apple already tried something similar with its CSAM scanning project in 2021, quickly shelved after a user revolt: too risky, too close to mass surveillance. Today Apple defends end-to-end encryption and, as it did with the British Online Safety Bill, threatens to withdraw features or services rather than open backdoors. WhatsApp, through its head Will Cathcart, declared that “if the EU forces us to break encryption, WhatsApp will not be able to operate here”. Signal is even clearer: if Chat Control passes, it will leave Europe.
Telegram, more anarchic and outside any single jurisdiction, is unlikely to comply: its servers are distributed, and its founder Durov has already refused in the past to hand decryption keys to governments far more authoritarian than Brussels. In short, the media say little about it, yet we are talking about communication tools that we all use every day, indeed every minute.
In the aforementioned Minority Report, the system collapsed when it discovered that the real threat was itself; here we risk the same epilogue, without Tom Cruise and with much more bureaucracy. Meditate, people, meditate. The brain, for now, remains encrypted.