Chat Control: the EU's biggest mass surveillance initiative ever
The EU is once more debating the chat control initiative: a mass surveillance law that would rely on a US company to scan every piece of communication of (almost) all EU citizens, and that could be adopted as soon as October 2025.
On July 1st, 2025, Denmark, which at the time of writing holds the rotating presidency of the Council of the EU, announced its intention to restart the discussions concerning the chat control initiative. The initiative, whose inception dates back to 2022, has never been without controversy, since it aims to scrap end-to-end encryption across the entire EU. The reason? To be able to scan every piece of communication (messages, files, emails...), allegedly in order to prevent child abuse as part of the fight against Child Sexual Abuse Material (CSAM).
How did we get here?
The idea of ending end-to-end encryption has been around for a while in the EU. In fact, the Child Sexual Abuse Regulation (CSAR) was first proposed by then EU Commissioner for Home Affairs Ylva Johansson on May 11th, 2022. As expected, it has been highly controversial from the beginning, not only because subjecting all EU citizens to mass surveillance by scanning all of their communications sounds pretty radical, but also because doing so could be illegal under current EU legislation; in fact, in 2024, the European Court of Human Rights ruled that weakening encryption undermines fundamental rights. Because of that, the initiative received widespread opposition from the European Parliament, and in November 2023, the Parliament amended the original proposal, ensuring, among other things, that end-to-end encryption would be preserved. A few months earlier, the Swedish presidency of the EU Council had pushed for the adoption of the CSAR; still, no substantial advances were made. Earlier this year, Poland tried to find an agreement on the CSAR, but failed to do so and gave up during its turn holding the presidency of the Council. Now, it seems that the proposal coming out of the Danish presidency is even more aggressive than the ones put forward by its predecessors, although the details are not yet fully known.
But proposals of this kind have not been limited to the EU level. Speaking of Sweden, the Riksdag (the Swedish parliament) has been called to discuss a similar law aimed at undermining encryption. At the same time, the United Kingdom, although not an EU member, asked Apple to remove some of its most effective safety measures in order to access user data, a request with which Apple ultimately complied. In France, the National Assembly rejected a proposal to implement a backdoor to circumvent encryption. Switzerland, for its part, is currently on the path to adopting stronger surveillance laws that would undermine encryption and online anonymity.
But what is end-to-end encryption, and why does it matter?
End-to-end encryption (E2EE) guarantees that only the sender and the receiver of a message can access its content. Any other party involved in the exchange (the service provider relaying the message, for instance) cannot read what is being sent.
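To make the idea concrete, here is a minimal sketch of the principle in Python using the PyNaCl library (a binding to libsodium). Real messengers add key verification, forward secrecy and ratcheting on top of this, but the core property is the same: whoever relays the message only ever sees ciphertext.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"See you at the station at 18:00")

# Whoever relays `ciphertext` (a server, an ISP, a backup) cannot read it:
# decryption requires one of the two private keys.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"See you at the station at 18:00"
```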
E2EE has often been depicted as a haven for criminals who use the technology to shield themselves from the authorities - as the case of Telegram, discussed in the next section, shows, this supposed correlation between encryption and criminality simply does not hold. On the contrary, E2EE is crucial to keeping people safe. For instance, it:
- Protects people in times of political persecution and censorship under authoritarian regimes, no matter their nature (even within the EU, spyware has been used to target political opponents).
- Protects people's sensitive data, such as medical records, from data leaks.
- Protects people's fundamental right to privacy.
- Enhances people's security. Important documents and credentials stored in the cloud without encryption make people extremely vulnerable to data leaks.
- Protects companies and governmental agencies alike. Backdoors meant only for government use can be exploited by foreign actors, jeopardising national security.
Some point to the fact that companies like Meta offer strong protections to their users thanks to E2EE, and that this comes at the expense of children. Although these concerns are prima facie more than legitimate, the truth is that Meta is far from being a privacy-respecting company. These kinds of companies actually collect huge amounts of metadata that can be used by governmental agencies. Notoriously, already eleven years ago, former NSA director Michael Hayden said, "we kill people based on metadata". For context, metadata is not the content of a message but the information attached to it: with whom one is talking, from where, how often, and so on. It is very plausible, then, that companies like Meta implement E2EE to avoid liability, as well as to privacy-wash their brand. And indeed, from what we have been able to see, many criticisms from NGOs devoted to preserving children's rights are aimed at Meta in particular (and more specifically, at its decision to encrypt Messenger and Instagram). Surveilling people for profit while avoiding responsibility hardly seems right, but this is very different from services that actually have privacy and security as a goal. Not only do we understand the concerns about platforms like Meta; we believe that all these big commercial social media platforms, such as Meta or ByteDance, in their current form, should be illegal across the EU.
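As an illustration of what metadata means in practice, consider a hypothetical record a messaging service might keep for a single delivered message. The field names below are made up for the example, but they reflect the kind of information that stays visible even when the body is end-to-end encrypted.

```python
# Hypothetical per-message record kept by a messaging service.
# Even with the body end-to-end encrypted, everything else remains
# readable to the operator (and to anyone it shares data with).
message_record = {
    "sender": "+34 600 000 001",
    "recipient": "+49 170 000 002",
    "timestamp": "2025-07-01T21:43:10Z",
    "sender_ip": "203.0.113.7",          # reveals approximate location
    "client": "Android 14, app v2.25",
    "body": b"\x8f\x1a...",              # ciphertext, unreadable
}

# Aggregated over millions of such records, this is enough to map who
# talks to whom, how often, when and from where - without decrypting
# a single message.
```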
The problem is real. What about the solution?
It is important to note that this assault on encryption does not necessarily need to be ill-intentioned. The fact that policymakers are trying to pass legislation to combat online criminal activity (whether illegal pornographic material, trafficking or organised crime) is a sign that these are, indeed, real problems threatening our societies. After all, websites hosting child sexual abuse material continue to thrive across the EU. Drug trafficking and gang-related violence are on the rise, particularly in countries like Belgium, Sweden or the Netherlands, but they are also deeply rooted in other countries, such as Spain or Italy.
Now, the question is whether there is any correlation between encryption and online crime. Telegram has been notorious for hosting criminal activity, and its groups and channels are not end-to-end encrypted at all (despite many uninformed claims about Telegram's encryption). Only one-to-one chats can be encrypted in that way, and not even by default. One could argue that the platform's new moderation policies can curb that phenomenon, but those policies are about handing over your data if you are already under criminal investigation, not about scanning all your messages by default.
More than that, the laws of many countries across different continents already contemplate handing encryption keys to the authorities if requested, for instance under an ongoing investigation. The question is: does breaking the presumption of innocence and surveilling every piece of communication protect children? Does it make our societies better and safer? Not even experts on the matter seem to have a strong consensus on the potential benefits of scrapping encryption (you can read several statements from different sources in the "Further reading" section at the bottom of the article).
In our view, the current argument for breaking encryption is analogous to the following one:
- Domestic violence is a scourge in our societies.
- Domestic violence happens because nobody can check what is happening inside a home.
- In order to prevent domestic violence, we should monitor the homes, where such crime is often committed.
- Anybody can commit domestic violence.
- Therefore, we should monitor all houses in order to prevent domestic violence.
- That means installing cameras and microphones all around the house. If you are innocent, you have nothing to fear.
- If you are against it, you are on the side of the perpetrators.
Not only does this reasoning seem obviously misleading (even some form of ad misericordiam fallacy), but it is also as naive as it is dangerous.
To begin with, it has been suggested that E2EE can help to protect children by enhancing their security and resilience against cyber-attacks and data theft. And we should not forget that children also have a right to privacy, a basic human right as it is. The UN's recommendations to protect children's rights in the digital environment state that:
Any digital surveillance of children should respect their right to privacy and should never be conducted without their knowledge and informed consent
Similarly, UNICEF's report on Encryption, Privacy and Children’s Right to Protection from Harm says the following:
Although frequently mentioned in the debate, it is incorrect to suggest that children will have their rights better respected if digital communications platforms remain unencrypted; this is the case regarding some risks, but not all. The debate also needs to consider severity and scale of impact. [...] There is a need to explicitly consider how protection and privacy can be most effectively ensured in conjunction and think through the potential implications of our proposed solutions – legally, globally, technologically and for the future of our democratic principles and the rule of law.
But it is not only about the right to privacy. AI surveillance tools are notorious for producing false positives and for reinforcing the racial biases present in our societies. More than that, creating a backdoor that grants access to people's communications only to governments is wishful thinking, disconnected from the reality of cyber-crime. As an example, in October 2024 several major US telecommunications companies were hacked by China-backed actors.
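The false-positive problem is not a technicality; it is a matter of scale. The back-of-the-envelope calculation below (every figure is an illustrative assumption, not a measured rate) shows how a classifier that wrongly flags only 0.1% of benign content still buries investigators under millions of false alarms once it scans everything.

```python
# Illustrative base-rate calculation; every number here is an assumption.
messages_per_day = 10_000_000_000  # assumed EU-wide daily message volume
abusive_fraction = 1e-7            # assumed prevalence of actual CSAM
false_positive_rate = 0.001        # assumed 0.1% of benign content flagged
true_positive_rate = 0.9           # assumed 90% of abusive content caught

abusive = messages_per_day * abusive_fraction
benign = messages_per_day - abusive

true_flags = abusive * true_positive_rate
false_flags = benign * false_positive_rate

print(f"Correctly flagged messages per day: {true_flags:,.0f}")
print(f"Falsely flagged messages per day:   {false_flags:,.0f}")
print(f"Share of flags that are false:      "
      f"{false_flags / (true_flags + false_flags):.2%}")
# With these assumptions: ~900 correct flags vs ~10,000,000 false ones,
# i.e. roughly 99.99% of all flags point at innocent people.
```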
Moreover, the current CSAR proposal would target everybody with the exception of politicians and of military and law enforcement personnel, which could suggest that, indeed, not all citizens are equal under the law. We have said it here many times: there can be no democracy, and especially no rule of law, under mass surveillance, whether governmental or corporate - or the result of a symbiosis between both, as will be shown below.
Domestic problems. Foreign interests
In September 2023, a piece of investigative journalism by Giacomo Zandonini, Apostolis Fotiadis and Luděk Stavinoha came to light, revealing the ties between EU Commissioner Ylva Johansson and Thorn, a US company (of course it is) founded by actors Demi Moore and Ashton Kutcher that carries out AI-powered detection of abusive online material. The report notes that:
The proposed regulation is excessively “influenced by companies pretending to be NGOs but acting more like tech companies”, said Arda Gerkens, former director of Europe’s oldest hotline for reporting online CSAM.
[...]
FGS Global, a major lobbying firm hired by Thorn and paid at least 600,000 euros in 2022 alone, said Thorn would not comment for this story. Johansson also did not respond to an interview request.
The report not only tracks the intense lobbying campaign on behalf of Thorn, but also shows particularly close ties between the company and former MEP Eva Kaili, who was implicated in the Qatargate bribery scandal back in late 2022. In addition, both Thorn and its offshoot Safer reportedly downplayed substantially, in their efforts to pressure the European Commission, the possibility of false positives that their tools could produce. But the relationships with Thorn are not limited to the EU Commission, nor did they stop in the period covered by the article. A few months ago, the EU Ombudsman criticised Europol's administration over its ties to Thorn. The adoption of this piece of legislation, it seems, would be one step further in Europe's insistence on relying mainly on US tech companies to surveil its population, at a time when EU policymakers seemed to be growing more aware of the risks associated with the EU's heavy reliance on foreign (mainly US-based) tech companies.
Furthermore, the adoption of this law would create the perfect pretext for anti-EU politicians such as JD Vance to attack Europe's current digital regulations, accusing the bloc of being anti-democratic and hostile to freedom of speech. That is, it would simply hand such actors an excuse to try even harder to undermine European efforts towards digital and legislative autonomy.
Conclusion
The problems that initiatives like the CSAR aim to tackle are very real. That being said, it is essential to keep one fundamental thing in mind: there can be no democracy, and no rule of law, under mass surveillance. The two are simply incompatible. And to see the EU, which should be championing both, being so easily lobbied is as worrying as it is disappointing, to say the least. There must be other ways to tackle this issue. Investing consistently in prevention and education is key. It is also fundamental for their success that laws of this kind be designed with a sound understanding of the technology they deal with. As an example, since the UK's Online Safety Act came into force, many Internet users have found very easy ways of circumventing age verification (via VPNs, for example). And even if there are no official plans to ban VPNs in the UK yet, their usage is being closely monitored by the government, meaning that this could change in the future. That is, the law, however well-intentioned, has been poorly implemented, and with that comes the potential to do more harm than good. To be clear, we do not expect a law to be perfect, nor to solve a problem - especially one of this magnitude - instantly, and that should not deter us from trying. But, once again, what is going on with the current version of the CSAR, mass surveillance, is about something else. In politics, intention matters. Responsibility matters even more.
Contrary to what one might expect, we think that this kind of legislation is needed. The need for something like an Online Safety Act or a Child Sexual Abuse Regulation makes, unfortunately, a lot of sense. The answer, though, cannot be indiscriminate mass surveillance. The trade-offs are too many, and the proposed technical implementation puts people - and their countries - in excessive danger. Not to mention the chilling effect this would have on the population. All of this is especially concerning when private companies are lobbying and feeding misleading information to those who are supposed to write these policies. Things get even worse when those same politicians would be exempt from this kind of mass surveillance. And in case all this were not enough, at a time when Europe's technological dependence on the US is a clear geopolitical vulnerability, handing the communications of almost every single EU citizen to a US company is even more difficult to understand.
It is our responsibility to look for actual solutions. As a suggestion, Secure DNS, while probably insufficient by itself, can be an effective and less harmful way of protecting children - and adults alike - while browsing the web (a minimal sketch of the idea follows below). But beyond this, more work and research needs to be done in order to come up with nuanced and effective policies whose sole purpose is the common good, not private interests. Indeed, the authorities, in accordance with the law, must be able to carry out their investigations. Nor should privacy be misused by big companies as a tool to dodge their duty to protect. It does not need to be a choice between privacy and security on one side and child protection on the other. It can be both. And scanning every single piece of communication, with the risk of it being leaked and falling into the wrong hands, while treating roughly 450 million citizens as potential culprits, seems quite far from achieving either.
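By way of illustration, here is a minimal sketch of how filtering Secure DNS works, assuming Cloudflare's "1.1.1.1 for Families" resolver and its DNS-over-HTTPS JSON interface; any comparable filtering resolver would do, and the exact blocking response (0.0.0.0 versus an empty answer) depends on the resolver. Domains known to host abusive material simply stop resolving on the filtered endpoint, with no inspection of anyone's messages.

```python
# Sketch of querying a filtering DNS-over-HTTPS resolver. Assumes the
# Cloudflare "for Families" endpoint; other filtering resolvers expose
# similar interfaces.
import requests

def resolve(name: str, resolver_host: str) -> list[str]:
    """Ask a DoH resolver (JSON format) for the A records of `name`."""
    response = requests.get(
        f"https://{resolver_host}/dns-query",
        params={"name": name, "type": "A"},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    response.raise_for_status()
    return [answer["data"] for answer in response.json().get("Answer", [])]

# The unfiltered resolver returns the real addresses; the filtering one
# answers blocked domains with 0.0.0.0 (or no answer at all), so the
# browser simply cannot reach them.
print(resolve("example.com", "cloudflare-dns.com"))
print(resolve("example.com", "family.cloudflare-dns.com"))
```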
Excursus: stop weaponising children
The pretext of child protection is a good one. After all, only a merciless person would be against it (unless, in our European context, the children are migrants from non-Western countries; then the EU and most of its member states can feel pretty comfortable not protecting them). And precisely because the protection of children is such an obviously pressing issue, using them as a tool to pass this kind of legislation seems dubious, especially when private money and interests are behind the push. This is far from the first time it has happened: Hungary's anti-LGBTQ+ legislation, for example, is also presented as protecting children - and, for the record, that law actually clashes with EU law.
To see a debate full of false dichotomies and sensationalist claims is, once again, disappointing. At a time when authoritarian regimes and narratives thrive with ease, our societies deserve, and urgently need, something better.
Further reading:
- Access denied. How end-to-end encryption threatens children's safety online. Children's Commissioner, UK (in favour of scrapping E2EE for children's accounts).
- A Parents’ Guide to Encryption. Global Encryption Coalition (against scrapping encryption).
- Chat Control—The End Of Private Messaging As We Know It? Daniel, L. Forbes
- Chat Control: The EU’s CSAM scanner proposal. Breyer, P. Former MEP.
- Child sexual abuse online: effective measures, no mass surveillance. European Parliament Committee on Civil Liberties, Justice and Home Affairs.
- Content-Oblivious Trust and Safety Techniques: Results from a Survey of Online Service Providers. Pfefferkorn, R. Journal of Online Trust and Safety; Vol. 1 No. 2 (2022).
- CSA Regulation Document Pool. EDRi.
- Encryption Is a Preventative Tool that Protects Children. Lane, S. Internet Society.
- Encryption, Privacy and Children’s Right to Protection from Harm. Kardefelt-Winther, D., Day, E., Berman, G., Witting, S. K., and Bose, A. UNICEF, Office of Research - Innocenti (against scrapping encryption).
- EU Member States Still Cannot Agree About End-to-End Encryption. Pfefferkorn, R. Center for Internet and Society, Stanford Law School.
- Going Dark: The war on encryption is on the rise. Through a shady collaboration between the US and the EU. Mullvad.
- Privacy and Protection: A children’s rights approach to encryption. Child Rights International Network (against a generalised scrapping of encryption).
- The end of encryption as we know it? Hardy, E. The Parliament Magazine.
- The right to privacy in the digital age. Report of the Office of the United Nations High Commissioner for Human Rights.
- Why An Encryption Backdoor for Just the “Good Guys” Won’t Work. Stepanovich, A. and Karanicolas, M. Just Security.