Chat Control, EU’s Plan for Real-Time Mass Surveillance, Takes a Dramatic Turn

Posted on Oct 20, 2023 by Glyn Moody

We discussed the EU’s plans to undermine end-to-end encryption, generally known as “Chat Control,” a year ago. The proposed law, which intends “to prevent and combat child sexual abuse,” is now entering the final stages of the EU legislative process, and the fight over whether it will include real-time mass surveillance is becoming fiercer. In the last few months there have been a number of important developments that could have a major impact on the final form of the law.

Are People in Favor of Chat Control?

In February 2023, researchers at the Dutch university TU Delft found that “out of six public statements made by the European Commission in support of its new measures, three were incorrect.” In July, the European Commission released the results of its Eurobarometer survey, a study on the protection of children against online sexual abuse. It claimed that its results showed “strong support from citizens across the EU to prevent and combat child sexual abuse online.”

This is hardly a surprise in and of itself – no decent person can be against tackling these terrible crimes. But as the MEP Patrick Breyer, whose Chat Control page is the best source for information on the plans, pointed out:

Citizens aged 18 and over were asked in many different ways whether depictions of child sexual abuse online should be automatically detected – to which most respondents agreed. However, they could not understand from these questions that private messages are to be indiscriminately searched with error-prone algorithms.

What Are the Issues?

The wording of one key question implied that it was possible to accurately and reliably detect only child sexual abuse material (CSAM), without any false positives. But as Breyer pointed out, the Swiss federal police found that up to 80% of machine-reported material turned out to be criminally irrelevant.

The continuing controversy over the Chat Control proposals led the European Council – one of the three EU institutions that draws up new laws – to postpone a vote on the topic. This was largely because several EU countries did not support the text in its present form. A few days later, Balkan Insight published the results of a wide-ranging investigation into the groups that have been actively supporting the proposed CSAM legislation. Summarizing the article’s revelations, Diego Naranjo, head of Policy for the EU digital rights group EDRi, wrote:

The investigation published today confirms our worst fears: The most criticised European law touching on technology in the last decade is the product of the lobby of private corporations and law enforcement. [EU] Commissioner Johansson ignored academia and civil society in Europe while she shook hands with Big Tech in order to propose a law that will attempt to legalise mass surveillance and break encryption.

The Balkan Insight article includes one particularly significant comment from Europol, the central European hub for coordinating police intelligence in the region:

Europol officials floated the idea of using the proposed EU Centre to scan for more than just CSAM, telling the Commission, “There are other crime areas that would benefit from detection”. According to the minutes, a Commission official “signalled understanding for the additional wishes” but “flagged the need to be realistic in terms of what could be expected, given the many sensitivities around the proposal.”

This is precisely the danger of implementing Chat Control’s mass surveillance system that many have warned against. Once it’s in place, the pressure to use it for other purposes – against terrorism, for example, or for combating drugs – will be hard for politicians to resist.

In her reply to the Balkan Insight article, the EU Commissioner responsible for the Chat Control legislation, Ylva Johansson, denied that companies were being given preferential treatment in discussions, citing the fact that “the proposal does not incentivise or disincentivise the use of any given technology, leaving to the providers the choice of the technologies to be operated to comply effectively with the obligations of the proposal”. In fact, this lack of specificity is a key part of the problem with Chat Control: the EU is demanding the use of a technology that does not exist – one that allows encrypted communications to be scanned without harming privacy – as countless security and privacy experts have explained.

Biased Studies Give Unreliable Data

Following the revelations from Balkan Insight’s investigation, the technology expert Danny Mekić discovered that the European Commission had run ads on X (formerly Twitter) in an attempt to sway public opinion on Chat Control:

The campaign, which has been viewed more than four million times, uses shocking images of young girls alongside sinister-looking men, ominous music, and commits a form of emotional blackmail by suggesting that opponents of the proposed legislation would not want to protect children. Equally misleading is its claim that the proposed legislation would be supported by the majority of Europeans based on a survey that highlighted only the benefits but not the drawbacks of the proposed legislation. On the contrary, surveys by research firms YouGov and Novus, which highlighted the drawbacks, showed virtually no support for the proposal among the European population.

Aside from the question of whether the EU should be running such emotive, manipulative ads in the first place, it turned out that there was another serious issue. The ads were targeted to the countries that were dubious about Chat Control – Belgium, the Czech Republic, Finland, Netherlands, Portugal, Slovenia and Sweden – and used microtargeting to ensure that the ads did not appear to those who cared about privacy, were sceptical about the EU, or were interested in Christianity. Microtargeting using sensitive criteria like politics and religion is questionable under EU law, and the European Data Protection Supervisor has begun a “pre-investigation procedure” to examine this issue.

The recent information about the methods employed by the European Commission in an attempt to push through its controversial Chat Control legislation led to a strong reaction from opponents. Commissioner Johansson wrote in a blog post that she had been subjected to “insults, threats, and intimidation.” In the same post, Johansson mentioned a new poll that showed 81% of Europeans “support obligations to detect, report and remove child sexual abuse.” However, as the data scientist and research methodologist Vera Wilde commented the next day, the survey provided participants with “biased and factually wrong information” about end-to-end encryption. As a result, Wilde said, “the researchers invalidated their results.” Wilde points out in another post that the key problem with Chat Control’s application of “mass screening to low-prevalence problems” is that it produces lots of false positives and false negatives:

Subjecting entire populations to screenings like Chat Control implies following up many uncertain results with investigations – potentially traumatizing a large number of innocents, including minors. Will society jeopardize many children’s well-being to save a few? How did minors consent for their private communications’ use in training this AI? How will minors’ sexual images and texts, no longer protected by end-to-end encryption, be secured against misuse? What about privacy?
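Wilde’s point about “mass screening of low-prevalence problems” is, at bottom, base-rate arithmetic: when the thing being screened for is rare, even a highly accurate detector produces mostly false alarms. The sketch below illustrates this with Bayes’ rule; all of the numbers in it are assumptions chosen for illustration, not figures from the EU proposal or from any real scanner.

```python
# Toy base-rate calculation: why mass screening for a rare problem
# yields mostly false positives. All numbers are illustrative
# assumptions, not figures from the EU proposal.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """Fraction of flagged messages that are actually abusive (Bayes' rule)."""
    true_pos = prevalence * sensitivity                  # correctly flagged
    false_pos = (1 - prevalence) * false_positive_rate   # innocents flagged
    return true_pos / (true_pos + false_pos)

# Assume 1 in 10,000 messages is abusive, the scanner catches 95% of
# them, and it wrongly flags just 1% of innocent messages.
ppv = positive_predictive_value(prevalence=1e-4,
                                sensitivity=0.95,
                                false_positive_rate=0.01)
print(f"Share of flags that are real: {ppv:.1%}")  # roughly 0.9%
```

Under these assumed numbers, fewer than 1 in 100 flagged messages would actually involve abuse – over 99% of the people reported would be innocent, which is the dynamic behind the Swiss police figure quoted earlier.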

Chat Control Could Hurt Children

A core problem with Chat Control surveillance is the fact that many young people engage in sexting – around 18% overall according to a 2021 study, and 22% of those aged 13–17. Much of this will inevitably be flagged up as CSAM, leading to those involved being stigmatized or even criminalized, and police forces swamped with false positives that will prevent them tackling real cases of abuse. Ironically, a new EU law designed to protect children could end up harming millions of them. It’s yet another reason why Chat Control’s real-time mass surveillance should be stopped, and other approaches developed. As Professor Susan Landau at Tufts University has just written in her Lawfare review of the attacks on end-to-end encryption around the world:

Think differently. Think long term. Think about protecting the privacy and security of all members of society – children and adults alike.

Featured image by EU Home Affairs.