Privacy Activists File Complaints against the European Commission and X over Chat Control Ad Campaign

Posted on Jan 3, 2024 by Glyn Moody

One of the big battles underway in the world of privacy at the moment involves attempts by governments around the globe to undermine encryption. The principal justification for doing so is to protect children – always a powerful, if not entirely honest, argument. In the EU, the fight is especially fierce because the controversial Chat Control proposal is in the final stages of the legislative process, so the stakes are high. This led the European Commission to take the extraordinary step of running a series of emotive and manipulative ads on X (formerly known as Twitter) to win support for its position.

As a result, the European Data Protection Supervisor (EDPS) began a “pre-investigation procedure” to examine this issue. Since then, there have been two important developments, both involving the privacy activist Max Schrems and his noyb.eu organization. In November, noyb filed its own formal complaint to the EDPS against the European Commission:

Part of this seemingly aggressive attempt to promote the chat control was a targeted advertising campaign on X (formerly Twitter) to change public opinion. While online advertising isn’t illegal per se, the EU Commission targeted users based on their political views and religious beliefs. Specifically, the ads were only shown to people who weren’t interested in keywords like #Qatargate, brexit, Marine Le Pen, Alternative für Deutschland, Vox, Christian, Christian-phobia or Giorgia Meloni. The EU Commission previously raised concerns over the use of personal data for micro-targeting and described the practice as “a serious threat to a fair, democratic electoral process”.

Since people’s political opinions and religious beliefs are specifically protected by the EU’s GDPR, the European Commission’s exploitation of them for its targeted ad campaign is possibly illegal and certainly ironic. Schrems and noyb have therefore requested that the EDPS investigate whether the EU’s main privacy law has been broken and, if so, impose a fine on the European Commission. However, that was not the end of the matter. A few weeks later, noyb filed a complaint under the GDPR against X itself:

In theory, this violation shouldn’t even be possible: X states in its advertising guidelines that political affiliation and religious beliefs should not be used for the purpose of ad targeting. In reality, it seems that X is not enforcing the ban in any way, making it practically meaningless. The EU Commission’s campaign was shown to at least several hundred thousand Dutch X users. The post in question is still available here.

Interestingly, the noyb post about its complaint suggests that the use of sensitive personal data for microtargeted ads violates not only the GDPR, but also the Digital Services Act, which is designed to impose additional, stringent controls on major online platforms.

In a related move, the European Commission has just published an evaluation report on the current Chat Control framework. That framework is purely voluntary, something the new Chat Control legislation aims to change by requiring major platforms to scan private chats, messages, and emails at all times. However, Patrick Breyer, a member of the European Parliament from the Pirate Party, believes the report fails to justify the loss of privacy that mandatory Chat Control would entail:

The long overdue evaluation of voluntary chat control turns out to be a disaster: Provided figures on suspicious activity reports, identifications and convictions lack any proven connection to the chat control bulk scanning of private messages …. ‘Identifying’ the senders of self-generated nudes in consensual sexting is hardly a challenge and does not protect anyone from child sexual abuse. All in all, there is no evidence that the industry-driven mass surveillance of our private communications by US services makes a significant contribution to saving abused children or convicting abusers. To the contrary, it criminalises thousands of minors, overburdens law enforcement and opens the door to arbitrary private justice by big tech.

Breyer points out that the report fails to mention a key fact about Chat Control and its underlying approach: only 25% of the photos and videos disclosed to moderators and police are of any use for law enforcement. That is, three quarters of the leaked private images and videos are “criminally irrelevant,” yet end up in the hands of moderators and police anyway, posing risks to users’ privacy and security. The scale of the problem is huge. According to Breyer, in 2022 alone 750,000 private messages involving EU citizens were disclosed without any relevance for law enforcement.
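To get a rough sense of what those figures imply, the back-of-envelope sketch below (an illustration only, not a number reported by Breyer or the Commission) assumes the 75% irrelevance rate applies uniformly and treats the 750,000 irrelevant disclosures as three quarters of the total, which would put overall disclosures involving EU citizens in 2022 at around one million.

```python
# Back-of-envelope illustration only. The 25% relevance rate and the 750,000
# irrelevant disclosures are the figures cited above; the implied total is an
# inference that assumes the same rate applies across all disclosed material.
relevant_share = 0.25                  # share of disclosed material useful to law enforcement
irrelevant_share = 1 - relevant_share  # the "three quarters" that is criminally irrelevant
irrelevant_disclosures_2022 = 750_000  # EU citizens' messages disclosed with no relevance

# If 750,000 irrelevant disclosures make up 75% of everything handed over,
# the implied total volume of disclosed private communications in 2022 is:
implied_total = irrelevant_disclosures_2022 / irrelevant_share
print(f"Implied total disclosures: ~{implied_total:,.0f}")  # ~1,000,000
```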

As Breyer’s detailed page devoted to Chat Control indicates, the continuing tussle over the legislation means that it will not now be agreed and passed before the upcoming 2024 elections for the European Parliament. This makes it hard to predict what the final outcome will be, since the makeup of the future EU Parliament may be very different from that of the current one. Because of this delay, the current voluntary system of Chat Control is likely to be extended to ensure that at least some kind of framework is in place while a new one is negotiated.

In 2024, the Chat Control saga will continue to bring together some of the key players of 2023’s privacy world: the EU, the EDPS, Max Schrems, and the noyb organization. The result will matter, because it is likely to feed into the other, less prominent discussions about weakening encryption that are taking place around the world.

Featured image by dimitrisvetsikas1969.