Two Years After the Privacy Risks of Client-Side Scanning Paper First Appeared, It’s Time for Governments to Move On
Back in 2021, we wrote about an important preprint written by 13 of the world’s top security gurus, including Ross Anderson, Whitfield Diffie, Ronald L. Rivest, and Bruce Schneier. It concerns the serious problem of Child Sexual Abuse Material (CSAM), and the idea of tackling it using client-side scanning (CSS) – that is, code on your smartphone that would constantly scan images for illegal material. The experts had no doubts about how harmful the approach would be to privacy. “The introduction of scanning on our personal devices—devices that keep information from to-do notes to texts and photos from loved ones—tears at the heart of privacy of individual citizens,” they wrote.
The preprint has now been published in the Journal of Cybersecurity. In the intervening two years, the debate over CSS has not diminished, and there has been one significant development which is reflected in a new section in the text. In May 2022, the European Commission released a proposal “laying down rules to prevent and combat child sexual abuse.”
As is typical of much contentious legislation, the proposal requires online companies to do something – in this case, to recognize CSAM in both known and “previously unseen” forms – but does not specify how this is to be done. Although this is dressed up as a “technologically neutral approach,” it is really about concealing the fact that nobody knows how to detect CSAM reliably without breaking encryption, which the EU professes to support. As the security experts note in their new section on the EU proposal:
“So while the proposed regulation specifically endorses end-to-end encryption, noting that it is ‘an important tool to guarantee the security and confidentiality of the communications of users, including those of children,’ the reality is that there are no feasible solutions for [scanning] images that may or may not exist within an encrypted message. Thus, the technical capabilities within the proposal cannot be met without either CSS or abandoning end-to-end encryption. That leaves only client-side scanning, a subject on which the proposal is completely silent. But as we have observed here, such a ‘solution’ cannot be efficacious. Thus, even without considering encryption or CSS, the scheme cannot be implemented with current technology.”
The EU’s vague plans effectively mandate CSS, which makes the newly published paper exploring its many problems even more relevant. The previous PIA blog post explained the two main technologies used for image scanning – perceptual hashing and machine learning – and went on to look at the particular security risks of using them to carry out surveillance on the client. Here, I’d like to revisit this important work by focusing on the section devoted explicitly to privacy. Perhaps the most obvious threat that CSS represents to privacy is the following:
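To make the mechanics concrete, here is a minimal sketch of the perceptual-hashing half of that picture, using a simple “average hash.” Everything in it is illustrative: the 8×8 hash size, the Hamming-distance threshold of 5, and the function names are assumptions made for this example, and real deployments (Microsoft’s PhotoDNA, Apple’s NeuralHash) use far more sophisticated, often proprietary, algorithms. The point is only to show the basic shape: reduce an image to a short fingerprint, then compare that fingerprint against a list of fingerprints of known illegal material.

```python
# Toy illustration of perceptual-hash matching ("average hash").
# Not any vendor's actual implementation; hash size and threshold
# are arbitrary choices for the example. Requires Pillow.
from PIL import Image

HASH_SIZE = 8          # 8x8 grid -> 64-bit fingerprint
MATCH_THRESHOLD = 5    # max Hamming distance treated as a "match"

def average_hash(path: str) -> int:
    """Shrink to an 8x8 grayscale grid, then set one bit per pixel:
    1 if the pixel is brighter than the mean, else 0."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches_list(path: str, target_hashes: set[int]) -> bool:
    """Report whether an image is 'close enough' to any targeted hash."""
    h = average_hash(path)
    return any(hamming(h, t) <= MATCH_THRESHOLD for t in target_hashes)
```

Note that the matcher never sees the targeted images themselves, only their hashes – a property that matters for the repurposing threats discussed below.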
“The deployment of CSS changes the game completely by giving access to data stored on users’ devices. First, it facilitates global surveillance by offering economies of scale. Second, while proposals are typically phrased as being targeted to specific content, such as CSAM, or content shared by users, such as text messages used for grooming or terrorist recruitment, it would be a minimal change to reconfigure the scanner on the device to report any targeted content, regardless of any intent to share it or even back it up to a cloud service. That would enable global searches of personal devices for arbitrary content in the absence of warrant or suspicion. Come the next terrorist scare, a little push will be all that is needed to curtail or remove the current protections. Automated reporting provides a means to scale up such attacks.”
This is the fundamental privacy problem with client-side scanning. Once code designed to scan material is present on a device, there is nothing to stop it being extended – perhaps incrementally or surreptitiously – to go further. That might include scanning every service and data store on the device, without any need to go through the judicial system to obtain permission. Because the scanning software would run constantly and opaquely, people would not even know what had been scanned or what had been sent back to central servers.
The other obvious threat involves an expansion of the target material. Politicians invoke the fight against CSAM because nobody could object to tackling the problem. But once CSAM scanners are in place, the technology can easily be repurposed to seek out other material, and there are multiple ways to do this. An authorized party can extend the search by demanding that new material be added – by the service provider, by the organization that compiles the list of targeted content, or by the provider training the machine-learning model. Unauthorized parties could achieve the same thing through illicit means, such as bribing or coercing staff, or breaking into the computers they use.
This threat is particularly troubling in countries that already use the internet to throttle political debate and spy on opponents. Once CSS technology is built into operating systems and devices, modifying the search list becomes a simple task, and repressive governments could quickly demand that additional material be blocked or tracked. Software and hardware companies would have little option but to obey: if they refused, a range of punitive measures is readily available, including fines, bans, and even the imprisonment of local employees. The only thing stopping this today is the absence of CSS capabilities on current devices. Once the capability exists, it will almost certainly be abused.
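The technical barrier to such repurposing is essentially zero. In a matcher like the toy sketch above, the target list is just an opaque set of numbers: nothing in the code knows or cares what the hashes represent, so “extending the search” is a data update, not a software change. The update function below is hypothetical, but it illustrates how little would separate a CSAM scanner from a general-purpose one:

```python
# Continuing the toy example: the scanner's behavior is controlled
# entirely by an opaque list of hashes, assumed here to arrive from
# a central server (the delivery mechanism is out of scope).

target_hashes: set[int] = set()

def apply_list_update(new_hashes: list[int]) -> None:
    """Hypothetical update path. Whoever controls the list controls
    what the device reports: known CSAM today, leaked documents or
    dissident imagery tomorrow. The code is identical either way,
    and the device has no means of auditing what it is matching."""
    target_hashes.update(new_hashes)
```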
It is rather depressing that the debate about CSS has not moved on in the two years since the original preprint appeared. The basic facts about the technology and its weaknesses have not changed, while the importance of protecting digital privacy has increased, for reasons the paper’s authors explain:
“In a world where our personal information lies in bits carried on powerful communication and storage devices in our pockets, both technology and laws must be designed to protect our privacy and security, not intrude upon it. Robust protection requires technology and law to complement each other. CSS would gravely undermine this, making us all less safe and less secure.”
It is time for governments to understand this and move on to alternative approaches to tackling CSAM – ones that don’t involve breaking encryption or putting spies in our pockets.