Children Disagree with Chat Control: They Want Privacy, Not Surveillance

Posted on Jun 8, 2023 by Glyn Moody

One of the laziest justifications for more surveillance is that we must “think of the children” – i.e., that it is acceptable to reduce or even abolish privacy in some contexts in order to protect the most vulnerable in society. As we explained on the PIA blog last year, this is precisely the reasoning behind the EU’s “Chat Control” proposals, which would oblige online service providers to search digital communications for suspicious content – and to do so generally and indiscriminately. The widespread use of end-to-end encryption makes the latter difficult, but not impossible.

The battle over the new EU legislation that would effectively eliminate end-to-end encryption remains fierce. Those in favor of undermining end-to-end encryption claim that today’s messaging systems are so well protected by strong encryption that the police cannot do their job of catching criminals who harm and exploit children. On the other side, privacy campaigners and just about every technical expert point out that it is not possible to break encryption safely.

If backdoors are introduced, they bring with them general weak points that can be exploited by anyone. The alternative, client-side scanning, is arguably even worse.

What Do the Children Think of Ending Online Privacy?

One group whose opinion has rarely been heard is the very group the new Chat Control legislation is supposed to protect: young people. The EDRi network of digital rights organizations and experts has conducted a large-scale survey of 8,000 minors across 13 EU countries. 80% of those questioned said they would not feel comfortable being politically active or exploring their sexuality if authorities were able to monitor their digital communication. Here are the other main results:

  • 66% of respondents don’t approve of Internet providers monitoring their digital communication for suspicious content.
  • 67% rely on encrypted communication apps like WhatsApp or Signal.
  • 1 in 3 respondents use communication apps, dating apps, or other apps to send intimate photos.
  • Only 2% of minors think that scanning all private communications for harmful material is the most effective and appropriate way to protect them from harm on the Internet.

An excellent post by the academic Paul Bernal, written in 2021, points out that while privacy is important for everyone, it is arguably even more important for young people:

We need privacy from those who have power over us – an employee needs privacy from their employer, a citizen from their government, everyone needs privacy from criminals, terrorists and so forth. For children this is especially intense, because so many kinds of people have power over children. By their nature, they’re more vulnerable – which is why we have the instinct to wish to protect them. We need to understand, though, what that protection could and should really mean.

Is Parental Monitoring Justified?

Naturally, parents are the principal group that wishes to protect children. An article in The Guardian last year looked at the growing use of location-tracking apps installed on young people’s mobile phones so that parents can monitor where they are and what they are doing. The scale of this is indicated by the following information about one of the leading apps:

Life360 is used by 32 million people in more than 140 countries; it’s currently the seventh most downloaded social-networking app on the App Store and its San Francisco-based company has been valued at more than $1bn. A survey of 4,000 parents and guardians in the UK in 2019 found that 40% of them used real-time GPS location tracking on a daily basis for their children; 15% said that they checked their whereabouts “constantly”.

The Guardian article makes clear that there is a tension between wanting to keep an eye on children in order to protect them from harm, and the desire to grant them the freedom they need to grow as people – an equally important consideration. There is no obvious or easy solution to this tension; it probably requires negotiation between parents and children on a case-by-case basis.

Schools Can Also Invade Children’s Privacy

Schools are another location where questions of safety and privacy interact in a complex way. MIT Technology Review has a fascinating article about the routine use in Danish schools of software that monitors children and their well-being in order to help them if necessary. For example, the Web app Woof uses a cartoon dog to ask children questions about their life and mood:

Teachers and administrative staff can read weekly reports on a class’s overall self-reported mood and how factors like their sleep hygiene, social activity, academic performance, and physical activity affect that mood. Classrooms are profiled, and interventions are recommended to improve the scores in categories where they are doing less well. Finally, the teacher and the children look at the data together and help each other with tools and strategies to improve these sticking points.

Woof anonymizes the data by providing classroom averages, rather than the results for each child. However, other platforms such as Bloomsights, Moods, and Klassetrivsel have no qualms about providing information about individual children to teachers. The argument for doing so is that anonymized data makes it harder to identify which particular child has problems, and thus to provide timely help.

According to the MIT Technology Review article, Klassetrivsel doesn’t even require consent from parents or children before the app is used. It argues that since the tool is being used for “well-being purposes” at a public institution, it is exempt under Danish law. Fortunately, Danish parents can opt out entirely if they don’t want data collected on their children.

As these various situations involving children indicate, there is a great degree of variability in the consent required, and in the privacy that results. That’s another reason why the simplistic all-or-nothing approach of the EU’s Chat Control proposals is a bad fit for what young people want, and for how things are in real life.

Feature image by Woof Technologies.