EFF Agrees: Protecting Privacy Is the Best Way to Address the Internet’s Biggest Problems

Posted on Nov 24, 2023 by Glyn Moody

The Electronic Frontier Foundation (EFF) is probably the best-known organization that’s fighting for digital rights. It’s just released a new report with a title that’s certain to interest the readers of this blog. Privacy First: A Better Way to Address Online Harms offers a wide-ranging look at some of the key problems in the online world along with suggestions for how to address them. Strikingly, it identifies one overriding factor in all these problems:

The truth is many of the ills of today’s internet have a single thing in common: they are built on a system of corporate surveillance. Multiple companies, large and small, collect data about where we go, what we do, what we read, who we communicate with, and so on. They use this data in multiple ways and, if it suits their business model, may sell it to anyone who wants it—including law enforcement. Addressing this shared reality will better promote human rights and civil liberties, while simultaneously holding space for free expression, creativity, and innovation than many of the issue-specific bills we’ve seen over the past decade.

This is precisely what this blog has been pointing out for years: corporate surveillance, particularly in the form of surveillance advertising, is one of the biggest threats to online privacy, and addressing it will reduce many of today’s most serious problems. The EFF report provides an excellent explanation of how surveillance advertising lies at the heart of those issues.

For example, many people are worried about the impact of social media algorithms on children’s health. The EFF points out that the aggregation of children’s personal data allows predatory and exploitative ads to target children. Banning online behavioral advertising would therefore remove most of the incentive to collect and weaponize children’s preferences, along with the drive to turn children into consumers and many of the concerns about social media use by young people. As the EFF puts it, this is a strategy that focuses on “the key reasons underlying these harms, rather than trying to stick a band-aid over the top.”

Another concern is that law enforcement agencies will be able to access personal data gathered through surveillance advertising, as well as from other sources. If surveillance advertising is banned, there is far less data available for the authorities to demand from companies. Similarly, the impact of online services on local journalism is a worry that can be addressed by cutting off the huge profits made from behavioral advertising. Instead, echoing many PIA posts, the EFF calls for a different approach:

“Contextual ads” can limit that claimed competitive advantage and protect users from tracking. True contextual ad markets are harder for tech giants to capture. While a tech company may know everything about a reader’s web history and recent purchases, no one knows more about the content of a publication than its direct publisher.

In the same way, banning surveillance advertising would reduce the disproportionate power of internet giants who derive most of their profits from exploiting these huge data stores. In turn, this would allow more competition in the social media sector, since the playing field would be less tilted in favour of bigger companies. Even worries about foreign government surveillance – for example, by China thanks to the rise of TikTok, or indirectly through data brokers – can be assuaged by cutting off the supply of highly personal information gathered through surveillance advertising. The EFF notes that growing concerns about the implications of AI – something we wrote about back in February – can also be lessened if privacy protections are improved in general.

In terms of what new, stronger privacy laws addressing these problems should contain, the EFF has a handy checklist of key components. They include:

  • No behavioral ads based on surveillance advertising – this is the key to everything
  • Real minimization – only strictly necessary personal data can be processed by companies
  • Strong opt-in consent, which must be informed, voluntary, and specific
  • User rights allowing people to access their data, port it elsewhere, correct it, and delete it
  • No preemption by federal law in the case of the US – federal privacy law must be a floor, not a ceiling
  • Strong enforcement and meaningful impact
  • No pay-for-privacy policies – something that Meta just introduced in the EU
  • No deceptive designs in the form of dark patterns

The rest of the EFF’s report fills out many of these ideas. In doing so, it links back to dozens of its earlier posts about privacy, making this new document an easy way to find those important analyses. The report concludes by noting that:

Doing privacy first is an alternative, practical, way forward that has a real shot at solving the shared problem that fuels many of today’s harms. It creates a path toward a better future, where the interests of the companies that create the technical platforms and tools that we all rely on are better aligned with our interests in living our lives consistent with human rights and civil liberties. We would be less stuck in a world of relentless tracking, discrimination, and the technology monopolies that limit and control our access to information and opportunities.

It’s great that the EFF has come out so clearly for something we have been advocating for six years now: moving from intrusive, privacy-busting surveillance advertising to the more respectful, but equally effective, contextual approach. Now we just have to convince the politicians.

Featured image by EFF.