Privacy issues with Palantir’s move into law enforcement, and how to tackle them

Posted on Aug 14, 2017 by Glyn Moody

The name “Palantir” bespeaks mystery. It originally referred to the powerful seeing-stones in Tolkien’s “The Lord of the Rings” that allow direct communication with distant lands. It is now also the name of a secretive company, Palantir Technologies, founded in 2004 by a number of Silicon Valley luminaries, chief among them Peter Thiel. At heart, Palantir is a data analysis company, “focused on creating the world’s best user experience for working with data, one that empowers people to ask and answer complex questions without requiring them to master querying languages, statistical modeling, or the command line.”

Its Web site details two main products, Palantir Gotham and Palantir Metropolis. It deploys these in a wide range of markets, including disaster preparedness, healthcare delivery, insurance analytics, crisis response, defense, disease response and pharma R&D. One sector likely to be of particular interest to readers of this blog is Palantir’s work in providing data analysis tools for law enforcement. A striking feature of a corporate white paper on the subject is its constant emphasis on protecting privacy through what it calls “civil liberties engineering”:

“This white paper describes key technical features of the Palantir Platform that can be configured to protect privacy and civil liberties and meet evolving regulatory standards. It also describes our approach to civil liberties engineering as a confluence of efforts working with customers, developers, advisors, and other relevant stakeholders to build and help implement solutions that effectively address the law enforcement challenges of today and tomorrow.”

A key part of protecting the privacy of those whose information is held in Palantir’s police databases is the use of access controls:

“The Palantir Platform is designed with robust and granular access controls that allow LEAs [law enforcement agencies] to ensure users are exposed only to the information they need to do their job and are lawfully entitled to see.”
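To make the idea concrete, here is a minimal sketch – purely illustrative, not Palantir’s actual implementation – of the kind of granular, role-based access control the white paper describes: every record carries sensitivity labels, and a user sees a record only if their role is entitled to all of its labels. All names here (Role, Record, visible_records) are hypothetical.

```python
# Minimal, hypothetical sketch of label-based access control;
# not Palantir's implementation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Role:
    name: str
    permitted_labels: frozenset  # labels this role may see

@dataclass(frozen=True)
class Record:
    record_id: str
    labels: frozenset  # sensitivity/source labels on this record

def visible_records(role, records):
    """Return only records whose labels the role is entitled to see."""
    return [r for r in records if r.labels <= role.permitted_labels]

# A patrol officer sees field interviews; an analyst with broader
# entitlements also sees jail visitation records.
officer = Role("patrol_officer", frozenset({"field_interview"}))
analyst = Role("analyst", frozenset({"field_interview", "jail_visitation"}))
records = [
    Record("r1", frozenset({"field_interview"})),
    Record("r2", frozenset({"jail_visitation"})),
]
print([r.record_id for r in visible_records(officer, records)])  # ['r1']
print([r.record_id for r in visible_records(analyst, records)])  # ['r1', 'r2']
```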

An important new investigation of Palantir’s work with LEAs by Wired reveals that those access controls are not always applied. Moreover, in one particular case, repeated requests to fix this privacy vulnerability went unanswered by Palantir’s engineers. According to Wired, police forces are encountering a wide range of problems when deploying Palantir’s products:

“In the documents our requests produced, police departments have also accused the company, backed by tech investor and Trump supporter Peter Thiel, of spiraling prices, hard-to-use software, opaque terms of service, and ‘failure to deliver products’ (in the words of one email from the Long Beach police).”

Those issues are secondary to the important underlying trends that Wired has discovered. For example, the basic premise of Palantir’s products – that they will help users manage large quantities of data – has encouraged US police forces to add more and more sensitive information in the hope of finding useful links and patterns. This includes databases of regional crime data, field interviews, explosive-related incidents and jail visitation records, plus information about large numbers of people who have never been convicted of any crime – for example, data about cars and their drivers, and the output from automated license plate readers. Analyzing so much data is impossible for human operatives, and so Palantir has automated the process:

“software collects data within certain geographical areas from multiple sources, and is used to provide regular, automatic intelligence updates on critical infrastructure sites – similar to how your Facebook feed scours hundreds of friends’ pages to produce a rolling digest of what it thinks are the most interesting posts. With this kind of algorithmic filtering, no human need ever look at the actual raw intelligence data. But as any Facebook user knows, such filters can produce results of wildly variable quality. And in police work, bad data can be dangerous.”
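As a rough illustration of what such algorithmic filtering might involve – a sketch under assumed inputs, not a description of Palantir’s actual algorithms – consider scoring geotagged intelligence items by proximity to a critical site and by recency, then surfacing only the top few. Every field name and weight below is invented; the point is that items falling below the cut-off may never be seen by a human.

```python
# Hypothetical sketch of feed-style filtering of intelligence items;
# field names and scoring weights are invented for illustration.
import math
import time

def score(item, site_lat, site_lon, now=None):
    """Higher score = closer to the site and more recent."""
    now = now if now is not None else time.time()
    # Rough distance in degrees -- good enough for a sketch
    dist = math.hypot(item["lat"] - site_lat, item["lon"] - site_lon)
    age_hours = (now - item["timestamp"]) / 3600.0
    return 1.0 / (1.0 + dist) + 1.0 / (1.0 + age_hours)

def digest(items, site_lat, site_lon, top_n=3):
    """Return the top_n items for the 'feed'; everything below the
    cut-off is silently dropped, with no human review."""
    ranked = sorted(items, key=lambda i: score(i, site_lat, site_lon),
                    reverse=True)
    return ranked[:top_n]
```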

This kind of approach also leads to opaque results – it is hard for law enforcement officers to understand why particular information is surfaced, and correspondingly hard for members of the public to challenge policing decisions made on the basis of that information. More data also means wider scope for abuse. The increasing reliance by law enforcement agencies on these massive, interlinked databases running on Palantir software brings with it another major problem:

“Palantir sells its technology to police forces on the basis that it breaks down silos, connects databases, and enables sharing between jurisdictions, saving everyone time and resources. The promise, however, comes with one big catch: You don’t get that benefit unless other agencies are also using Palantir.”

This is a classic network effect: the more US police forces use Palantir’s products, the more they will want – or even need – to encourage their colleagues to do the same, in order to derive maximum benefit from the sharing of information. The resulting lock-in has led to rapidly rising prices that police forces find hard to push back against, given their growing dependence on Palantir’s systems.
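A back-of-the-envelope calculation shows why the effect is so strong. If the platform’s value comes from pairwise data sharing, then n agencies on the same system can form n(n−1)/2 sharing links, so each new agency that signs up makes the platform more valuable to every agency already on it:

```python
# With pairwise data sharing, n agencies on one platform support
# n*(n-1)/2 sharing links (Metcalfe-style growth).
def sharing_links(n_agencies: int) -> int:
    return n_agencies * (n_agencies - 1) // 2

for n in (2, 5, 10, 50):
    print(n, sharing_links(n))  # 2->1, 5->10, 10->45, 50->1225
```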

The obvious solution is to find another supplier, or to build an equivalent system from scratch. But as the New York Police Department (NYPD) discovered recently, there is another issue that results from using Palantir’s products. Although Palantir told Buzzfeed that the NYPD’s “data and analysis are available to them at all times in an open and nonproprietary format,” it seems the company refuses to provide the analysis in a format that can be imported into the new system written in-house by NYPD engineers. To retain access to previous analyses of its data, the NYPD would be forced to run two systems – its own plus Palantir’s – which would negate much of the benefit of both.
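By way of contrast, here is what a genuinely open, nonproprietary export might look like: both the raw records and the analyses derived from them, serialized as plain JSON that any replacement system could parse. The record and analysis structures below are invented for illustration and reflect nothing about Palantir’s actual formats.

```python
# Hypothetical open export: raw records plus derived analysis,
# written as plain JSON that any in-house system could import.
import json

raw_records = [
    {"id": "r1", "type": "field_interview", "location": "Long Beach"},
    {"id": "r2", "type": "license_plate_read", "plate": "ABC123"},
]
analysis = [
    # e.g. a link the software inferred between two records
    {"link": ["r1", "r2"], "reason": "shared_address", "confidence": 0.8},
]

with open("export.json", "w") as f:
    json.dump({"records": raw_records, "analysis": analysis}, f, indent=2)
```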

If protecting fundamental rights to privacy and civil liberties is indeed a “core component” of Palantir’s mission, as it trumpets in its 2016 Annual Report, it should offer a truly open and nonproprietary format for transferring both data and analyses – something it seems reluctant to do, judging by the NYPD’s experience. That would allow other companies to offer solutions supporting those open formats, as an alternative to being locked into a Palantir monoculture. The history of technology teaches us that competition leads to better, safer solutions: a serious security failure in a monoculture propagates widely, causing proportionately greater harm to privacy than one that affects only part of the data analysis ecosystem. Moreover, fully open systems are more transparent in their workings and more amenable to meaningful oversight. It’s not as if you need a crystal ball to see these things…

Featured image by Palantir.


2 Comments

  1. Claudia Hacker

    Ridiculous premise! If GM has a new car component, should it share it with other car companies? I understand it’s not the same thing, but my understanding is that Palantir is a for-profit company. They would not get far if they did all the initial work and then, once the hard part was done, gave away their methods to others. Sounds like the author has a personal vendetta to me!

    1. Glyn Moody

      It’s not about giving away the methods, just the results of those methods. This is about sharing the data and the analyses of that data, both of which belong to the law enforcement agencies. Palantir says it supports open data formats, but in practice it doesn’t. It should, for everyone’s sake.
