Facial recognition concerns go mainstream in the US, as cities and companies bring in bans

Posted on Aug 3, 2019 by Glyn Moody

Recently, FaceApp was much in the headlines. It allows users to submit photos of faces and modify them in interesting ways – making people look older or younger, or changing their expression. It was undoubtedly a cool use of AI technology. But what was most interesting about the episode is how quickly people realized that there were important privacy implications. Many worried that their images were being harvested for use in facial recognition systems. The fact that the company behind FaceApp was Russian added to the fears, even though those fears seem largely unjustified.

The storm over FaceApp is an indication that awareness of facial recognition – and its very real dangers – has now gone mainstream. One remarkable demonstration of this is that one of the leading police body-cam companies, Axon, has voluntarily banned the use of facial recognition on its devices. That follows the release of the first report by the Axon AI & Policing Technology Ethics Board, which wrote:

Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras. At the least, face recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across races, ethnicities, genders, and other identity groups.

Mirroring this new appreciation of the risks, there are an increasing number of moves in the US to limit the use of facial recognition. In May of this year, San Francisco became the first US city to ban the use of facial recognition by police and other agencies. Somerville, Massachusetts, was next to introduce a halt, followed by Oakland, which intends to “prohibit police from both acquiring the software or using it at all, including if used by other police agencies”. A useful map from Fight for the Future shows both the places with bans and the rather more widespread use of facial recognition by airports and by local and state police.

Alongside what might be called “traditional” applications of facial recognition, there’s a new context where its use raises important privacy issues. It arises from the increasing availability of low-cost “smart” home products, something that has been discussed a number of times on Privacy News Online. The technology offers landlords new opportunities to carry out continuous and routine surveillance of their tenants. Some want to use facial recognition systems to replace conventional key systems. As well as potential technical problems such as faces not being recognized or unauthorized individuals being granted access, there is a more general concern. Such systems could infringe on people’s privacy by keeping detailed records of when they enter and leave – and with whom. Federal legislation has been drawn up in a first attempt to address this issue, as CNET reports:

The proposed bill would prohibit all public housing units that receive funding from the Department of Housing and Urban Development from using technology like facial recognition, according to a person familiar with the legislation.

The bill would also require HUD to submit a report on facial recognition, detailing its impact on public housing units and their tenants.

Fight for the Future wants to go even further. It has launched a campaign to outlaw facial recognition in the US completely:

Like nuclear or biological weapons, facial recognition poses a threat to human society and basic liberty that far outweighs any potential benefits. Silicon Valley lobbyists are disingenuously calling for light “regulation” of facial recognition so they can continue to profit by rapidly spreading this surveillance dragnet. They’re trying to avoid the real debate: whether technology this dangerous should even exist. Industry-friendly and government-friendly oversight will not fix the dangers inherent in law enforcement’s use of facial recognition: we need an all-out ban.

Part of the problem is that it is so easy to gather facial images. For example, it recently emerged that the FBI and US Immigration and Customs Enforcement are both mining state driver’s license databases for photos of US citizens, without their consent. The images are then run through facial recognition software to spot people of interest, such as immigrants.

More generally, the Internet is a rich resource that can be mined for images of people’s faces. Companies like Facebook and Google are gathering huge numbers every day, which they can then feed into their algorithms for various purposes. For those without direct access to such datasets, there are other routes. For example, popular sites with facial images can be scraped, usually without the permission of the owner. There are also face and person recognition datasets that were created for academic purposes. The site Megapixels is gathering information to create a consolidated resource about them. A typical example is Microsoft’s Celeb Dataset. According to Megapixels:

MS Celeb is the largest publicly available face recognition dataset in the world, containing over 10 million images of nearly 100,000 individuals. Microsoft’s goal in building this dataset was to distribute an initial training dataset of 100,000 individuals’ biometric data to accelerate research into recognizing a larger target list of one million people “using all the possibly collected face images of this individual on the web as training data”.

The dataset was first released in 2016, but later withdrawn as concerns grew about how it might be misused. However, as Megapixels notes, it is still available in several repositories on GitHub, and as personal copies held by researchers around the world. The key problem is that once such images have been released, there is no way to stop them being shared further. The same is true of the results of facial recognition. If a name has been associated with a face, that linkage is likely to be propagated to other collections that incorporate the data. Once anonymity is lost in the digital world it is hard, if not impossible, to get it back.
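To see why that loss of anonymity is so hard to reverse, consider a minimal, hypothetical sketch using the open-source face_recognition Python library. It assumes nothing beyond a single photo that was ever shared with a name attached (the file names below are placeholders, not real data): that one labeled image is enough for anyone to re-identify the same person in any later, supposedly anonymous picture.

```python
# Hypothetical sketch: one labeled photo is enough to re-identify a person
# in new, unlabeled images. File names are placeholders; face_recognition
# is the open-source package by Adam Geitgey (pip install face_recognition).
import face_recognition

# A photo that was once shared or scraped with a name attached.
known_image = face_recognition.load_image_file("leaked_labeled_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Any new, "anonymous" photo, e.g. from a social network or a CCTV still.
unknown_image = face_recognition.load_image_file("new_anonymous_photo.jpg")

for encoding in face_recognition.face_encodings(unknown_image):
    # compare_faces returns True when the two faces likely belong to the same person.
    if face_recognition.compare_faces([known_encoding], encoding)[0]:
        print("Person re-identified: the name travels with the face.")
```

The point is not the specific library, but that the matching step is now a few lines of commodity code: once the face-to-name link exists anywhere, every copy of the data can reproduce it.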

Featured image of Microsoft’s Celeb dataset by Megapixels.