US FTC Order Sets Minimum Privacy Requirements for Biometric Surveillance in Commercial Settings

Posted on Jan 11, 2024 by Glyn Moody

Over the last couple of years, it has become increasingly common for customers to be watched, tracked, and analyzed in stores. A key element of this surveillance is AI-based facial recognition. Companies say that the facial recognition software helps prevent shoplifting by recognizing people who have allegedly stolen goods before, even though it recently emerged that claims about “organized” shoplifting have been exaggerated.

However, facial recognition technology is not a panacea, and represents a serious threat to privacy. This was recognized by the US Federal Trade Commission (FTC) as far back as 2012, when it issued best practices for companies using facial recognition. In May 2023, the FTC warned about the misuse of facial recognition technology and how it raised “significant consumer privacy issues and the potential for bias and discrimination.” Now, the FTC has announced that it is banning the Rite Aid drugstore chain from using AI facial recognition for surveillance over the next five years:

In a complaint filed in federal court, the FTC says that from 2012 to 2020, Rite Aid deployed artificial intelligence-based facial recognition technology in order to identify customers who may have been engaged in shoplifting or other problematic behavior. The complaint, however, charges that the company failed to take reasonable measures to prevent harm to consumers, who, as a result, were erroneously accused by employees of wrongdoing because facial recognition technology falsely flagged the consumers as matching someone who had previously been identified as a shoplifter or other troublemaker.

According to the FTC, Rite Aid’s action subjected consumers to “embarrassment, harassment, and other harm”. The FTC says that the company’s employees, acting on false alerts, “followed consumers around its stores, searched them, ordered them to leave, called the police to confront or remove consumers, and publicly accused them, sometimes in front of friends or family, of shoplifting or other wrongdoing”. A particular issue is that these actions disproportionately affected people of color.

An FTC blog post provides more details about the surveillance system. According to the FTC, Rite Aid oversaw the creation of a “watchlist database” of images of people the company claimed had engaged in actual or attempted criminal activity at one of its stores. Along with often low-quality images taken from CCTV, the database stored information such as first and last names, year of birth, and the alleged criminal activity. Eventually, the watchlist database held information about tens of thousands of people. If someone entered the store who supposedly matched an image in the database, employees received an alert. The FTC post includes details of how badly the system allegedly worked, including the following:

During one five-day period, Rite Aid generated over 900 separate alerts in more than 130 stores from New York to Seattle, all claiming to match one single image in the database. Put another way, Rite Aid’s facial recognition technology told employees that just one pictured person had entered more than 130 Rite Aid locations from coast to coast more than 900 times in less than a week.

The FTC complaint alleges that Rite Aid failed to:

  • Consider and mitigate potential risks to consumers from misidentifying them
  • Test, assess, measure, document or inquire about the accuracy of its facial recognition technology before deploying it
  • Prevent the use of low-quality images in connection with its facial recognition technology
  • Monitor or test the accuracy of the technology regularly after it was deployed
  • Adequately train employees tasked with operating facial recognition technology in its stores, or flag that the technology could generate false positives

The proposed settlement would ban Rite Aid from using facial recognition systems at its retail stores for five years. Moreover, the company would have to delete the videos and photos it collected between 2012 and 2020, as well as the data, models, and algorithms involved. And as the FTC blog post explains:

If Rite Aid has an automatic biometric security or surveillance system in place in the future, under the proposed order, it must give individualized, written notice to any consumer the company adds to its system and anyone that it takes action against as a result. Rite Aid also would have to implement a robust consumer complaint procedure. In addition, the company would have to clearly disclose to consumers at retail locations and online if it’s using automatic biometric security and surveillance and the notices must be placed where consumers can read them in time to avoid the collection of their biometric information.

What makes this FTC order so important is that it is not simply about one company. It establishes a wide-ranging set of requirements that must be met by any company that wants to use biometric surveillance and automated decision-making technology. As FTC Commissioner Alvaro M. Bedoya emphasized in a statement on the move: “I want industry to understand that this Order is a baseline for what a comprehensive algorithmic fairness program should look like.” In this respect, it could play a role in the US similar to the one that the various privacy laws play in the EU.

Bedoya added another important comment: “No one should walk away from this settlement thinking that this Commission affirmatively supports the use of biometric surveillance in commercial settings”. He even went so far as to say: “there is a powerful policy argument that there are some decisions that should not be automated at all; many technologies should never be deployed in the first place.” That’s a strong statement about the importance of privacy, and how it must be protected when powerful new technologies are introduced. Bedoya urged legislators who want better protections against biometric surveillance to pass laws to that effect. Even if that doesn’t happen, this latest FTC action alone should enhance privacy protections in the US, and support those fighting to do the same elsewhere.

Featured image by Carol M. Highsmith.