Yeah, So It Turns Out Rite Aid Has Been Using Face Recognition at Hundreds of Stores

A Rite Aid in Chelsea, January 2018.
Photo: Bryan A. Smith/AFP (Getty Images)

Rite Aid used face recognition to scan the faces of every single customer who walked into hundreds of its stores in a program encompassing most of the past decade, Reuters reported on Tuesday.

The program spanned eight years, involved two vendors, and was deployed in 200 stores, a disproportionate number of which were located in non-white, low-income neighborhoods. CCTV cameras would record everyone entering those stores. Their faces would then be analyzed and added to a unique “profile,” though customers were very likely unaware of its existence. Managers could also approve requests from Rite Aid loss prevention (security) staffers to add individuals they found suspicious or potentially engaged in criminal activity to a watch list. Any time the system identified someone on the watch list entering a store, staffers would receive an alert prompting them to check the match for accuracy. They could then take action such as asking the person to leave.
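Rite Aid hasn’t published technical details of how the matching worked, but the workflow described above follows the broad shape of a typical watch-list pipeline: each detected face is converted into an embedding vector, compared against the vectors enrolled on the watch list, and flagged for human review if the similarity clears a threshold. Here’s a minimal sketch of that general idea; the function names, threshold, and embedding source are hypothetical, not anything from Rite Aid’s or its vendors’ actual systems.

```python
import numpy as np

# Hypothetical illustration only. In a real system, embeddings would come from
# a face-recognition model that maps a cropped face image to a fixed-length
# vector; the watch list is a set of previously enrolled embeddings.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.6):
    """Return (profile_id, score) for the best watch-list match above the
    threshold, or None if nothing clears it. A deployment like the one Reuters
    describes would then alert staff, who are supposed to verify the match."""
    best_id, best_score = None, threshold
    for profile_id, enrolled in watchlist.items():
        score = cosine_similarity(face_embedding, enrolled)
        if score > best_score:
            best_id, best_score = profile_id, score
    return (best_id, best_score) if best_id is not None else None
```

Where that threshold is set matters: set it too low and superficially similar faces, or even matching accessories, will trigger hits, which is exactly the kind of false match staffers later described to Reuters.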

Reuters visited all 75 Rite Aid locations in Manhattan and central Los Angeles from October 2019 to July 2020, finding face recognition systems in 33 of them. Stores located in impoverished neighborhoods or in areas with predominantly non-white populations were significantly more likely to have the cameras:

Stores in more impoverished areas were nearly three times as likely as those in richer areas to have facial recognition cameras. Seventeen of 25 stores in poorer areas had the systems. In wealthier areas, it was 10 of 40. (Ten of the stores were in areas whose wealth status was not clear. Six of those stores had the equipment.)

... Of the 65 stores the retailer targeted in its first big rollout, 52 were in areas where the largest group was Black or Latino, according to Reuters’ analysis of a Rite Aid planning document from 2013 that was read aloud to a reporter by someone with access to it. Reuters confirmed that some of these stores later deployed the technology but did not confirm its presence at every location on the list.

Face recognition tech is demonstrably racist: research has shown that systems relying on it often have high error rates and fare worse when trying to identify Black and other people of color, particularly women. Even if developers somehow manage to eliminate racial bias in recognition, that doesn’t eliminate bias on the part of the operator, so the technology can easily be weaponized to target people of color, activists, immigrants and refugees, the homeless, or other groups. Claire Garvie, an author of two reports by the Georgetown Law Center on Privacy & Technology, told the New York Times last year that over 2,800 people had been arrested from 2010 to 2016 based on face recognition scans and that the technology had been used in over 8,000 cases in 2018.

Despite the obvious potential for face recognition to serve as an arbitrary pretext for police action, there’s no federal law governing its use. That means it is completely unregulated except in a handful of states that have passed biometric privacy laws and in cities that have banned the use of face recognition by government officials.

According to Reuters, from 2012 to 2017 Rite Aid relied on a system called FaceFirst, which had high error rates and spat out hundreds of hits when prompted with fuzzy photos.

“It doesn’t pick up Black people well,” a staffer who worked at a location in a Black neighborhood in Detroit told Reuters. “If your eyes are the same way, or if you’re wearing your headband like another person is wearing a headband, you’re going to get a hit.”

In 2018, Rite Aid began transitioning to technology provided by DeepCam, which staffers told the news agency was considerably more accurate and took pictures every time a person stepped in front of a camera, using machine learning to build unique profiles of their faces. DeepCam also had close business ties to a Chinese company that was in turn mostly capitalized by a Chinese government fund, according to Reuters. The news agency reported it couldn’t find any evidence that data had been transferred to China.

A Rite Aid spokesperson told Reuters the cameras were clearly labeled with “signage,” but Reuters found that over a third of the 75 stores it visited had no signs informing customers and staff that they were under surveillance. The spokesperson also said the system was only ever intended to prevent crime and relied on “multiple layers of meaningful human review,” and that the retailer has now turned off DeepCam in all of its stores.

“This decision was in part based on a larger industry conversation,” the spokesperson told Reuters. “Other large technology companies seem to be scaling back or rethinking their efforts around facial recognition given increasing uncertainty around the technology’s utility.”

“We cannot stand for racial injustice of any kind, including in our technology,” FaceFirst CEO Peter Trepp told Reuters, saying that the report contained “extensive factual inaccuracies” and was not based on “credible sources.”

While some companies are backing away from face recognition (for now, anyway), the trend appears to be moving in the other direction on the national level. Airports in Hawaii are using face recognition to screen for individuals who may have the coronavirus, face recognition cameras are starting to be deployed in schools, and the federal government and police departments across the country are using it, largely with no safeguards other than contractual terms laid out by vendors.

Following widespread protests against police racism and the police killing of George Floyd in Minneapolis, IBM announced it would no longer work on face recognition tech at all. But Microsoft has said only that it will maintain a moratorium on police sales until federal laws regulating the technology’s use are passed. Amazon, which has close relationships with the Departments of Defense and Homeland Security, said in June it would halt police sales for only a year. Numerous smaller surveillance firms have continued to provide face recognition tools to police.

[Reuters]
