Facial recognition technology is already widely used, but it still may not be able to recognize you

Facial recognition technology is everywhere, from iPhones to candy vending machines to the TSA.
However, there is one problem: it may not recognize you.
As Business Insider reported last month, some Walmart Spark delivery workers are experiencing this problem. Since last fall, Spark has asked its drivers to take three selfies through the app, which compares the images to their photo ID as a routine check that drivers are who they say they are.
However, some drivers say the identity check has locked them out of the app, even though they were using accounts registered in their own names.
It turns out that such experiences are not uncommon in the wider world of facial recognition. And they are more common if you are a person of color.

FACIAL RECOGNITION FREQUENTLY MISIDENTIFIES PEOPLE OF COLOR

The New York Times reported in 2018 that Joy Buolamwini, a researcher at the MIT Media Lab, found that facial recognition technology misidentified Black women up to 35 percent of the time, compared with an error rate of about one percent for white men.

Why: AI models are disproportionately trained on photos of white men. According to the study, when the training data contains fewer photos of people from other racial and gender groups, facial recognition becomes less accurate at identifying people from those backgrounds.
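To see why representation in the training data matters, it helps to run a toy simulation. The sketch below is purely illustrative (synthetic numbers, made-up feature dimensions, no real face data or vendor system): it generates "same person / different person" examples in which the informative features differ between two groups, trains a classifier on far more examples from group A than from group B, and then compares error rates. The group the model rarely saw during training comes out with a much higher error rate.

```python
# Hypothetical illustration only: synthetic data, not a real face-recognition model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, signal_dims):
    """Generate synthetic 'same person / different person' examples for one group.
    The features that carry the signal differ by group, standing in for the fact
    that a model must learn each group's characteristics from examples of it."""
    X = rng.normal(size=(n, 6))
    y = rng.integers(0, 2, size=n)              # 1 = genuine match, 0 = impostor
    X[np.ix_(y == 1, signal_dims)] += 2.0       # genuine pairs shift on this group's dims
    return X, y

# Group A dominates the training data; group B is badly underrepresented.
Xa, ya = make_group(5000, signal_dims=[0, 1])
Xb, yb = make_group(250, signal_dims=[4, 5])
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Fresh, equal-sized test sets: the underrepresented group sees far more errors.
for name, dims in [("A (well represented)", [0, 1]), ("B (underrepresented)", [4, 5])]:
    Xt, yt = make_group(4000, signal_dims=dims)
    print(f"group {name}: error rate = {1 - model.score(Xt, yt):.1%}")
```

In this toy setup, retraining with equally many examples from each group brings the two error rates back in line, which is the basic case researchers make for more representative training sets.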

A 2019 federal study found similar evidence, showing that Asians, African Americans, and women are far more likely to be misidentified by the technology than white men.

And a 2022 report from MIT Technology Review found that Uber's facial recognition feature, which, like Spark's, is used for identity verification, may have misidentified drivers in India. One reason is that AI models are not trained as well on South Asian faces as they are on Caucasian faces.

The report also shows that simple changes in appearance, such as growing a beard, can cause the technology to fail to recognize people's faces.
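For identity-verification products like Spark's or Uber's, the failure mode usually comes down to a similarity threshold. The sketch below is a deliberately simplified stand-in, with invented numbers and function names, for how such checks are commonly described: each photo is reduced to a numeric embedding, and the selfie is accepted only if it is similar enough to the embedding stored from the ID photo. Anything that lowers that similarity (a new beard, poor lighting, a low-quality camera) can push a legitimate driver below the threshold.

```python
# Simplified, hypothetical sketch of threshold-based selfie verification.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(id_embedding, selfie_embedding, threshold=0.80):
    """Accept the selfie only if it is similar enough to the stored ID embedding."""
    return cosine_similarity(id_embedding, selfie_embedding) >= threshold

rng = np.random.default_rng(1)
id_photo = rng.normal(size=512)                                  # embedding stored at sign-up

same_person_usual = id_photo + rng.normal(scale=0.3, size=512)   # ordinary day-to-day variation
same_person_beard = id_photo + rng.normal(scale=1.2, size=512)   # larger change in appearance

print(verify(id_photo, same_person_usual))   # expected: True (accepted)
print(verify(id_photo, same_person_beard))   # expected: False (a legitimate user is rejected)
```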


FACIAL RECOGNITION DEVELOPERS SAY MISTAKES ARE POSSIBLE

None of this seems to have stopped the technology from reaching new places. For example, large landlords have been outfitting their office buildings with facial recognition as workers return to the office.

As The New York Times reported in February, airline passengers could be subjected to facial recognition at several points in the airport, from security checkpoints to customs and passport control.
Walmart is partnering with Persona to add facial recognition to its Spark app. While Persona's website doesn't specifically address the misidentification of people of color, it does acknowledge that the technology carries risks.

"For example, a person's glasses can trick a computer into thinking it's a reflection on a screen, or a low-resolution photograph can trick a computer into thinking it's a digital reproduction," the website says.

"These deficiencies could increase the risk of false negatives during the verification process, which could result in legitimate users being denied verification," he continued. Persona and Walmart did not respond to BI's requests for information about the technology or the partnership.
The TSA and CBP did not respond to BI's requests for comment. In February, a TSA spokesperson told The New York Times that phasing out biometric technologies such as facial recognition would "take years," but did not specifically address race.


RETAILERS' ATTEMPTS TO USE FACIAL RECOGNITION TO CATCH SHOPLIFTERS HAVEN'T GONE WELL

In some cases, you may be able to opt out of facial recognition and its error rates. For example, if you are asked to submit to a facial scan instead of showing a boarding pass to board a flight, you can ask the gate agent to check your ID and boarding pass by hand instead, according to CBP.

However, Gideon Christian, a law professor at the University of Calgary who has written about the legal and social implications of facial recognition technology, said the technology often can't be avoided. He pointed to the decision by some retailers to use it in their stores to catch shoplifters.

For example, the FTC announced in December that drugstore chain Rite Aid had improperly used facial recognition technology in “hundreds of stores” to the detriment of women and people of color.

Samuel Levine, director of the FTC's Bureau of Consumer Protection, said at the time that Rite Aid's use of facial recognition technology "exposed customers to humiliation and other harm, violated the order, and compromised sensitive consumer information."

As a result, the FTC banned Rite Aid from using facial recognition technology in its stores for five years. Rite Aid accepted the settlement but disputed the FTC's claims about where and how facial recognition was used. The company said it had only tested the technology in a “limited number of stores,” for example.

Christian said the racial bias of facial recognition mirrors the kind of unfounded suspicion Black people already experience while shopping.

"Facial recognition technology is essentially making the same unsubstantiated accusation: because a face matches another person in the database, an alert is triggered," Christian said in an interview.

Christian told BI that the technology's privacy and accuracy problems should be reason enough to stop using it.

"I don't think the fundamental rights of every person who enters a retail store should be violated just because you want to catch a few shoplifters," he said.
