It Ain't Me, Babe: Researchers Find Flaws In Police Facial Recognition Technology

Stephen Lamm, a supervisor with the ID fraud unit of the North Carolina Department of Motor Vehicles, looks through photos in a facial recognition system in 2009 in Raleigh, N.C. (Gerry Broome / AP)

Nearly half of all American adults have been entered into law enforcement facial recognition databases, according to a recent report from Georgetown University's law school. But the technology has accuracy problems that could affect a lot of innocent people.

There's a good chance your driver's license photo is in one of these databases. The report from the school's Center on Privacy & Technology says more than 117 million adults' photos are stored in them. Facial recognition can be used, for instance, when investigators have a picture of a suspect but not a name.

They can run the photo through a facial recognition program to see if it matches any of the license photos. It's kind of like a very large digital version of a lineup, says Jonathan Frankle, a computer scientist and one of the authors of the report, titled "The Perpetual Line-Up."
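
As a rough illustration of how that kind of search works (a sketch, not the specific software any police agency uses), modern systems typically convert each face photo into a numerical "embedding" and compare the suspect's photo against every license photo by similarity. Every name and number below is hypothetical.

import numpy as np

def cosine_similarity(a, b):
    # Similarity between two face embeddings; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_lineup(probe, gallery, threshold=0.6):
    # Compare the probe photo's embedding against every license-photo embedding
    # and return the candidates that clear a (hypothetical) similarity threshold,
    # best score first -- the digital "lineup" an investigator would then review.
    scored = [(person_id, cosine_similarity(probe, emb))
              for person_id, emb in gallery.items()]
    hits = [(pid, s) for pid, s in scored if s >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# Toy data: random vectors stand in for embeddings computed from real photos.
rng = np.random.default_rng(0)
gallery = {f"license_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["license_42"] + rng.normal(scale=0.3, size=128)  # a noisy photo of person 42
print(search_lineup(probe, gallery)[:5])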

"Instead of having a lineup of five people who've been brought in off the street to do this, the lineup is you. You're in that lineup all the time," he says. Frankle says the photos that police may have of a suspect aren't always that good — they're often from a security camera.

"Security cameras tend to be mounted on the ceiling," he says. "They get great views of the top of your head, not very great views of your face. And you can now imagine why this would be a very difficult task, why it's hard to get an accurate read on anybody's face and match them with their driver's license photo."

Frankle says the study also found evidence that facial recognition software didn't work as well with people who have dark skin. There's still limited research on why. Some critics say the developers aren't testing the software against a diverse enough group of faces; another possible factor is lighting.

"Darker skin has less color contrast. And these algorithms rely on being able to pick out little patterns and color to be able to tell people apart," Frankle says.

Because of its flaws, facial recognition technology does bring a lot of innocent people to the attention of law enforcement.

Patrick Grother says most people have a few doppelgangers out there. He's a computer scientist with the National Institute of Standards and Technology — part of the Commerce Department. "The larger you go, the greater the chance of a false positive," he says. "Inevitably if you look at a billion people you will find somebody that looks quite similar."
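
To put numbers on that, here is a back-of-the-envelope sketch. The per-comparison false-match rate below is purely hypothetical, and the calculation assumes each comparison behaves independently, but it shows how the odds of at least one false hit climb as the database grows.

# Hypothetical per-comparison false-match rate (illustrative only).
false_match_rate = 1e-6

# Chance that at least one wrong person "matches" when a single probe photo is
# compared against every photo in a gallery, assuming independent comparisons.
for gallery_size in (10_000, 1_000_000, 100_000_000):
    p_false_hit = 1 - (1 - false_match_rate) ** gallery_size
    print(f"{gallery_size:>11,} photos -> {p_false_hit:.1%} chance of a false hit")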

And even with the photos taken at the Department of Motor Vehicles, there can be differences in how they are shot. Grother thinks if those photos are going to be used for facial recognition, more uniform standards in lighting, height and focus are needed. "Without those things, without those technical specifications, then face recognition can be undermined," he says.

And yet facial recognition software is sophisticated enough to be useful in critical situations. Anil Jain, a computer science professor at Michigan State University, did an experiment after the Tsarnaev brothers, who committed the Boston Marathon bombings, were caught. He wanted to see if facial recognition technology could have helped police name them sooner. Police had photos of them from a security camera. Jain ran those photos against a database of a million driver's licenses. The software found 10 matches for the younger brother.

"We were able to locate him in the top 10 candidates," Jain says. "But the older brother we couldn't locate, and the reason was he was wearing the dark glasses."

Of course, it also identified nine people who were not guilty.
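
Jain's test is an example of what the field often calls rank-based identification: instead of a single yes-or-no answer, the software returns the gallery photos most similar to the probe, and a search counts as a hit if the right person lands anywhere in that short list. A simplified sketch of the idea, reusing the kind of embedding comparison shown earlier (not the actual software Jain used):

import numpy as np

def top_candidates(probe, gallery, k=10):
    # Rank every gallery photo by similarity to the probe and keep the k best.
    # A search like Jain's counts as a success if the true person appears
    # anywhere in this candidate list ("rank-10" identification).
    scored = []
    for person_id, emb in gallery.items():
        score = float(np.dot(probe, emb) /
                      (np.linalg.norm(probe) * np.linalg.norm(emb)))
        scored.append((score, person_id))
    scored.sort(reverse=True)
    return [person_id for _, person_id in scored[:k]]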

In a statement responding to the Georgetown study, the FBI says it uses facial recognition only as an investigative lead, not for positive identification.

The Georgetown authors aren't saying that this technology should never be used — only that lawmakers need to create standards; otherwise, it can be misused and harm innocent people.


Copyright 2021 NPR. To see more, visit https://www.npr.org.

Laura Sydell fell in love with the intimate storytelling qualities of radio, which combined her passion for theatre and writing with her addiction to news. Over her career she has covered politics, arts, media, religion, and entrepreneurship. Currently Sydell is the Digital Culture Correspondent for NPR's All Things Considered, Morning Edition, Weekend Edition, and NPR.org.