
ICE's Use Of Facial Recognition Tech Could Threaten Women Of Color

Face recognition tech is notoriously inaccurate when it comes to ID'ing people of color. So what might ICE's use of the tech mean?


Although they weren't present, both the FBI and ICE came under fire at a bipartisan hearing this week on the ethics of using facial recognition technology in their work. The Washington Post recently reported that ICE has been secretly using the tool to comb through driver's license photos to target undocumented immigrants in states where they are legally allowed to get licenses.

Lawmakers argue the practice violates citizens' privacy and want more regulation, while activists want the technology banned altogether. Beyond the privacy concerns, one of the main arguments against the technology is that it has been found to misidentify people of color, especially women.

"Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?" 

Joy Buolamwini is a computer scientist at MIT who found that facial recognition technology misclassified darker women's gender nearly 35% of the time. Meanwhile, when it came to white men, it was inaccurate less than 1% of the time.

Researchers: FBI, ICE Use Facial Recognition Software On DMV Photos

Georgetown researchers say the FBI and ICE have been scanning images for half a decade.
Buolamwini: "When we analyze … by subgroups, we found all companies performed worse on darker females."

The ACLU also ran its own test of Amazon's facial recognition technology and found it mistakenly matched members of Congress with mugshots of people who had been arrested. The false matches were disproportionately people of color.

Suresh Venkatasubramanian, a computer science professor at the University of Utah, says: "So what you're going to get is, for example, a system that's trained to do facial recognition is likely to have more of a false positive on minority groups, which means more people are going to be caught up in dragnets for no reason because the system flags them as a false positive. And that's going to be a big problem."
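Venkatasubramanian's point is arithmetic as much as policy. As a rough sketch (the error rates and population size below are hypothetical, chosen for illustration, not drawn from any study), a system with a ten-times-higher false-positive rate for one group will wrongly flag ten times as many innocent people from that group when searches run against a large database:

```python
# Illustrative sketch only: hypothetical error rates, not real measurements.
# Shows why a higher false-positive rate means more innocent people
# get wrongly flagged when a watchlist search runs against a large database.

def innocent_people_flagged(population: int, false_positive_rate: float) -> float:
    """Expected number of non-matches wrongly flagged as matches."""
    return population * false_positive_rate

# Same database size, different error rates for two demographic groups.
group_a = innocent_people_flagged(1_000_000, 0.001)  # 0.1% FPR
group_b = innocent_people_flagged(1_000_000, 0.010)  # 1.0% FPR

print(group_a, group_b)  # group B sees ten times as many wrongful flags
```

Even a seemingly small gap in error rates compounds across millions of photos, which is why disparities like the ones Buolamwini measured matter at the scale of a statewide license database.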

While we don't yet know how ICE is using information collected through facial recognition technology, its use raises concerns for the vulnerable populations the agency interacts with. For example, the number of pregnant women held in ICE custody, often in inhumane conditions, increased during President Trump's administration, and as many as 28 of them miscarried. Activists say ICE's reliance on a flawed facial recognition system poses dangers beyond misidentification.

We reached out to ICE to ask how it's using the technology, but the agency would not provide details:

"Due to law-enforcement sensitivities, ICE will not comment on investigative techniques, tactics or tools."

At the hearing, Republican Representative Michael McCaul said the tool has "really protected the nation" from security threats. 

San Francisco Votes To Ban City Depts. From Using Facial Recognition

City officials voted 8-1 in favor of legislation that would ban facial recognition tools from being used by local government.

While Buolamwini says the tech's accuracy can be improved, Venkatasubramanian says the larger question is whether we should even adopt such systems considering the threats they pose to marginalized people.  

He said: "Historically, we know that whenever a new technology for surveillance comes up, it's always the underprivileged, the minorities and the poor, who are the ones who get surveilled the most, and the people who have wealth and privilege have ways of escaping that surveillance."