Police use of facial recognition technology gets spotlight

For all intents and purposes, facial recognition technology for law enforcement agencies is in its beta stage. That hasn’t stopped police departments from implementing it in some form, with mixed results and legitimate concerns about racial bias and privacy. But as law enforcement agencies across the country say it can be a vital resource, tech companies are pushing to develop a better version of it. In turn, local governments and communities are figuring out how and when to put safeguards in place to ensure its use is ethical.

Facial recognition software scans thousands of photographs for indicators that might lead law enforcement to a suspect or other person of interest. In the past that was done manually, says Jim Burch, president of the National Police Foundation. Facial recognition software takes the manpower and time out of the search by scanning photographs in an instant, drawing on driver’s license and passport photos, mug shots and other images already in government databases.
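For the technically curious, the matching step at the core of that instant scan can be sketched in a few lines of code. The sketch below is illustrative only: it assumes each photo has already been reduced to a numeric “embedding” vector by a trained model, and the function names, gallery layout and random stand-in data are assumptions, not any vendor’s actual system.

```python
# A minimal sketch of an automated face search: rank enrolled records
# by similarity to a probe image. Assumes embeddings already exist.
import numpy as np

def search_gallery(probe: np.ndarray, gallery: dict[str, np.ndarray],
                   top_k: int = 5) -> list[tuple[float, str]]:
    """Rank enrolled records by cosine similarity to the probe face.

    gallery maps a record ID (e.g., a license or mug-shot number) to a
    precomputed embedding. Returns the top_k (score, record_id) pairs,
    a ranked candidate list rather than a definitive identification.
    """
    scored = []
    for record_id, vec in gallery.items():
        # Cosine similarity: 1.0 for identical direction, near 0 for unrelated.
        sim = float(np.dot(probe, vec) /
                    (np.linalg.norm(probe) * np.linalg.norm(vec)))
        scored.append((sim, record_id))
    scored.sort(reverse=True)
    return scored[:top_k]

# Demo with random stand-in vectors; a real system would use embeddings
# computed from license, passport and mug-shot photos.
rng = np.random.default_rng(0)
gallery = {f"record-{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["record-42"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(search_gallery(probe, gallery, top_k=3))
```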

“When we’ve talked to chiefs about this, they really want the efficiency that it brings and the ability to sort through hours of video or hundreds of photographs in order to quickly identify a person to prevent the loss of life or damage to property,” Burch says, before adding: “Law enforcement has operated for hundreds of years without it. Can we operate without it? Absolutely.”

The ethics of using the technology quickly become sticky: Photos on social media are legally considered open source, so some programs scan those. Should officers be allowed to capture images from body cameras to enter into the database, and should they be able to run facial recognition from body cameras during on-site encounters with community members? If law enforcement discloses that it’s capturing images at a public event, does that make it OK to add your photo to the database, or is that Big Brother come to life?

“What people are terrified of, understandably so, is law enforcement taking a camera and scanning it across a crowd of people and allowing technology to say who is in this crowd. Are there people who might have open warrants, might be suspects?” Burch says. “I don’t think I’ve met a single chief who has said we will use the technology to tell us who this person is. Instead they’re using technology to come up with a small subset of possibilities.”
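That distinction, a candidate list instead of a single answer, is easy to express in code. Continuing the hypothetical sketch above, a deployment might apply a similarity threshold and pass only the survivors to a human analyst; the threshold value here is an assumption, tuned per deployment in real systems.

```python
# Continuing the hypothetical sketch above: filter the ranked results
# down to a candidate list for human review rather than declaring a match.
MATCH_THRESHOLD = 0.85  # assumed value; real deployments tune this

def candidate_list(ranked, threshold=MATCH_THRESHOLD):
    """Keep only candidates scoring above the threshold. An empty list
    is a legitimate outcome: the system simply offers no lead."""
    return [(record_id, sim) for sim, record_id in ranked if sim >= threshold]

candidates = candidate_list(search_gallery(probe, gallery, top_k=10))
```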

Many of the technology’s issues have come to light through trial and error. Burch says it was a “bad idea” when London officers disclosed they were cataloging images at a public event. In Orlando, law enforcement pulled back on the technology because of the number of false identifications it logged. One large manufacturer of body cameras assembled an ethics board and determined facial recognition technology shouldn’t be used for in-person encounters.

Tech developers have issues to work out as well. In a federal study of nearly 200 facial recognition algorithms, researchers found that people of color were far more likely to be misidentified by the technology, a flaw likely rooted in the lack of diversity among the developers themselves. In response, cities such as Oakland and San Francisco have banned government use of the technology.
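What “far more likely to be misidentified” means in measurable terms is a difference in false-match rates across demographic groups. The audit below is a simplified, hypothetical illustration of that comparison; the field names and data layout are assumptions, not the federal study’s actual methodology.

```python
# Simplified sketch of a demographic false-match audit. Field names and
# data layout are hypothetical, not the federal study's actual method.
from collections import defaultdict

def false_match_rates(trials):
    """trials: dicts with 'group', 'predicted_match' and 'same_person' keys.

    Returns, per group, the share of different-person pairs the system
    wrongly declared a match; unequal rates across groups indicate bias.
    """
    wrong, total = defaultdict(int), defaultdict(int)
    for t in trials:
        if not t["same_person"]:          # only different-person pairs count
            total[t["group"]] += 1
            if t["predicted_match"]:      # a false match
                wrong[t["group"]] += 1
    return {g: wrong[g] / total[g] for g in total}

trials = [
    {"group": "A", "same_person": False, "predicted_match": True},
    {"group": "A", "same_person": False, "predicted_match": False},
    {"group": "B", "same_person": False, "predicted_match": False},
    {"group": "B", "same_person": False, "predicted_match": False},
]
print(false_match_rates(trials))  # {'A': 0.5, 'B': 0.0}
```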

On the racial bias of these technologies, Burch says, “It is a major problem, and it has to be dealt with. It is a significant flaw and can very well be a fatal flaw until it’s addressed.”

The Boulder Police Department uses a form of facial recognition software that scans government photos (driver’s licenses and the like). It and the Longmont Police Department also ask residents with Ring home security cameras to voluntarily sign up so police can access their footage in the event of a crime.

For a longer discussion with law enforcement experts on both sides of the debate, tune in to the National Law Enforcement Officers Memorial and Museum’s webinar on April 23. Reserve your spot at lawenforcementmuseum.org/events.