A major vendor of police body cameras, Axon, has decided against adding facial recognition to its products over “ethical concerns.”
“In addition, there are technological limitations to using this technology on body cameras,” Axon’s CEO said in a blog post on Thursday. “Consistent with the board’s recommendation, Axon will not be commercializing face matching products on our body cameras at this time.”
Axon made the decision after a company-assembled ethics board also opposed adding the technology to police body cameras. The ethics board studied the issue for over a year and concluded that the technology remains unreliable at identifying people across different races, genders, and age groups.
“We focused on face recognition under real-world conditions—specifically, when deployed on a body camera. Under such conditions, it is our understanding that face recognition technology performs quite poorly, both in terms of false positives and false negatives,” the ethics board said in its report.
Axon CEO Rick Smith said his company is committed to developing technologies in “an ethical and responsible manner.” But despite today’s decision, the company isn’t completely abandoning facial-recognition tech either. According to Smith, Axon’s AI team will continue to evaluate the technology.
“We do believe face matching technology deserves further research to better understand and solve for the key issues identified in the report, including evaluating ways to de-bias algorithms as the board recommends,” he added. The company also currently uses some facial-recognition technology to detect and blur faces in videos; however, no identity matching takes place.
The American Civil Liberties Union said Axon’s decision highlights why facial-recognition technologies need to be regulated. “One of the nation’s largest suppliers of police body cameras is now sounding the alarm and making the threat of face surveillance technology impossible to ignore,” said Matt Cagle, an ACLU technology and civil liberties attorney, in a statement.
“Body cameras should be for police accountability, not surveillance of communities,” he added. “Face surveillance technology is ripe for discrimination and abuse, and fundamentally incompatible with body cameras—regardless of its accuracy.”
However, other companies continue to develop facial-recognition systems for use in law enforcement. Amazon, for instance, argues its own technology is highly accurate, and can help police find missing children and identify suspects in crimes.
“New technology should not be banned or condemned because of its potential misuse,” Amazon Web Services VP Michael Punke said in February. “Instead, there should be open, honest, and earnest dialogue among all parties involved to ensure that the technology is applied appropriately and is continuously enhanced.”