Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a ''high risk'' watch list in the Canadian city of Edmonton, a live test of whether facial recognition technology shunned as too intrusive could have a place in policing throughout North America.
But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology posed serious ethical concerns, the pilot project — switched on last week — is raising alarms far beyond Edmonton, the continent's northernmost city of more than 1 million people.
A former chair of Axon's AI ethics board, which led the company to temporarily abandon facial recognition in 2019, told The Associated Press he's concerned that the Arizona-based company is moving forward without enough public debate, testing and expert vetting about the societal risks and privacy implications.
''It's essential not to use these technologies, which have very real costs and risks, unless there's some clear indication of the benefits,'' said the former board chair, Barry Friedman, now a law professor at New York University.
Axon founder and CEO Rick Smith contends that the Edmonton pilot is not a product launch but ''early-stage field research'' that will assess how the technology performs and reveal the safeguards needed to use it responsibly.
''By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,'' Smith wrote in a blog post.
The pilot is meant to help make Edmonton patrol officers safer by enabling their body-worn cameras to detect anyone whom authorities classified as having a ''flag or caution'' for categories such as ''violent or assaultive; armed and dangerous; weapons; escape risk; and high-risk offender,'' said Kurt Martin, acting superintendent of the Edmonton Police Service. So far, that watch list has 6,341 people on it, Martin said at a Dec. 2 press conference. A separate watch list adds 724 people who have at least one serious criminal warrant, he said.
''We really want to make sure that it's targeted so that these are folks with serious offenses,'' said Ann-Li Cooke, Axon's director of responsible AI.