Edmonton Police Service partners with U.S. company to test use of facial-recognition bodycams

Press Room · Published December 3, 2025 · Last updated December 3, 2025 at 1:15 a.m.

The Edmonton Police Service announced Tuesday that it will become the first police force in the world to trial facial-recognition-enabled bodycams, using an artificial intelligence (AI) product from Axon Enterprise.

“I want to make it clear that this facial-recognition technology will not replace the human component of investigative work,” acting Supt. Kurt Martin with EPS’ information and analytics division said during a news conference.

“In fact, the resemblances that are identified by this software will be human-verified by officers trained in facial recognition.”

Martin said the police force’s goal is to test another tool in its operations toolbox that can help further ensure public and officer safety while also respecting privacy considerations.

Axon Enterprise, an Arizona-based company, develops weapons and technology products for military, law enforcement and civilians in jurisdictions where legal. 

Starting Wednesday, up to 50 Edmonton police officers who are currently using bodycams will begin to use Axon’s facial-recognition-enabled cameras on their shifts for the remainder of the month.

Why now? 

In 2023, the provincial government announced plans to mandate bodycams for all police officers in Alberta. For EPS, the use of bodycams by its members began to roll out in 2024.

The partnership with Axon is separate from the provincial mandate, Martin said. 

“The proof of concept is a limited testing period to determine the feasibility of facial recognition on body-worn video cameras and their functionality within policing,” he said.

WATCH | Edmonton police test AI facial recognition on body cameras: Through December, police officers in Edmonton will be testing facial-recognition technology in their body-worn cameras, using it to potentially match people interacting with officers against people in the police database. Edmonton police stress this is only a proof-of-concept test, but one expert is raising concerns that Canadians are being used as guinea pigs.

Martin said the trial will test the technology’s ability to allow officers to use mugshots to identify individuals who are already in the system because of “officer safety flags and cautions from previous interactions.”

The technology will also allow police to assess safety risks and be aware of individuals who have outstanding warrants for serious crimes such as murder, aggravated assault and robbery.

“In total, there are 6,341 individuals who have a flag or caution,” Martin said. “Currently, there are over 20,615 charges that have gone to warrant in Edmonton.

“As police officers, we have an obligation to attempt to execute these warrants in a timely manner to ensure that people who are charged with criminal offences can be tried within a reasonable time, as per their Charter rights … and under the timeline set by the Supreme Court of Canada.” 

How does it work? 

When officers using the cameras that will be part of the test are in the field, the facial-recognition system will not be actively running, said Ann-Li Cooke, Axon Enterprise’s director of responsible AI.

She said the system is intended to be activated by officers during investigations or enforcement, at which point the cameras will start recording.

When these body-worn cameras are actively recording, the facial-recognition technology will run automatically in “silent mode,” Cooke said. 

Officers won’t get any alerts or notifications about facial resemblance while on duty.

If a person is within four metres of a bodycam, their face is detected and the data is sent to the cloud to compare against the EPS database of persons of interest.

If it’s not a match, then the facial data is immediately discarded, Cooke said.

“The system really starts on the database upload, so Edmonton Police Service and any police agency would identify who their database would be constructed of,” she said, noting that these include serious warrants and officer safety alerts.

“We really want to make sure that it’s targeted, so that these are folks with serious offences who are uploaded into this database.”

Cooke said Axon does not have control to view, dictate or govern what kinds of people are uploaded into the database, which is entirely owned and operated by EPS.

After footage is captured, EPS officers trained to analyze facial-recognition data will review the footage to see if the software works as intended and if there is a match.
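
The workflow Cooke describes — detect a face within roughly four metres, compare it in the cloud against the EPS-controlled watch list, discard the data immediately on a non-match, and hold any match for review by a trained officer — can be made concrete with a short sketch. The Python code below is not Axon's software and is not based on any published EPS or Axon specification; names such as `WatchlistEntry`, `similarity` and `process_detection`, and the threshold and range values, are hypothetical, included only to illustrate the match-or-discard logic as described above.

```python
from dataclasses import dataclass

MATCH_THRESHOLD = 0.80   # assumed similarity cutoff; the real system's threshold is not public
MAX_RANGE_METRES = 4.0   # per the description, faces beyond about four metres are not processed


@dataclass
class WatchlistEntry:
    """Hypothetical record in the EPS-owned persons-of-interest database."""
    person_id: str
    reason: str                # e.g. "outstanding warrant", "officer safety flag"
    embedding: list[float]     # numeric face template derived from a mugshot


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face embeddings (illustrative only)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0


def process_detection(face_embedding: list[float],
                      distance_m: float,
                      watchlist: list[WatchlistEntry]) -> WatchlistEntry | None:
    """Silent-mode check: compare a detected face against the watch list.

    Returns the best match for later review by a trained officer, or None.
    On a non-match, the face data is simply dropped, mirroring the
    'immediately discarded' step described by Cooke.
    """
    if distance_m > MAX_RANGE_METRES:
        return None                      # too far away; nothing is processed

    best, best_score = None, 0.0
    for entry in watchlist:
        score = similarity(face_embedding, entry.embedding)
        if score > best_score:
            best, best_score = entry, score

    if best_score >= MATCH_THRESHOLD:
        return best                      # queued for human verification, not an on-duty alert
    return None                          # no match: the embedding is discarded
```

Even in this toy version, the stated policy points are visible in the structure: the watch list is supplied and controlled by the police service, no officer-facing alert is generated in the field, and anything below the threshold is thrown away rather than stored.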

What are the concerns? 

Ian Adams is an assistant professor of criminology at the University of South Carolina who studies the intersection of policing and technology and sits on the task force on AI for the Council on Criminal Justice. 

Adams said caution should be exercised, and AI should first be understood, before it is used on the wider public.

“We’re in the fastest technology-adoption phase in modern policing, faster even than body cameras,” he said in an interview with CBC News.

“And unfortunately, that’s necessarily happening before we understand a great deal about the use of these technologies, whether they capture their intended benefits, of course, and then their unintended consequences.” 

EPS said it has submitted a privacy impact assessment to Alberta’s information and privacy commissioner to look at whether the trial use of the technology respects public privacy and is carried out legally.  

Diane McLeod, Alberta’s information and privacy commissioner, said she has considerable concerns with EPS proceeding with the use of facial-recognition technology. 

“From a privacy perspective, there are a number of issues with facial-recognition technology, particularly around accuracy,” she said in an interview with CBC News. 

“Under our Protection of Privacy Act, which the EPS is subject to, they actually have a duty of accuracy, so they would have to be able to establish in their privacy impact assessment that accuracy meets the threshold for the Protection of Privacy Act.”

McLeod said facial-recognition technology has a proven track record of being problematic and pointed to a report by several Canadian privacy commissioners looking into the practices of an American technology company called Clearview AI. 

The commissioners found Clearview AI violated Canadian privacy laws by collecting photos of Canadians without their knowledge or consent.

The report by the commissioners found that Clearview AI’s technology created a significant risk to individuals by allowing law enforcement and companies to match photos against its database of more than three billion images.

Canada’s privacy commissioner, along with commissioners from Alberta, B.C. and Quebec, issued an order in 2021 for Clearview AI to stop operating in the country and to delete images of Canadians collected without their consent.

McLeod said the matter is still going through the courts.

A 2019 report from the Axon AI and policing technology ethics board found that at the time, facial-recognition technology was not reliable enough to ethically justify its use on bodycams. 

“At the least, face-recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across races, ethnicities, genders and other identity groups,” the report said.

In response to the report, Cooke said the technology has developed since 2019.

“There were gaps in both race and gender at that time,” she said. “As we did our due diligence on evaluating multiple models, we were also looking to see if there were race-based differences, and we found that in ideal conditions, that is not the case.

“Race is not the limiting factor today, the limiting factor is on skin tone. And so when there are varying conditions, such as distance [or] dim lighting, there will be different optical challenges with body-worn camera[s] — and all cameras — in detecting and matching darker-skinned individuals than lighter-skinned individuals.” 

However, Gideon Christian, an associate professor of AI and law at the University of Calgary, said the inequities attached to facial-recognition technology are too great to ignore and that he believes there is not enough recent research to suggest any significant improvement. 

“Facial-recognition technology has been shown to have its worst error rate in identifying darker-skinned individuals, especially black females,” he said.

Christian said that in some case studies, facial-recognition technology has shown about a 98 per cent accuracy rate in identifying white male faces, but about a 35 per cent error rate in identifying darker-skinned women.

“It came to me as a huge surprise that Edmonton police chose to be the guinea pig for this Charter-rights-infringing experiment,” he said.

Christian said he believes the implications for privacy in the public sphere are chilling. 

“What we’re basically seeing is a situation where this tool for police accountability suddenly converts to a tool for mass surveillance.” 

University of British Columbia law professor Benjamin Perrin said he believes there will need to be rigorous safeguards in place when using facial-recognition technology.

“We need to see a full impact assessment on how this would impact the Charter rights of people who are being filmed in these interactions with police,” he said.

The Edmonton police commission and the chief’s committee will review the results of the facial-recognition bodycam trial before deciding on the technology’s future use in 2026.
