AI Police Body Cameras in Canada: Ethical Concerns & Public Debate (2026)

A bold warning sits at the heart of Edmonton’s new pilot: police body cameras with built-in artificial intelligence could quietly expand the reach of facial recognition in everyday policing. But here’s where it gets controversial—what happens when a city tests real-time identification in public spaces? The answer so far has sparked heated debate that crosses borders and politics, raising essential questions about safety, privacy, and civil liberties.

In Canada’s northernmost large city, Edmonton, body-worn cameras are being equipped with facial recognition capabilities trained to identify roughly 7,000 individuals placed on a high-risk watch list. This live test is framed as a practical examination of whether facial recognition—long criticized for intrusiveness—could have a place in policing across North America under strict safeguards and oversight.

Six years after Axon Enterprise, Inc.—a leading producer of body cameras—warned that facial recognition in policing poses significant ethical concerns, the Edmonton pilot has rekindled alarm well beyond the city. A former chair of Axon’s AI ethics board, which temporarily halted facial recognition development in 2019, told The Associated Press that moving forward without wide public debate, thorough testing, and expert scrutiny could overlook societal risks and privacy implications.

“It’s essential not to deploy these technologies, which carry real costs and risks, unless there’s clear evidence of meaningful benefits,” said Barry Friedman, now a law professor at New York University.

Axon’s founder and CEO, Rick Smith, characterizes the Edmonton effort as “early-stage field research” rather than a product launch. He argues that testing outside the United States can yield independent insights, strengthen oversight, and inform future evaluations, including those in the U.S.

The pilot’s stated goal is to enhance officer safety by enabling body cameras to flag individuals categorized as having a “flag or caution” in several high-risk categories, such as violent or assaultive behavior, armed and dangerous status, weapons possession, escape risk, and high-risk offender designations. Edmonton Police Service officials reported a watch list containing 6,341 people as of a December 2 press briefing, plus 724 individuals on a separate warrant list.

Axon’s Ann-Li Cooke, who leads responsible AI efforts, emphasized that the program is intended to be tightly targeted toward serious offenses. If the pilot expands, it could influence policing models far beyond Edmonton and even beyond Canada.

Axon is a dominant U.S. supplier of body cameras and has increasingly marketed them to agencies overseas. The company recently outbid Motorola Solutions, a Chicago-based competitor, to provide body cameras to the Royal Canadian Mounted Police. Motorola acknowledged its own capability to integrate facial recognition but said it has deliberately refrained from using it for proactive identification, while not ruling out future use.

Alberta’s 2023 mandate requiring body cameras for all provincial police agencies—including Edmonton—was framed as a transparency measure to document interactions, improve evidentiary accuracy, and shorten investigation and complaint timelines.

Public response to real-time facial recognition in public spaces remains mixed across the United States and beyond. Civil-liberties groups have argued that facial recognition can be biased, inaccurate, and overbroad, leading to disproportionate impacts on marginalized communities. Past studies have shown racial, gender, and age biases in the technology, and real-time performance on video feeds often lags behind performance on still photos or mug shots.

Regulatory responses vary widely. Several U.S. states and municipalities have restricted police use of facial recognition, even as federal policy under different administrations has shifted the regulatory landscape. The European Union has banned real-time public face-scanning police technology, with exceptions for grave crimes like kidnapping or terrorism. The United Kingdom, though no longer under EU rules, has begun testing the tool on city streets and is weighing broader deployment.

Details about Edmonton’s pilot remain largely undisclosed. Axon does not build its own facial-recognition model and has declined to name the third-party provider involved. The Edmonton rollout is set to continue through December and will operate only during daylight hours, with about 50 officers testing the technology. For now, matches will be analyzed retrospectively at the station rather than confirmed in the field.

Even as the pilot aims to avoid routine surveillance, critics worry about privacy and rights protections. Edmonton’s police say the approach respects individual privacy, but the province’s privacy commissioner is reviewing the project’s privacy impact assessment as part of a formal evaluation of how sensitive personal data is handled.

Temitope Oriola, a criminology professor at the University of Alberta, notes that Edmonton is effectively turning itself into a test bed for this tool, acknowledging that the technology’s real-world performance remains uncertain. He cautions that even if improvements emerge, the broader impact on trust and community relations—especially with Indigenous and Black residents—must be considered.

Axon has faced prior friction over its technology programs, including board resignations in 2022 over drone deployments linked to Tasers. Since stepping back from facial recognition, Axon has pursued a cautious, lab-informed approach to field testing, arguing that the technology has grown more accurate and suitable for controlled real-world trials.

Axon also recognizes that facial recognition systems are affected by conditions such as distance, lighting, and camera angle, which can disproportionately affect darker-skinned individuals. The company requires human review for every match and is examining what training and oversight are necessary to mitigate known risks.

Friedman argues for greater transparency about the testing, including public reporting on how accuracy has evolved and the ethical considerations involved. He questions whether decisions to deploy such technology should rest with police agencies and vendors alone, urging legislative input and rigorous scientific validation instead.

A pilot can be a useful step, Friedman concedes, but without clear accountability, transparency, and broad societal discussion, the move feels precipitous. The Edmonton case thus stands at the crossroads of policy, technology, and public trust, inviting readers to weigh the potential safety benefits against the privacy and civil-liberty costs.

Would this kind of real-time facial recognition in policing ever be acceptable on a wider scale, or should it remain confined to strictly limited, tightly supervised contexts? What safeguards would be essential to ensure fairness and accountability, and who should oversee them—in cities, provinces, or national bodies?
