AI is coming to the Ottawa Police Service. Here’s how they’ll use the new tech

By News Room

From facial recognition software to transcribing incident reports, the Ottawa Police Service will be turning to artificial intelligence to help investigators solve cases, under a policy that addresses concerns around privacy, bias and public trust.

“There’s a lot of police services that are wondering where to go (with AI) and it is a very fluid and ever-changing process,” Chief Eric Stubbs told the Ottawa Police Services Board in a presentation on March 25.

“We also recognize that, when used appropriately, AI has the potential to support community safety and improve our service delivery. At the same time, we’re clear that the innovation must be grounded in strong governance, clear controls and accountability. There are legitimate concerns related to privacy, bias, public trust, and those considerations have been central to our approach,” Stubbs said.

Ottawa police executives plan to finalize and introduce the AI policy in late April. According to the March 25 presentation to the board, the policy will explicitly include issues surrounding Charter protections, human rights obligations, privacy requirements and privacy impact assessments.

 Ottawa Police Chief Eric Stubbs in a file photo.

“We need to make sure that any use of AI in policing is grounded in Canadian law, and, of course, upholds the Canadian Charter of Rights and Freedoms. It’s human rights that set the boundaries, not the technology,” Alta Vista Coun. Marty Carr, who serves as OPSB vice-chair, said in an interview.

“We have to make sure that any AI tools that we are using support privacy rights, equality and due process, and do not undermine them. And so, as part of the board’s role, we are developing an AI policy that’s a necessity under our governance and oversight role, and that role is to ensure any use of technology is legally compliant and supports accountability and public trust.”

One of the key uses of AI that has generated considerable discussion around privacy rights involves facial recognition tools.

Facial recognition software would assist police in generating “investigative leads” from images and videos legally obtained from a crime scene, which would then be compared with mug shots in the police database, according to Deputy Chief Trish Ferguson.

“That will be significantly helpful for a number of cases where we have shoplifting, frauds, property crimes. We will be seeing a more effective use in being able to solve these cases,” Ferguson said.

The Toronto Police Service unveiled its AI policy in 2022, which included the adoption of the NeoFace Reveal software for facial recognition.

The software uses a fixed algorithm to match images from the police database with images captured from the crime scene, according to the Toronto Police Service policy.

Any potential matches are then examined by a trained facial recognition analyst and forwarded to the investigator for additional review.

The application detects potential matches only through the collected images “and not through any other data to reduce bias,” according to the policy.

Investigators must then take their own steps to continue the investigation, according to the policy, and the use of the application and each investigative step is logged and disclosed to the courts in any resulting prosecution.

Ottawa police would be joining forces like Halton, Peel and York Regional Police in onboarding similar AI-based technology.

 Ottawa councillor Marty Carr during a city council meeting in a file photo.

“All police boards and police services right now are developing policies on the use of AI, and so we’re learning from them, obviously, with respect to the benefits they bring and the risks,” Carr said.

“There’s been some really strong, excellent guidance that’s been put forward by the Ontario Human Rights Commission in their submission to the Toronto Police Service Board for their policy. So that’s going to be very important as we move forward,” Carr said.

“We know AI will be able to reduce that administrative work (for) front-line members, and that frees up time that can be directed to proactive policing and hot spots investigation. So we know there’s benefits of AI, but we, as a board, want to make sure that any technology is legally compliant.”

Deputy Chief Steve Bell said AI would only be used as a “decision-assist tool, not a decision-making tool.”

“At the end of any technology there needs to be a human manipulating it,” Bell said.

“There is no technology — AI or any other — that is going to do our officers’ jobs,” Bell said. “But we know there are data points, there’s redacting abilities, there’s generating abilities, there’s data-search abilities that exist in the AI sphere that can be effective for our officers to help support them in providing their role.”

The technology can also free officers from the time-consuming task of typing out reports, Bell said, giving them more time for investigative or community work.

 Ottawa Police Chief Eric Stubbs (left) and Deputy Chief Steve Bell during a meeting at city council in a file photo.

Carr said the OPS and the board are implementing a system of safeguards — in procurement, governance, operations and oversight — to mitigate risks of bias when using AI.

“That starts with the human rights foundation, (and) it’s also built into the requirement for privacy impact assessments before adopting any new tool,” Carr said.

Ensuring compliance with the Community Safety and Policing Act and privacy legislation is a “robust” process, Carr said, that includes “demanding transparency from vendors so that they disclose how their models are trained, known biases and risk mitigation.”

Police would be required to test systems for bias before and after using any technology and would present their due diligence work to the board for approval, Carr said.

Carr echoed Bell’s assertion that the technology would be available only to assist the officer.

“It doesn’t replace officer judgment and the training they receive, which includes bias awareness,” she said.

Other technologies would assist police with evidence, triage and analysis of large volumes of data, according to Ferguson.

“There’s data from phones, from production orders, (financial records) that are obtained from banks … the ability to get through that accurately and a bit quicker will be very helpful in terms of our speed that we can get to cases, and obviously the data analytics will help support prioritization.”

A more efficient AI-assisted analysis would speed up early investigative stages and would ultimately improve clearance rates, another key priority the OPS has identified.

“The faster we are able to get through cases and to the accurate information and the perpetrators, means that we can get to the next case quicker,” Ferguson said. “You’ve heard about the frustration and the backlog in certain areas and this will assist us in being able to comb through the information, that we have a bit quicker evidence review, making sure that we are pinpointing on the most precise pieces of evidence that we need to prove cases and the elements of the offense.

“We’re modernizing the tools that we already have, and really it’s keeping up with the pace of crime that’s going on right now,” she said. “We see in a lot of cases where we are behind the eight ball and the criminals are much further ahead of us (in utilizing AI). But a lot of these tools will assist us in being able to shorten that gap.”
