An artificial intelligence and data ethicist said the Edmonton Police Service's recent use of AI to create an image of an unidentified deceased woman raises several questions.
"I think it's complicated, and I'm trying to think about it in a very holistic way," said Katrina Ingram, CEO of Ethically Aligned AI, an Edmonton company that helps organizations and individuals build and deploy ethical AI solutions.
As the use of AI becomes more common across industries and sectors, including policing, how these tools should be applied is the subject of a growing ethical conversation.
Edmonton police said a woman's body was found in a waste bin in downtown Edmonton in late December 2024. In March 2025, officers released sketches of the woman and her tattoo, as well as stock images of her jacket and boots, in an attempt to confirm her identity. In November, EPS used AI to create an "image that is an approximate likeness of the deceased female" in hopes of generating tips about her identity.
Ingram said there are ethical questions about how AI tools are built and how they function, given the data acquired to train them to generate images and other content. "Should we use these tools? And there are even some questions about the lawfulness of the data that was acquired to build these tools, which really starts to raise questions for law enforcement agencies, because as a law enforcement agency, you shouldn't use a tool that was unlawfully made," Ingram said.
Police spokesperson Cheryl Voordenhout told Taproot that EPS doesn't release operational details like the specific software that police use. She said the digital forensics team used an AI model deployed on a secure EPS platform from which no data is transmitted externally, and that the source code is publicly accessible, fully transparent, and legal to use.
Ingram said she thinks this particular case is ethical because the police were motivated to identify a deceased person after a traditional sketch had produced no answers. "They turned to AI, hoping that the more realistic version might be helpful," she said. "That context matters in terms of what they did." Additionally, Ingram said, EPS members could confirm whether the AI-generated image actually resembled the deceased person.
But Ingram contrasted the ethics in this case with an instance in 2022, when EPS used DNA found at a crime scene to produce an approximate image of a suspect in an unsolved sexual assault. Ingram has used this example when giving talks related to AI and racial bias.
The National Council of Canadian Muslims criticized the image EPS released as racist, saying in a statement at the time that it amounted to a "vague portrait" and a "generic image of a Black male." "It is hard to overstate the absurdity of releasing a hypothetical, racialized portrait of a suspect to the public, while hoping such a tactic might lead to overall vigilance and perhaps an arrest," the statement said. "In effect, the public is asked to 'watch out' for a person of a particular race, with some other physical traits thrown in as ranges (eg. height). It is racial profiling backed up by incomplete science." EPS has since apologized for releasing the 2022 image.
Voordenhout said that when police have exhausted traditional investigative methods, they may use a technologically enhanced image to portray a deceased person. EPS consulted with the Office of the Chief Medical Examiner and with a forensic anthropologist before facial recognition experts in the digital forensics section created the AI-generated image. Voordenhout said neither the original photo of the deceased woman nor the police sketch was uploaded into the AI image generator; instead, the image was produced entirely through repeated prompts until it resembled the woman as accurately as possible.
[Photo caption: An Edmonton Police vehicle sits downtown in this file photo. (Mack Male/Flickr)]
Ingram said the potential shock or trauma that the image might cause the woman's loved ones adds another layer of complexity.
"I'm trying to imagine the relatives and friends of this person seeing that particular image and how they might feel about that. Because it is trying to be in service of identifying a deceased person in order to bring some closure to loved ones, there might be a sense of relief (because they) can recognize that person, but there might also be a sense of trauma in looking at the person in that way," Ingram said. "It's very complicated, and it's hard to know how they will actually react."
Other North American police services are using AI to generate images of unidentified people. Earlier this year, the Calgary Police Service used AI to create an image of a man who was found deceased by the Bow River.

Meanwhile, a police service in Arizona is using AI in an attempt to identify crime suspects. A victim describes the suspect to a sketch artist as usual, but the sketch is then fed into an AI image generator, and the artist works with the victim to tweak the AI image to match what the victim remembers.
Developers have also built a platform that generates forensic sketches from scratch. But an ethicist told Vice in 2023, when the software was created, that using AI in police forensics can be dangerous because it can reinforce existing racial and gender biases.
"The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory," Jennifer Lynch, of the Electronic Frontier Foundation, said. "AI can't fix those human problems, and this particular program will likely make them worse through its very design."