Police use of AI to create photo of dead woman 'complicated': Ethicist
An artificial intelligence and data ethicist said the Edmonton Police Service's recent use of AI to create a photo of an unidentified deceased woman raises several questions.
"I think it's complicated, and I'm trying to think about it in a very holistic way," said Katrina Ingram, CEO of Ethically Aligned AI, an Edmonton company that helps organizations and individuals build and deploy ethical AI solutions.
As the use of AI becomes more common across industries and sectors, including policing, the ethics of how these tools are applied is the subject of a growing conversation.
Edmonton police said a woman's body was found in a waste bin in downtown Edmonton in late December 2024. In March 2025, officers released sketches of the woman and her tattoo, as well as stock images of her jacket and boots, in an attempt to confirm her identity. Then, in November, EPS used AI to create an "image that is an approximate likeness of the deceased female" in hopes that it would generate tips about her identity.
Ingram said there are ethical questions about how AI tools are built and how they function, given the data acquired to train them to generate images and other content. "Should we use these tools? And there are even some questions about the lawfulness of the data that was acquired to build these tools, which really starts to raise questions for law enforcement agencies, because as a law enforcement agency, you shouldn't use a tool that was unlawfully made," Ingram said.
Police spokesperson Cheryl Voordenhout told Taproot that EPS doesn't release operational details like the specific software that police use. She said the digital forensics team used an AI model that was ingested into a secure EPS platform where no data is transmitted externally, and that the source code is publicly accessible, fully transparent, and legal to use.
Ingram said she thinks this particular case is ethical because the police were motivated to identify a deceased person when a traditional sketch had produced no answers. "They turned to AI, hoping that the more realistic version might be helpful," she said. "That context matters in terms of what they did." Additionally, Ingram said, EPS members could confirm whether the AI-generated photo actually resembled the deceased person.
But Ingram contrasted the ethics in this case with an instance in 2022, when EPS used DNA found at a crime scene to produce an approximate image of a suspect in an unsolved sexual assault. Ingram has used this example when giving talks related to AI and racial bias.
The National Council of Canadian Muslims criticized the resulting image EPS released as racist because it created a "vague portrait" and a "generic image of a Black male," a statement at the time read. "It is hard to overstate the absurdity of releasing a hypothetical, racialized portrait of a suspect to the public, while hoping such a tactic might lead to overall vigilance and perhaps an arrest," the statement said. "In effect, the public is asked to 'watch out' for a person of a particular race, with some other physical traits thrown in as ranges (eg. height). It is racial profiling backed up by incomplete science." EPS has since apologized for releasing the 2022 photo.
Voordenhout said that when police have exhausted traditional investigative methods, a technologically enhanced image of a deceased person may be used to portray the individual. EPS consulted with the Office of the Chief Medical Examiner and with a forensic anthropologist before facial recognition experts in the digital forensics section created the AI-generated image. Voordenhout said neither the original photo of the deceased woman nor the police sketch was uploaded into the AI image generator; instead, the image was produced entirely through repeated prompts until it resembled the woman as accurately as possible.