AI-generated images depicting child sexual abuse turning up in more Minnesota criminal cases

Online tips submitted to experts who track the abusive material have exploded in the past year.

The Minnesota Star Tribune
September 15, 2025 at 10:00AM
More than 10,500 cyber tips reporting child pornography have been flagged in Minnesota this year, some of the images created with generative AI and others without. (Leila Navidi/The Minnesota Star Tribune)

A decade or more ago, computer-manipulated child pornography looked a lot like what was possessed by Jacob Wetterling’s killer, Danny Heinrich: binders filled with images morphed in his home with Photoshop software.

Today, investigators on the front lines say they’re facing something much more complicated and dangerous: child pornography created using artificial intelligence.

The technology has evolved at breakneck speed, far removed from the images seized in Heinrich’s home in 2015. That seizure led to his eventual confession to kidnapping and killing Wetterling, 11, who had been missing for 26 years.

It’s now easier for bad actors to flood the internet with abusive depictions of children that are nearly indistinguishable from actual images of child sex abuse. The influx of what experts call child sexual abuse material has placed a strain on law enforcement resources.

The explicit images have become more lifelike and sophisticated in the short time since the technology burst into the public sphere in late 2023. In most cases, the images are of real children — taken from yearbooks, social media or even surreptitiously in public. They’re then manipulated with AI to become child pornography. Although children are not physically harmed in the production of AI-generated or otherwise manipulated pornography, computer-generated depictions are also illegal under the federal PROTECT Act, passed in 2003 to prevent the exploitation of minors.

“It’s just one little pixel, or one little misalignment of one blade of grass that maybe just doesn’t look quite right. They’re that real looking,” said Carla Baumel, an assistant U.S. Attorney for Minnesota. “If you can imagine your 6-year-old in the worst position and worst light ever, and wonder if [the image] is real, there’s no question that’s a harm.”

More than 10,500 cyber tips reporting child pornography have been flagged in Minnesota this year, some of the images created with generative AI and others without. The number is on track to outpace last year’s 12,595 tips to the Minnesota Bureau of Criminal Apprehension (BCA). The National Center for Missing and Exploited Children is experiencing an increase so sharp it prompted researchers to release their annual online child sex crimes report early.

In the first half of 2024, the center received 6,800 reports flagging child pornography created with generative AI. Through June this year, tips have exploded to more than 440,000.

“That’s absolutely a strain on [police] resources. Because all of those deserve attention by law enforcement but you have to triage,” BCA Superintendent Drew Evans said. “I think we’re always playing catch-up because of the sheer volume. … I do think that AI adds to that volume that makes a difficult caseload even more unmanageable in that process.”

Baumel is the lead federal prosecutor in a criminal case pending in Minnesota’s U.S. District Court that’s believed to be one of the first prosecutions in the state involving a swell of victims whose images were used in AI-generated material.

In February, the U.S. Attorney’s Office indicted William Haslach, 30, of Maplewood on a federal charge of using AI to produce child sexual abuse images with photos he discreetly took of children at several Ramsey County schools where he worked as a recess and traffic monitor. To date, federal prosecutors have identified 100 children victimized in the case.

“They were doing nothing wrong. These parents just sent their children to a place where we just assume kids are the safest,” Baumel said. “So when we had a meeting with parents, there is just this total confidence that’s rocked.”

Hennepin County prosecutors in August charged Jason Polzin, 50, a former staff member and softball coach at Living Word Christian Center in Brooklyn Park, with interfering with the privacy of a minor after he allegedly recorded a 13-year-old girl and superimposed her face on a computer-generated nude or scantily clad female body.

In both cases, the men are accused of putting a photo through an app or website to generate an image that makes the victim appear undressed or in an explicit position. The Minnesota Star Tribune left messages for the attorneys representing the accused.

Federal prosecutors said the method makes the crime unique in that the victim is often unaware of what happened.

“It’s one of the only crimes where it’s possible to be victimized and not confer with a victim,” said Melinda Williams, assistant U.S. Attorney for Minnesota.

Acting U.S. Attorney Joe Thompson and Assistant U.S. Attorney Carla Baumel outside the U.S. Attorney’s Office in Minneapolis on Aug. 20. Federal prosecutors in Minnesota are now working on a major case involving people using AI to create child pornography online. (Leila Navidi/The Minnesota Star Tribune)

Alarmed by how rampant and realistic AI-generated child pornography has become, Evans said the BCA was spurred to support expanding the state’s laws to cover explicit images and videos of children created with artificial intelligence.

Under Minnesota law, possessing child pornography carries a maximum penalty of five years in prison. For federal cases, the prison term ranges from five to 20 years.

But experts said even if a person sees justice in the form of a conviction, the impact often lasts long after the case closes.

“I think everybody intuitively knows the harm of being sexually abused, but there’s an additional harm that especially the victims speak very profoundly to, of knowing images of you being raped or being depicted as raped are out in the world,” Williams said.

The National Center for Missing and Exploited Children created a Take It Down tool for people under 18 who are seeking to remove explicit photos of them shared without their consent. But some images are spread to the darkest corners of the internet, where they are difficult for law enforcement to reach.

“It’s just that continual revictimization recirculation,” said John Shehan, vice president of the exploited children division at the National Center for Missing and Exploited Children. “When I talk to victims and survivors, they talk about walking down the street and they don’t know if people are going to recognize them, if they’ve seen their imagery. It’s really a terrible thing to have to live with mentally.”

about the writer

Sarah Nelson

Reporter

Sarah Nelson is a reporter for the Minnesota Star Tribune.