Things You Should Know But Don’t: AI in True Crime

Posted September 16, 2024

Do you trust how true crime is presented when you listen to or watch it?

While journalists may not host your favorite true crime podcast or TV show, most creators still try to provide a reliable baseline of facts about the gruesome cases they discuss. Opinions may vary, but the details of a case should not. It's relatively easy to fact-check a detail that doesn't line up with the record. But how would you know to fact-check a case that never existed?

Recently, this very problem arose when two YouTube videos discussing Colorado murders went viral. The first, with nearly 2 million views, covered the murder of a real estate agent from Littleton, Colorado, in 2014. The second, with around 200,000 views, described a jealous husband murdering his wife in Fort Collins around 2019.

However, according to the Littleton Police Department and the District Attorney, these crimes never happened. They were never investigated, the news never covered them, and many of the facts in the videos are untrue or contradict one another. As far as anyone can tell, the cases were entirely fabricated. Yet with thousands of comments expressing shock and outrage on each video, many viewers have clearly taken these stories at face value.

The YouTube channel True Crime Case Files brands itself like any other true crime show, claiming to be "your portal into the thrilling world of real-life criminal investigations." Nothing on the page suggests that the works are fiction, but according to Casey Fiesler, an associate professor at the University of Colorado specializing in technology ethics, the videos are created using generative AI. She cited telltale signs such as the strange narration, portrait photos with an "uncanny valley" appearance, and the stories' inconsistent facts. Fiesler said she has seen AI used similarly to circulate conspiracy theories, and that the motivation is usually profit: videos like these are an easy way to attract attention and ad revenue.

Earlier this year, a true crime Netflix documentary came under fire for using AI-generated photos while discussing an otherwise genuine case. The use of AI was not disclosed and was later harshly criticized by true crime fans.

While some headway is being made in regulating the use of AI, it's worrying that these AI-generated stories are convincing enough to fool thousands of people. What's more, people aren't questioning what they see online. A few minutes of independent research is enough to establish that these cases aren't real, since no legitimate record of them exists from before the YouTube videos were uploaded.

Let this be a reminder to always take what you see, hear, and learn from the internet with a grain of salt, especially when it starts creeping into your favorite genres.
