The journalism world was recently shaken when a reporter from the Cody Enterprise was caught using artificial intelligence to generate fake stories and quotes. The first sign that something was amiss came when CJ Baker, a seasoned reporter at the Powell Tribune, noticed odd phrases and quotes that sounded slightly off attributed to Wyoming's governor and a local prosecutor. The dead giveaway, however, came in a June 26 article about comedian Larry the Cable Guy being chosen as grand marshal for a local parade. The article ended with an awkward explanation of the inverted pyramid, a basic structure of news writing, suggesting the content had been produced by AI.
Baker's suspicions led him to confront Aaron Pelczar, a relatively new journalist, who admitted to using AI in his articles. Pelczar resigned shortly afterward, and the Cody Enterprise issued an apology, promising to prevent such incidents going forward. Chris Bacon, the editor of the Enterprise, acknowledged his failure to catch the AI-generated content and assured readers that the publication would implement stricter editorial oversight.
The scandal underscores the growing challenges AI poses across industries, journalism included. While AI has been employed in newsrooms to automate routine tasks, such as financial reports or story translation, this incident illustrates the dangers of misuse. The Associated Press, for example, bars the use of generative AI to create publishable content, underscoring the importance of transparency whenever AI tools are involved.
Pelczar’s use of AI to fabricate quotes and stories has cast a shadow over the Cody Enterprise and sparked a broader conversation about ethics in journalism. The publication is now working on a policy to prevent similar occurrences, as the journalism community grapples with the implications of AI in news reporting.