Reporter in Wyoming resigns after being caught using AI to create fake quotes, stories


A statement from Wyoming’s governor and a local prosecutor was the first thing that struck Powell Tribune reporter CJ Baker as a bit odd. Then, there were some phrases in the stories that sounded almost robotic to him.

However, the clearest indication that a reporter from a competing news outlet was using generative artificial intelligence to help write his stories came from a June 26 article that said comedian Larry the Cable Guy had been chosen as the grand marshal of the Cody Stampede parade.

“The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American freedom, led by one of comedy’s most beloved figures,” the Cody Enterprise reported. “This structure ensures that the most important information is presented first, making it easier for readers to quickly grasp the key points.”

After some digging, Baker, who has been a journalist for more than 15 years, confronted Aaron Pelczar, 40, a newcomer to journalism who, according to Baker, admitted he had used AI in his stories before resigning from the Enterprise.

The publisher and editor of the Enterprise, founded in 1899 by Buffalo Bill Cody, have apologized and vowed to take steps to ensure this never happens again. In an editorial published Monday, Enterprise editor Chris Bacon said he “failed to catch” the AI copying and false quotes.

“It doesn’t matter that the false quotes were the obvious mistake of a hasty, novice reporter who relied on an AI. It was my job,” Bacon wrote. He apologized for “allowing an AI to insert words into stories that were never spoken.”

Journalists have derailed their careers by fabricating quotes or facts in stories long before the advent of AI. But this latest scandal shows the potential harms and dangers that AI poses to many industries, including journalism, as chatbots can create somewhat credible articles with only a few prompts.

AI has found a role in journalism, including the automation of certain tasks. Some newsrooms, including the Associated Press, use AI to free up reporters for more impactful work, but most AP employees are not allowed to use generative AI to create publishable content.

The AP has been using technology to assist with articles about financial earnings reports since 2014, and more recently for some sports stories as well. It is also experimenting with an AI tool to translate some stories from English to Spanish. Each such story has a note at the end explaining the role of technology in its production.

Being clear about how and when AI is used has proven crucial. Sports Illustrated was criticized last year for publishing AI-generated online product reviews that were written by reporters who didn’t actually exist. After the story broke, SI said it was firing the company that produced the articles for its website, but the incident damaged the reputation of the once mighty publication.

In his Powell Tribune story breaking the news of Pelczar’s use of AI in articles, Baker wrote that he had an uncomfortable but cordial meeting with Pelczar and Bacon. During the meeting, Pelczar said, “Obviously I have never intentionally tried to misquote anybody” and promised to “correct them and apologize and say those are incorrect statements,” Baker wrote, noting that Pelczar insisted that his mistakes should not reflect on his Cody Enterprise editors.

After the meeting, the Enterprise began a full review of all the stories Pelczar had written for the newspaper during the two months he worked there. Bacon said Tuesday that they had found seven stories that included AI-generated quotes from six people. They are still reviewing other stories.

“Those are very credible quotes,” Bacon said, adding that people he spoke to while reviewing Pelczar’s articles said the quotes sounded like something he would say, but they had never actually spoken to Pelczar.

Baker reported that seven people told him they had been quoted in stories written by Pelczar, but that they had not spoken to Pelczar.

Pelczar did not respond to a message left by the AP at a number he provided asking what happened. Bacon said Pelczar declined to discuss the matter when another Wyoming newspaper contacted him.

Baker, who regularly reads the Enterprise because it is a competing newspaper, told the AP that a combination of phrases and quotes in Pelczar’s stories raised his suspicions.

Pelczar’s story about the shootings in Yellowstone National Park included this sentence: “This incident is a stark reminder of the unpredictable nature of human behavior, even in the most tranquil environments.”

Baker said the line sounded like a summary of his stories, generated by a chatbot, with some sort of “life lesson” added at the end.

Baker said another story — about a man being sentenced for poaching — included quotes from wildlife officials and prosecutors that sounded like they were taken from a news release. However, there was no news release and the relevant agencies did not know where the quotes came from, he said.

Two of the stories that were questioned included false quotes from Wyoming Governor Mark Gordon that his staff only learned about when Baker called them.

“In one case, (Pelczar) wrote a story about a new OSHA rule that included a quote from the governor that was completely fabricated,” governor spokesman Michael Perlman said in an email. “In another case, he fabricated a portion of a quote, and then combined it with part of a quote that was included in a news release announcing the new director of our Wyoming Game and Fish Department.”

The most obvious AI-generated copy appeared in a story about Larry the Cable Guy, which ended with an explanation of the inverted pyramid, the basic way to write a breaking news story.

Creating an AI story isn’t difficult. Users could feed a criminal affidavit into an AI program and ask it to write an article about the case, including quotes from local officials, said Alex Mahadevan, director of the Digital Media Literacy Project at the Poynter Institute, a prominent journalism think tank.

“These generative AI chatbots are programmed to give you an answer, whether that answer is complete nonsense or not,” Mahadevan said.

Cody Enterprise publisher Megan Barton wrote an editorial calling AI a “new, advanced form of plagiarism, and in the media and writing space, plagiarism is something every media outlet has to correct at some point. It’s the ugliest part of the job. But, a company that’s willing to correct (or literally write off) these mistakes is a reputable one.”

Barton wrote that the newspaper has learned its lesson, has a system in place to identify AI-generated stories and will be having “a long conversation about how AI-generated stories are not acceptable.”

Bacon said the Enterprise had no AI policy, partly because it seemed obvious that journalists shouldn’t use AI to write stories. Poynter has a template from which news outlets can create their own AI policies.

Bacon plans to have one in place by the end of this week.

“This would be a matter for pre-employment discussion,” he said.

Published: August 14, 2024


