AI spreads in journalism. Remain calm, everyone.

Social media had a good time mocking this AI prep football story published by Gannett’s Columbus (Ohio) Dispatch on Aug. 18. It contains no names or stats, repeats the score in the first two paragraphs, repeats “Ohio football” in the first two paragraphs, and includes bizarre phrases such as “thin win” and “close encounter of the athletic kind.” The question is, is a mess like that better than no report at all? UPDATE AUG. 28: Axios reported that after this story went viral on social media, the Dispatch “paused” use of its AI program to write prep sports stories.

The issue of how artificial intelligence programs will affect journalism is an interesting and complicated one. Some say they could have benefits. Others say they might be harmful. It depends on how they are used.

Did you think this was yet another article about AI for which the writer cleverly asked an AI program to write the lead? Fooled ya! This was actually my trying to write like an AI program.

Either way, pretty lame, eh?

The use of artificial intelligence in journalism is spreading rapidly, and debates over what newsrooms should and shouldn’t use it for are spreading even more rapidly. Rest assured the private equity and hedge fund owners of news chains are trying to figure out how they can use it to save on labor costs, which has led to some panic among the industry rank and file about job security and product quality.

AI software currently has a variety of uses in the field. Just a few examples: transcribing interviews; identifying trending topics online; delivering individually personalized news; managing flexible website paywalls; and scraping internet data. Let’s focus on its more controversial ability to create content.

The Associated Press uses AI to write articles from reports of corporate earnings. The Washington Post uses it to write articles from high school football statistics. These are examples of smart applications, producing formulaic stories in quantity and freeing journalists for more ambitious work.*
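For readers curious what “formulaic” means in practice, here is a minimal sketch of template-based story generation from structured earnings data. The field names, template and phrasing are invented for illustration; they are not the AP’s or the Post’s actual systems.

```python
# A hypothetical, minimal sketch of template-based story generation.
# Real newsroom systems are far more sophisticated; the fields and
# wording below are made up for illustration only.

def earnings_story(report: dict) -> str:
    """Fill a fixed sentence template from structured earnings data."""
    direction = "rose" if report["eps"] > report["eps_prior"] else "fell"
    return (
        f"{report['company']} reported quarterly earnings of "
        f"${report['eps']:.2f} per share, which {direction} from "
        f"${report['eps_prior']:.2f} a year earlier, on revenue of "
        f"${report['revenue_m']:.0f} million."
    )

print(earnings_story({
    "company": "Acme Corp.",
    "eps": 1.42,
    "eps_prior": 1.10,
    "revenue_m": 315,
}))
```

The point of the sketch: the machine fills blanks in a template written by a human. It cannot get the numbers wrong if the data feed is right, and it frees no one from writing the template well in the first place.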

“Large language models” such as ChatGPT and Bard can also write whole stories from data fed to them. This is not as smart. Even though the capabilities of AI are improving rapidly, the results are too often factually wrong, dully written and generic rather than localized. Journalism garbage, in other words.

Nieman Lab recently surveyed news organizations around the world that have guidelines on AI and found that most do not allow creation of stories and photos. (Accepted uses included research, headline suggestions, social media posts and creation of illustrations.) The Knight Foundation examined 130 AI-related newsroom projects and determined that only 15% of them involved automated story generation.

The future might look different. The New York Times reported in July that Google demonstrated a story-writing program to representatives of The Times, The Washington Post and The Wall Street Journal, who saw it as potential assistance to their journalists.

AI offers tremendous potential gains in the production of standardized news stories. But writing journalism that readers will pay for demands more than that: critical thinking, context, nuance, creativity, style. And no good story can be written without the good reporting and interviewing that must come first. All that comes from pros, not programs.

Of course, news owners and managers have to recognize this, and whether they will is exactly what alarms news unions and other news staff. Bosses can’t afford to underestimate the value of high-quality work and what it takes to achieve it.

New tech is always scary. It can be misused. But it can also be a gift.


*Sometimes AI can’t even do formula stories without embarrassment. See the photo with this post and the Aug. 28 update in its caption.


Click here for my February take on how AI fits, and doesn’t fit, in a college classroom.