Blaze News investigates: AI-generated content may replace news aggregation, but it cannot 'craft the prose that moves humans'

Photo by Malcolm Stroud/Picture Post/Hulton Archive/Getty Images

AI may take over the 'low-effort' online writing jobs, but there is still much that it cannot do.

AI-generated content has seeped into the media sphere over the last couple of years, raising questions about how it could impact the integrity of journalism and other digital content in the coming years.

A "content solutions" company called AdVon recently made headlines after being exposed for creating AI-generated content and using fake bylines for product reviews. These reviews were later published by major outlets such as Hollywood Life, the Los Angeles Times, USA Today, and Us Weekly.

AdVon was also at the center of the scandal that engulfed Sports Illustrated, which published many articles under fake bylines. The magazine's publisher eventually lost the rights to the Sports Illustrated name.

'AI will never be able to practice investigative journalism or have the style of a Jason Whitlock or a Matt Taibbi or pick your favorite writer.'

In 2023, Arena Group — the publisher of Sports Illustrated — said the "articles in question were product reviews and were licensed content from an external third-party company, AdVon Commerce. A number of AdVon’s e-commerce articles ran on certain Arena websites. We continually monitor our partners and were in the midst of a review when these allegations were raised. AdVon has assured us that all of the articles in question were written and edited by humans.”

Despite the fallout, AdVon continues to operate, publishing AI-generated content for major outlets under fake bylines.

But it did not start this way. AdVon initially hired overseas contractors to write product reviews. Some of these contractors claimed that their reviews were later used to train language models before their jobs were replaced entirely by artificial intelligence. The writers were then reassigned to editing and polishing AI-generated content.

Samuel Hammond, senior economist at Foundation for American Innovation, told Blaze News that "generative AI is enabling the rise of massive content farms that churn out videos, articles, and other forms of online media at staggering scales. That includes articles designed to look like original news or reporting and content strategies for optimizing placement in search engines."

AdVon not only raises questions about how AI may be used to mislead people who are looking at these product reviews, but it also raises broader questions about AI's rightful place within the media landscape. The future of AI's push into news media could be more significant than producing fake product reviews under fake bylines.

AI currently has a bias problem

Ken LaCorte, former Fox News executive and host of the "Elephants in Rooms" YouTube channel, told Blaze News that one of the fundamental issues with AI right now is that it has the "built-in leftist bias of Silicon Valley embedded into it, and that's going to be a fight [for conservatives]."

He went on to say that the media "shouldn't be afraid that AI might cause them to report some bad facts. The media's problem with credibility right now isn't because they report bad facts. They actually do a pretty good job of reporting truth from falsehood."

"The media's credibility has been shattered because they're pretending to be non-biased when, in fact, their main job is to get those facts and shoehorn them in a way to have you vote for their candidate."

In 2023, academics at the University of East Anglia in the U.K. conducted a study in which they asked OpenAI's ChatGPT a series of political questions. The questions were structured so that each answer could be decisively identified as reflecting a Republican, Democratic, or unspecified position. The chatbot's responses were then mapped onto the current political landscape.

The researchers said they found "robust evidence that ChatGPT presents a significant and systematic political bias toward the Democrats in the U.S., Lula in Brazil, and the Labour Party in the U.K."

The New York Post reported that ChatGPT had already shown its hand by refusing to write a story about Hunter Biden in the style of the Post, but it did not seem to have a problem with writing a story about him in the style of CNN.

Despite these alarming revelations, many journalists are leveraging AI technologies to help them complete work-related tasks.

How journalists are leveraging AI tools

In April, Forbes reported on the Associated Press' Generative AI in Journalism study, which found that close to 70% of professionals in the news industry have leaned on AI to help them produce content. Other popular uses of the technology in the industry include business tasks, information gathering, and multimedia content. Even as they use AI, these professionals have expressed concern about how generative AI could affect their profession in the future.

Felix Simon, who published a white paper for the Tow Center for Digital Journalism at Columbia, said that there was no reason to believe that AI presented an existential threat to journalism.

Simon wrote in the paper that AI “mostly constitutes a retooling of the news rather than a fundamental change in the needs and motives of news organizations. It does not impact the fundamental need to access and gather information, to process it into 'news,' to reach existing and new audiences, and to make money.”

Simon's comment about "a retooling of the news" could also mean the end of human-driven news aggregation, which is currently widespread across major news outlets.

"Over the last decade, cable news channels and major papers like the New York Times have closed foreign bureaus and slashed budgets for investigative reporting in favor of political clickbait and sensationalist content," Hammond said.

"They were arguably forced to move away from good old-fashioned journalism to simply stay afloat in the internet age. But with AI, content generation is a game traditional media outlets simply can't win, at least without compromising their values and integrity. If mainstream media becomes one AI content farm among many, they will cease to exist," Hammond said.

However, Peter Gietl, Blaze Media managing editor for Return, said that "there's no reason we need this many [human] aggregators, and AI taking over is not a net loss for writing in the totality." The argument suggests that if AI takes over the news aggregation part of media, talented journalists will still be able to create original long-form news stories that appeal to a wide audience.

"What this will cause is, hopefully, readers to appreciate well-reasoned, intelligent, and original writing possessing some personality and verve. I haven't seen any evidence AI can create this type of original and fascinating writing. AI will never be able to practice investigative journalism or have the style of a Jason Whitlock or a Matt Taibbi or pick your favorite writer."

"I think X and Substack have shown there is a hunger for original thought and superlative style," Gietl continued. "AI will replace a lot of low-effort, boring content creation. However, it will never be able to replace the men and women crafting the prose that moves humans."

LaCorte appeared to agree, noting that "AI can't stand in front of a courthouse, or call up a lawyer and ask their opinion on something, or interview a plane crash victim."

If there is a future in which AI takes over news aggregation and other "low-effort" writing tasks, many freelancers and copywriters could be out of a job. However, Gietl said he believes "there were way too many writers to begin with," adding that Alphabet, Google's parent company, is primarily to blame. He went on to say that Google decided the best way for websites to "be rewarded with traffic and rise in the rankings was to have original content."

This ultimately led to the "proliferation of terribly composed, incoherent writing, existing with the sole purpose of getting clicks and selling products or services," Gietl said.

While it remains uncertain how far AI will push into the media landscape in the near future, it is here to stay in one form or another.
