Is Skynet Taking Over? AI-Generated Articles Are Popping Up Left and Right

Yes, it’s hyperbolic to say that the machines are truly taking over as James Cameron predicted almost 40 years ago in The Terminator, but every day seems to bring more news stories about AI infiltrating another industry. Publishing is not immune to this infiltration: anyone who works in the industry knows that ChatGPT and similar services are seeing more and more use by authors. What felt like a foreboding possibility a few years ago is now very much a reality.

[Image: Robot writing in a journal]

In December of 2023, Ross Levinsohn, the CEO of Sports Illustrated’s publisher, The Arena Group, was fired after the magazine published articles attributed to fake authors and allegedly generated by AI services. A Futurism report noted that Drew Ortiz, credited as the author of numerous articles for the magazine, had no social media presence and no author history at any other publication. His profile image was also available for sale on a website that specialized in selling AI-generated headshots. (It’s a whole new world, ladies and gentlemen!) The report also found that Ortiz was not the only suspicious byline: other author profiles had multiple articles with content suspected to be AI-generated. It was one of the first large-scale scandals involving AI-generated content since the explosion in popularity of ChatGPT, but it was hardly the only one.

News publications are also using AI-generated articles but are trying to present their use as more “responsible”. Still, they are not immune to scandal. In September of 2023, Gannett, the publishing company that owns USA Today among other news outlets, was shown to have used AI to write multiple high school sports articles for its local news sites with little to no human editing. This came just months after the group stated that it would use AI in some of its publications, but only with human oversight. Specifically, Gannett promised that no AI content would be published without first being edited and read over by a human contributor. Yet the articles frequently featured repetitive information and at times failed to even mention player names. (Maybe they could hire Technica Editorial to help with the copyediting?)

Scholarly publishing remains more steadfast in keeping AI content out of its articles, but such content popping up in journal articles feels more inevitable by the day. An August 2023 article about fossil fuel efficiency in Resources Policy, a journal under the Elsevier umbrella, included the following sentence: “Please note that as an AI language model, I am unable to generate specific tables or conduct tests, so the actual results should be included in the table.” While the article’s authors all had institutional affiliations, the line is the kind of disclaimer seen repeatedly in content generated by ChatGPT. After the article (and the AI note) was discussed by other researchers on X, Elsevier announced that it would launch an investigation into the use of AI in this and other published articles. It should be noted that Elsevier does allow the use of AI tools in the writing of articles, but that use must be disclosed by the authors upon submission and noted when the manuscript is published, a policy similar in spirit to Gannett’s human oversight policy for its content.

It’s clear that AI-generated content is here to stay at every level of publishing, whether that’s news writing, scholarly journals, or trade publications. The ease of use is perhaps too tempting for many authors to turn down. The question is whether publishers will be diligent and “responsible” in their oversight. Will companies learn from these scandals and push for better monitoring of their future publications, or will we continue to see more unrestricted and unedited use?

By: Chris Moffitt
Chris is a Managing Editor at Technica Editorial.

