
AI nonsense found in over 100 research papers

11:24 AM March 23, 2024

If you finished college, you probably remember writing research papers. You performed an experiment or project and then detailed your findings in a document with a hundred pages or more. 

Most remember them as a hassle everyone must endure to graduate. However, they have larger real-world significance. 



IT firm Nogor Solutions Limited said on its Medium page that “research papers can inform policy decisions, influence industry practices, and contribute to technological advancements.”


However, a recent investigation reveals that over 100 research papers contain telltale signs that ChatGPT wrote them. Many reportedly contain gibberish, despite undergoing peer review.

How AI nonsense spread into research papers

UK news outlet Daily Mail reported on findings from tech journalism site 404 Media, which found that 115 papers indexed in Google Scholar, Google’s search tool for academic papers, contained the phrase “As of my last knowledge update.”

At the time of writing, the same search returns 186 results. The tech news site reported that the dates cited in these papers corresponded with ChatGPT’s knowledge updates.

In other words, the phrase marks the chatbot’s knowledge cutoff at the time it was prompted, and it appeared in papers on topics such as: 

  • Spinal injuries
  • Battery technologies
  • Rural medicine
  • Bacterial infections
  • Cryptocurrency
  • Children’s well-being
  • Artificial intelligence
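
For the curious, the check behind those numbers boils down to simple string matching: scan a paper’s text for boilerplate that a chatbot emits but a human author would not write. Below is a minimal Python sketch of that idea; the phrase list and function name are illustrative assumptions, not 404 Media’s actual tooling.

```python
"""Minimal sketch: flag telltale ChatGPT boilerplate in a paper's text."""

# Illustrative phrase list. 404 Media searched Google Scholar for the first
# entry; the other entries are assumptions based on common chatbot replies.
TELLTALE_PHRASES = [
    "as of my last knowledge update",
    "as an ai language model",
    "certainly, here is a possible introduction for your topic",
]


def find_ai_boilerplate(text: str) -> list[str]:
    """Return every telltale phrase found in the text (case-insensitive)."""
    lowered = text.lower()
    return [phrase for phrase in TELLTALE_PHRASES if phrase in lowered]


if __name__ == "__main__":
    sample = (
        "Certainly, here is a possible introduction for your topic: "
        "As of my last knowledge update, battery technologies have advanced rapidly."
    )
    print(find_ai_boilerplate(sample))
    # Prints the two matching phrases found in the sample text.
```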

Some papers used the phrase to explain the problems with using ChatGPT for research. However, Daily Mail says many are barely intelligible, such as a paper titled “Global Education Iducation and International Education Advocacy.”

Kolina Koltai, a member of the open-source research group Bellingcat, posted an example of AI nonsense in one academic paper.


Her screenshot shows ChatGPT’s chirpy reply left in the paper: “Certainly, here is a possible introduction for your topic.”

Daily Mail reports that academic researchers are turning to ChatGPT to write papers because of immense pressure from their universities to publish.

Scientists are more likely to land new jobs and promotions when they publish frequently; hence the phrase “publish or perish” became shorthand for this pressure.

Conventional academic review takes months or years because other scientists will check papers and request multiple revisions.

On the other hand, “paper mills” accept nearly any submission as long as the author pays the publishing fee.

For instance, the International Journal of New Media Studies allegedly published two different papers containing the phrase “As of my last knowledge update,” even though the journal claims to conduct peer review.

What are the risks of AI nonsense in papers?


Another Inquirer Tech article warns that ChatGPT makes it easier for researchers to cut corners and publish AI-generated papers faster.

Worse, some may spread misinformation on reputable platforms by sneaking in convincing, AI-generated falsehoods.

Sandra Wachter, an Oxford Internet Institute researcher who specializes in artificial intelligence, said she felt “very worried.” She said, “We’re now in a situation where the experts are not able to determine what’s true or not.”


The AI researcher fears we might “lose the middleman that we desperately need to guide us through complicated topics.”

TOPICS: Artificial Intelligence, technology
