
APA 7th Citation Guide

Considerations Regarding AI Tools

  • An author bears the legal, moral, and other responsibility for the intellectual content of a publication they author. According to publishers such as Springer Nature and Elsevier, ChatGPT is not an author.
  • In its Terms of Use, OpenAI (the creator of ChatGPT) disclaims responsibility for its output:

"You accept and agree that any use of outputs from our service is at your sole risk and you will not rely on output as a sole source of truth or factual information, or as a substitute for professional advice."

Major Publishers' Viewpoints on Using AI Tools to Write Articles

Generally, publishers do not accept large language models (LLMs) as authors of a research article. Their policies also define acceptable usage and require that any use of AI tools be disclosed.

Many major publishers do not allow peer reviewers to use AI tools in the review process.

● Springer Nature

"Large Language Models (LLMs), such as ChatGPT, do not currently satisfy our authorship criteria. Notably an attribution of authorship carries with it accountability for the work, which cannot be effectively applied to LLMs. Use of an LLM should be properly documented in the Methods section (and if a Methods section is not available, in a suitable alternative part) of the manuscript. The use of an LLM (or other AI-tool) for “AI assisted copy editing” purposes does not need to be declared.[...] These AI-assisted improvements may include wording and formatting changes to the texts, but do not include generative editorial work and autonomous content creation. In all cases, there must be human accountability for the final version of the text and agreement from the authors that the edits reflect their original work."

● IEEE

"The use of content generated by artificial intelligence (AI) in an article (including but not limited to text, figures, images, and code) shall be disclosed in the acknowledgments section of any article submitted to an IEEE publication. The AI system used shall be identified, and specific sections of the article that use AI-generated content shall be identified and accompanied by a brief explanation regarding the level at which the AI system was used to generate the content. The use of AI systems for editing and grammar enhancement is common practice and, as such, is generally outside the intent of the above policy. In this case, disclosure as noted above is recommended."

● Elsevier

"Where authors use generative AI and AI-assisted technologies in the writing process, these technologies should only be used to improve readability and language of the work. [...] The authors are ultimately responsible and accountable for the contents of the work."

● arXiv

"By signing their name as an author of a paper, they each individually take full responsibility for all its contents, irrespective of how the contents were generated. If generative AI language tools generate inappropriate language, plagiarized content, biased content, errors, mistakes, incorrect references, or misleading content, and that output is included in scientific works, it is the responsibility of the author(s)."

● Other publishers: Wiley, PLOS ONE.

Mention - and DO NOT CITE! - the AI Tools

For the reasons outlined above, even though APA suggests a way to cite ChatGPT answers, we recommend that you do not create in-text citations and reference entries for answers from ChatGPT (or other AI tools). Instead, describe in your text how you used the tool.

For example, a student could write:

  • I used Dall-E to create the following image (add the link to the image if a permanent link exists): ...
  • I asked ChatGPT the question "..." and its answer was (add permanent link to answer): "..." ...
  • I corrected the spelling and grammar in my report by using ChatGPT.

Do not create in-text citations — such as OpenAI (2024) in APA style or [X] in IEEE style — or corresponding entries in the reference list!

Ask your instructor / professor if you are allowed to use an AI tool for your assignments, projects, reports, etc., and under what conditions.

Major Publishers' Viewpoints on Using AI Tools in the Peer Review Process

● Springer Nature

"Despite rapid progress, generative AI tools have considerable limitations: they can lack up-to-date knowledge and may produce nonsensical, biased or false information. Manuscripts may also include sensitive or proprietary information that should not be shared outside the peer review process. For these reasons we ask that, while Springer Nature explores providing our peer reviewers with access to safe AI tools, peer reviewers do not upload manuscripts into generative AI tools."

● Elsevier

"Generative AI or AI-assisted technologies should not be used by reviewers to assist in the scientific review of a paper as the critical thinking and original assessment needed for peer review is outside of the scope of this technology and there is a risk that the technology will generate incorrect, incomplete or biased conclusions about the manuscript. The reviewer is responsible and accountable for the content of the review report."

● Wiley

"Editors or peer reviewers should not upload manuscripts (or any parts of manuscripts including figures and tables) into GenAI tools or services. GenAI tools may use input data for training or other purposes, which could violate the confidentiality of the peer review process, privacy of authors and reviewers, and the copyright of the manuscript under review."
