Tweeted already, but consider the journal Nature’s position that it’s problematic (or dangerous) to give credit to something that is opaque about its own sources: https://www.nature.com/articles/d41586-023-00191-1
“First, no LLM tool will be accepted as a credited author on a research paper. That is because any attribution of authorship carries with it accountability for the work, and AI tools cannot take such responsibility…. Second, researchers using LLM tools should document this use in the methods or acknowledgements sections. If a paper does not include these sections, the introduction or another appropriate section can be used to document the use of the LLM.”
I cannot see myself using ChatGPT’s output wholesale enough in my writing to warrant giving it credit.
Looking more broadly, I agree with my colleague Jonathan Poritz’s assertion (https://podcast.oeglobal.org/2022/11/16/voices-43/):
“I think that giving credit to an AI is like Ansel Adams giving credit to his camera for the copyright for his photography, or Norman Rockwell giving credit to his brush. It’s a tool, a pretty impressive tool, but just a tool.”
Or if I create a mashup in Photoshop, do I give it authoring credit? If I use a beat generator for a drum track in a mix I do in GarageBand, which software gets co-author credit? If I write my paper using some kind of outlining software or something that provides alternative wording suggestions…
Citation is not the only means of being transparent. If an Artificial Intelligence tool played a major part in the writing, I would include that in a note/acknowledgement section.
But there is no absolute answer, and I certainly am not going to ask ChatGPT for its opinion, because it does not own one, nor does it have any stake in what it spits out. It’s merely a highly advanced autocompletion engine.