A.I. Like ChatGPT Is Revealing the Insidious Disease at the Heart of Our Scientific Process
The language in Nature was pretty mild as far as freakouts go. ChatGPT and other similar A.I. tools, the editors wrote, threaten "the transparency and trustworthiness that the process of generating knowledge relies on … ultimately, research must have transparency in methods, and integrity and truth from authors." The editor of Nature's chief rival, Science, similarly blew his stack in a most genteel manner: "An AI program cannot be an author. A violation of these policies will constitute scientific misconduct no different from altered images or plagiarism of existing works," he wrote. These might seem like gentle warnings, but to academics who submit research papers to peer-reviewed journals like Science and Nature, the specter of being charged with research misconduct (potentially a career-wrecking accusation) for using A.I. is about as subtle as an air-raid siren.
Jan-31-2023, 17:21:19 GMT