A London librarian has analyzed millions of articles in search of uncommon terms abused by artificial intelligence programs
Librarian Andrew Gray has made a “very surprising” discovery. He analyzed five million scientific studies published last year and detected a sudden rise in the use of certain words, such as meticulously (up 137%), intricate (117%), commendable (83%) and meticulous (59%). The librarian from University College London can only find one explanation for this rise: tens of thousands of researchers are using ChatGPT — or other similar artificial-intelligence tools based on large language models — to write their studies, or at least to “polish” them.
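The kind of year-over-year frequency comparison Gray describes can be sketched in a few lines. This is a minimal illustration, not his actual method: the corpora below are tiny made-up abstracts standing in for millions of real papers, and the rate is normalized per 1,000 words so corpora of different sizes are comparable.

```python
import re

# Hypothetical stand-in corpora; Gray's analysis used millions of
# real papers per year, not hand-written examples like these.
abstracts_2022 = [
    "We carefully examine the complex structure of lithium batteries.",
    "A detailed study of sperm precursor cells.",
]
abstracts_2023 = [
    "We meticulously examine the intricate structure of lithium batteries.",
    "A commendable and meticulous study of sperm precursor cells.",
    "Results were meticulously validated.",
]

def word_rate(abstracts, word):
    """Occurrences of `word` per 1,000 words across a corpus."""
    total, hits = 0, 0
    for text in abstracts:
        tokens = re.findall(r"[a-z]+", text.lower())
        total += len(tokens)
        hits += sum(1 for t in tokens if t == word)
    return 1000 * hits / total if total else 0.0

for word in ["meticulously", "intricate", "commendable", "meticulous"]:
    before = word_rate(abstracts_2022, word)
    after = word_rate(abstracts_2023, word)
    change = "new" if before == 0 else f"{100 * (after - before) / before:+.0f}%"
    print(f"{word}: {before:.2f} -> {after:.2f} per 1k words ({change})")
```

A rate jump for a word across years is only suggestive, of course; the article's point is that these particular words spiked together right after chatbots became widely available.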
There are blatant examples. A team of Chinese scientists published a study on lithium batteries on February 17. The work — published in a specialized journal from the publishing house Elsevier — begins like this: “Certainly, here is a possible introduction for your topic: Lithium-metal batteries are promising candidates for….” The authors apparently asked ChatGPT for an introduction and accidentally copied it in as is. A separate article in a different Elsevier journal, published by Israeli researchers on March 8, includes the text: “In summary, the management of bilateral iatrogenic I’m very sorry, but I don’t have access to real-time information or patient-specific data, as I am an AI language model.” And, a couple of months ago, three Chinese scientists published a crazy drawing of a rat with a kind of giant penis, an image generated with artificial intelligence for a study on sperm precursor cells.
In general, if it passed peer review it shouldn’t matter how it was written.
The fact that these blatant examples apparently made it past peer review shows how shoddy the process is, though.
The academic paper system has been in trouble for decades. But man, in the last 10-20 years it seems to have reached such an abysmal state that even the general public is hearing about it more and more, with news like this along with last year’s university scandals.
I hate how much time and energy is wasted on this bullshit…
You’d think the smartest people around would come up with a better system than this. I mean, they did, but some of the highest decision-makers have big incentives to keep things as they are. So mark that one more on the “capitalism ruins everything it touches” scoreboard.
¯\_(ツ)_/¯ Incentives matter in any system. The incentives are perverse right now.
So, not very meticulous?
It’s not the reviewer’s fault! When they asked ChatGPT to peer review the paper it found nothing wrong.
What gets through is very journal-specific: some journals are known to have lax review but high publication costs. These “predatory” journals, along with other nepotism problems, have been an issue for a while. The scientific community wants to tackle these issues, but it’s been hard to make any real progress. COVID politics and now AI have really not helped.
Incredible.
You’re telling me that a country with well over a billion people, producing multiple thousands of scientific papers per day, where someone’s quality of life is directly dependent on their educational certifications or attainment thereof, and which has a culture where cheating is acceptable in order to win, is bullshitting and diluting science as if their lives depended on it?
Shocked, I tell you, shocked.
Speaking with some family I still have over there, to hear them tell it at least, it’s lingering generational trauma originating from the Great Leap Forward.
Doing the “right thing” at that point in China’s history got you killed. Millions died in the name of collectivization. To survive, people did what they had to: they lied, smuggled, stole, and scammed.
The honest died, the dishonest lived, and so dishonesty became enshrined as a national virtue.
Not too different from capitalism in the West, I suppose, since no one good and honest becomes rich. But at least the poor aren’t dying in the millions yet, so people still accept the lie that hard work and integrity will result in success.
It’s sadly something that happens anywhere you get incentives and pressure to cheat: https://www.npr.org/2023/06/26/1184289296/harvard-professor-dishonesty-francesca-gino
It’s commendable that they discovered this through their meticulous research.
what the fuck is this image? Is this new biology?
you ever seen rat balls? they are huge.
i have not seen rat balls, but i’m going to assume they don’t look like the spire from fucking City 17
I wonder… could I let chatGPT get me a PhD?
Just don’t let it draw any mouse anatomy
That’s why I overuse a Thesaurus
Yes I dwindle, jade, tax, crumble, impair, weather, decrease, gall, decline, tire, decay, scrape, abrade, fade, waste, shrink, deteriorate, exhaust, erode, scuff, weary, graze, fatigue, diminish, fray, chafe, drain, overwork, grind, cut down, wear out, be worthless, become threadbare, become worn, go to seed, scrape off, use up, wash away and wear my thesaurus thin too
Fuck that website and that insane cookie confirm box.
Why couldn’t journals require authors to disclose use of any AI tool, along with the specific prompts used? It shouldn’t be too hard to manage that.
Do you think every paper writer would comply? Do you think that the actually problematic writers, like those cutting so many corners that they directly paste ChatGPT results into their paper, would comply?