As AI-generated slop takes over growing swathes of the user-generated Web thanks to the rise of large language models (LLMs) like OpenAI’s GPT, spare a thought for Wikipedia editors. On top of their usual job of rooting out bad human edits, they’re having to spend an increasing proportion of their time trying to weed out AI filler.
404 Media has talked to Ilyas Lebleu, an editor at the crowdsourced encyclopedia, who was involved in founding the “WikiProject AI Cleanup” project. The group is trying to come up with best practices for detecting machine-generated contributions. (And no, before you ask, AI is useless for this.)
A particular problem with AI-generated content in this context is that it’s almost always improperly sourced. The ability of LLMs to instantly produce reams of plausible-sounding text has even led to entire fake entries being uploaded in a bid to sneak hoaxes past Wikipedia’s human experts.