WASHINGTON: Wikipedia has launched a tool designed to automatically highlight low-quality edits to articles. The Objective Revision Evaluation Service software has been trained by Wikipedia editors to recognise the quality of an edit based on the language and context of the change.
There are about half a million changes to Wikipedia articles every day. Editors and ordinary users will now be able to quickly check how likely it is that a proposed alteration is “damaging”.
“This allows editors to triage them from the torrent of new edits and review them with increased scrutiny,” the Wikimedia Foundation said.
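In practice, triage of this kind amounts to ranking incoming edits by their predicted probability of being damaging and surfacing the riskiest ones first. The sketch below is purely illustrative: the score format, the revision IDs, and the 0.8 threshold are assumptions for the example, not details of the ORES service itself.

```python
def triage_edits(scored_edits, threshold=0.8):
    """Return edits whose damaging probability meets the threshold,
    most suspicious first, so reviewers can focus scrutiny there.
    (Illustrative only; not the actual ORES interface.)"""
    flagged = [edit for edit in scored_edits if edit[1] >= threshold]
    return sorted(flagged, key=lambda edit: edit[1], reverse=True)

# Hypothetical scores: (revision id, predicted probability the edit is damaging)
scores = [(1001, 0.03), (1002, 0.95), (1003, 0.81), (1004, 0.40)]
print(triage_edits(scores))  # → [(1002, 0.95), (1003, 0.81)]
```

A reviewer working from such a ranked list sees the two most suspicious edits first, while the half-million routine changes each day pass through without demanding attention.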
Other projects to engage artificial intelligence (AI) in the task of evaluating Wikipedia edits have not always been well received.
Some, for instance, have automatically downgraded the input of new editors, which has been seen as problematic for well-intentioned newcomers.