
Automated Wikipedia Edit-Bots Have Been Fighting Each Other For A Decade

In 13 different languages.

27/02/2017 9:16 PM AEDT | Updated 27/02/2017 11:58 PM AEDT

It turns out Wikipedia's automated edit 'bots' have been waging a cyber-war against each other for more than a decade, undoing one another's corrections -- and it's getting worse.

Researchers at the University of Oxford in the United Kingdom released a report on Thursday showing that bot software active between 2001 and 2010 -- software designed to "undo vandalism, enforce bans, check spelling, create inter-language links... [and] identify copyright violations" -- has been reverting changes made by other Wikipedia bots far more often than those made by humans.

"We find that, although Wikipedia bots are intended to support the encyclopedia, they often undo each other's edits and these sterile 'fights' may sometimes continue for years," the study reads.

"Unlike humans on Wikipedia, bots' interactions tend to occur over longer periods of time and to be more reciprocated."

The research covers bots from 13 different language editions of Wikipedia across the first ten years of the website's operation. It found that, since 2001, the frequency of bots reverting changes made by other bots has been consistently increasing -- although the rate depends on the language.

"Over the ten-year period, bots on English Wikipedia reverted another bot on average 105 times," the report reads.

"Bots on German Wikipedia revert each other to a much lesser extent than other bots (24 times on average). Bots on Portuguese Wikipedia, in contrast, fight the most, with an average of 185 bot-bot reverts per bot."

Researchers also found that, compared to humans who make edits on Wikipedia pages, bots tend to take a lot longer to revert another bot's changes and are more likely to match the change a previous bot has applied.

In other words, if one bot edits an article, another bot is likely to change it back to exactly what it was before -- even if it takes a while.

Think of it as an online form of bickering between two robots that has actually lasted for more than 10 years.
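The mechanics of such a "sterile fight" are simple enough to sketch. The toy simulation below is not from the study -- the bot names and link text are made up for illustration -- but it shows how two rule-based bots with conflicting ideas of the "correct" version of a page can revert each other indefinitely:

```python
# Toy sketch (not from the Oxford study): two rule-based bots, each with
# a different "preferred" version of the same inter-wiki link, patrol the
# same article and keep reverting each other in an endless loop.

def make_bot(name, preferred):
    """Return a bot that rewrites the article to its preferred text."""
    def edit(article):
        if article != preferred:
            return preferred, f"{name} reverted to {preferred!r}"
        return article, None  # nothing to do -- article already "correct"
    return edit

# Hypothetical bots with conflicting rules for the same link.
bot_a = make_bot("BotA", "[[fr:Chat]]")
bot_b = make_bot("BotB", "[[fr:Chats]]")

article = "[[fr:Chat]]"
log = []
for _ in range(6):  # six patrol passes over the article
    for bot in (bot_a, bot_b):
        article, action = bot(article)
        if action:
            log.append(action)

# Every pass after the first produces two fresh reverts; neither bot
# ever "wins", and the revert count grows without bound.
print(len(log))
```

Each extra patrol pass adds two more reverts to the log -- the digital equivalent of the years-long back-and-forth the researchers observed.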

So which particular bots have been responsible for the disagreements?

According to the findings, bots can be split into two groups -- benevolent and malevolent. Benevolent bots are the good guys of Wikipedia, built to help maintain the site by fixing links, undoing vandalism and correcting errors. Malevolent bots are the ones built to spam, vandalise or otherwise abuse the site.

As it turns out, though, on Wikipedia even the good guys create problems.

"We found that most of the disagreement occurs between bots that specialise in creating and modifying links between different language editions of the encyclopedia," researchers said.

"The same bots are responsible for the majority of reverts in all the language editions we study. For example, some of the bots that revert the most other bots include Xqbot, EmausBot, SieBot, and VolkovBot, all bots specialising in fixing inter-wiki links.

"In the case of Wikipedia, we see that benevolent bots that are designed to collaborate may end up in continuous disagreement. This is both inefficient as a waste of resources, and inefficacious."

In the worst cases, Xqbot and Darknessbot managed to clash on 3,629 different Wikipedia articles within a year, while the Japanese Tachikoma bot knocked heads with Russbot on more than 3,000 articles over two years, according to the Guardian.

The results of the report serve as a stark reminder that even the simplest website algorithms can interact unpredictably -- and before you know it, you're ten years into an endless spell-check battle.

Looks like it could be back to the drawing board for Wikipedia.
