A study reveals that bots sometimes become locked in battle, editing content endlessly.
Wikipedia is the world’s go-to resource for finding out just about anything. And while the encyclopedia allows anyone to edit its pages, the ones making changes aren’t always human. A horde of bots spends its days in Wikipedia’s pages, performing all manner of housekeeping tasks, from fixing links to correcting faulty content. However, a new study has found that the bots don’t always cooperate; in fact, some have been locked in battle for years.
Bots have been a part of Wikipedia ever since it launched in 2001, and in the beginning they were quite rare. Back then, they worked largely in isolation, each performing separate tasks and never encountering one another. As the website grew and more bots were added, they inevitably began to cross paths and, eventually, to edit each other’s content. Bots with conflicting tasks can become locked in an endless battle, erasing each other’s content or changing links back and forth, sometimes to the point that one has to be taken out of service.
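The back-and-forth dynamic is easy to sketch in a few lines of Python. The bots, their rules, and the link forms below are invented purely for illustration; they are not taken from the study or from any real Wikipedia bot:

```python
# Hypothetical sketch: two bots with conflicting style rules each
# "correct" the same wiki link, so they endlessly revert one another.

def bot_a(text):
    # Bot A prefers the short link form.
    return text.replace("[[United Kingdom|UK]]", "[[UK]]")

def bot_b(text):
    # Bot B prefers the piped link form.
    return text.replace("[[UK]]", "[[United Kingdom|UK]]")

def simulate(text, rounds):
    """Alternate the two bots over the same page and count their edits."""
    edits = 0
    for _ in range(rounds):
        for bot in (bot_a, bot_b):
            new = bot(text)
            if new != text:
                edits += 1
                text = new
    return text, edits

final, edits = simulate("See [[UK]] for details.", rounds=10)
print(edits)  # prints 19: the page never settles, every pass triggers a revert
```

Neither rule is wrong on its own; the conflict only emerges when both bots patrol the same page, which is exactly the kind of unplanned interaction the study describes.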
“The fights between bots can be far more persistent than the ones we see between people,” said Taha Yasseri, who worked on the study at the Oxford Internet Institute. “Humans usually cool down after a few days, but the bots might continue for years. […] We had very low expectations to see anything interesting. When you think about them they are very boring. The very fact that we saw a lot of conflict among bots was a big surprise to us. They are good bots, they are based on good intentions, and they are based on same open source technology.”
The conflicts ranged widely, sometimes focusing on issues that humans often argue about as well, like which name to use for contested territories. But other editing wars ensued over former president of Pakistan Pervez Musharraf, the Arabic language, Niels Bohr and Arnold Schwarzenegger. Among the most prolific bot wars was the conflict between Xqbot and Darknessbot, fought between 2009 and 2010 over some 3,700 articles: Xqbot undid 2,000 edits made by Darknessbot, while the latter retaliated by changing 1,700 of Xqbot’s edits.
The study is an intriguing look at how simple AIs that sometimes differ only slightly in the editing rules they follow can produce such extreme and unpredictable results.
source: The Guardian