Computer scientists from the University of Oxford and the Alan Turing Institute in the UK analyzed how both human editors and bots interact on Wikipedia. Anyone can edit a Wikipedia entry, which is why human volunteers and bots built specifically to check facts and edits are so important. Editors sometimes make back-and-forth changes before reaching a final decision, and the study found the same goes for bots. Interestingly, bots on some pages of Wikipedia ‘argued’ for years before finally settling.
Robot debate
You might be surprised to learn that most of the internet’s traffic isn’t created by humans. Overall, bots are responsible for 52 percent of web traffic, according to the security firm Imperva. Most of these bots are malicious, designed to attack websites by scraping content, spamming comments, brute-forcing passwords, or injecting malicious content such as malware. But many bots are benign and serve a wide range of functions, most notably crawling. Facebook’s feed fetcher, by itself, accounted for 4.4 percent of all website traffic.
Wikipedia has its own bots and, frankly, the website couldn’t run without them. These bots edit millions of pages every year and are responsible for the bulk of mindless jobs such as formatting sources and links. Some can even start new pages with minimal content — entries known as stubs — to get the conversation going.
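To make this concrete, here is a minimal sketch of what such a maintenance bot can look like, using Pywikibot, the Python framework many Wikipedia bots are built on. The target page and the cleanup rule are hypothetical examples, not taken from any real bot.

```python
# Minimal sketch of a Wikipedia maintenance bot built on Pywikibot.
# The page title and the replacement rule below are hypothetical.
import pywikibot

site = pywikibot.Site("en", "wikipedia")     # English Wikipedia
page = pywikibot.Page(site, "Example page")  # hypothetical target page

text = page.text
# Toy cleanup rule: upgrade plain-HTTP links to HTTPS.
fixed = text.replace("http://example.com", "https://example.com")

if fixed != text:
    page.text = fixed
    # Bot edits carry an edit summary, just like human edits do.
    page.save(summary="Bot: switching example.com links to HTTPS")
```

Running a bot like this on a live Wikipedia requires an approved bot account, and edits made this way show up in page histories flagged as bot edits.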
To get a sense of the scale involved, Thomas Steiner from Google Germany monitored bot activity across all 287 language versions of Wikipedia, as well as on Wikidata. In 2014, he found that human and bot edits were roughly evenly split across all language versions of Wikipedia combined, but with huge variation from language to language: only 5 percent of the edits to the English-language version were made by bots, compared to 94 percent of the edits to the Vietnamese version. And despite all this bot activity, Wikipedia has fared well in independent accuracy studies.
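Because the MediaWiki software flags bot edits explicitly, measuring the bot share of edits is straightforward. The sketch below is our own rough approximation rather than Steiner’s method (he monitored the live recent-changes feeds continuously); it samples only the most recent edits through the public MediaWiki API, and the helper name is hypothetical.

```python
# Rough estimate of the bot share of recent edits on one language edition,
# via the public MediaWiki recent-changes API. This samples only the most
# recent edits, so it is a snapshot, not a long-term measurement.
import requests

def bot_edit_share(lang: str, sample: int = 500) -> float:
    """Return the fraction of the last `sample` edits flagged as bot edits."""
    url = f"https://{lang}.wikipedia.org/w/api.php"
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "flags",            # includes the per-edit 'bot' flag
        "rctype": "edit",
        "rclimit": min(sample, 500),  # API cap for anonymous requests
        "format": "json",
    }
    changes = requests.get(url, params=params, timeout=30).json()["query"]["recentchanges"]
    return sum(1 for rc in changes if "bot" in rc) / len(changes)

print(f"en: {bot_edit_share('en'):.0%} of recent edits are bot edits")
print(f"vi: {bot_edit_share('vi'):.0%} of recent edits are bot edits")
```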
Professor Taha Yasseri from the Oxford Internet Institute and colleagues wanted to see how these bots interact, given that some of them overlap in their tasks. The team looked at Wikipedia entries across 13 different language editions over ten years (2001 to 2010). The study suggests there are many instances where the bots interacted with unpredictable consequences.
Remarkably, the bots behaved more like humans than expected, changing behavior depending on the cultural context. Bots on the German edition, for instance, had the fewest conflicts, with each undoing another’s edits 24 times on average. Bots on the Portuguese edition, however, quarreled over edits as many as 185 times on average over the ten-year monitoring period. On the English edition, there were 105 back-and-forth edits, or three times the rate of human reverts.
Some of these fights could rage on for years. In some situations, there were deadlocks as the bots would repeatedly undo one another.
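Broadly speaking, the study counted a revert whenever an edit restored an article to the exact text of an earlier revision, something that can be detected by comparing revision content hashes, and a ‘fight’ whenever two editors kept reverting each other. The toy sketch below illustrates that idea; the function name and the input format (chronological editor and content-hash pairs) are our own, not the paper’s code.

```python
# Toy revert detector: an edit counts as a revert when it restores the
# article to the exact content of an earlier revision (same hash).
def find_reverts(revisions):
    """Yield (reverter, reverted) pairs from (editor, sha1) history."""
    seen = {}  # content hash -> index of the latest revision with that text
    for i, (editor, sha1) in enumerate(revisions):
        if sha1 in seen and i - seen[sha1] > 1:
            # This edit restored an earlier state: every editor in between
            # has just been reverted by the current editor.
            for reverted_editor, _ in revisions[seen[sha1] + 1 : i]:
                yield (editor, reverted_editor)
        seen[sha1] = i

# Two bots undoing each other in turn -- a mutual revert, the unit of
# conflict counted in the study.
history = [("BotA", "h1"), ("BotB", "h2"), ("BotA", "h1"), ("BotB", "h2")]
print(list(find_reverts(history)))  # [('BotA', 'BotB'), ('BotB', 'BotA')]
```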
“We find that bots behave differently in different cultural environments and their conflicts are also very different to the ones between human editors. This has implications not only for how we design artificial agents but also for how we study them. We need more research into the sociology of bots,” said Dr Milena Tsvetkova, from the Oxford Internet Institute.
The findings, published in PLOS ONE, should come as a warning to developers who write code for artificial intelligence systems ranging from autonomous driving to cybersecurity to managing social media. Developers should be aware of the bots’ different cultural contexts, or rather the cultural context of the human designers.
“The findings show that even the same technology leads to different outcomes depending on the cultural environment. An automated vehicle will drive differently on a German autobahn to how it will through the Tuscan hills of Italy. Similarly, the local online infrastructure that bots inhabit will have some bearing on how they behave and their performance. Bots are designed by humans from different countries so when they encounter one another, this can lead to online clashes. We see differences in the technology used in the different Wikipedia language editions and the different cultures of the communities of Wikipedia editors involved create complicated interactions. This complexity is a fundamental feature that needs to be considered in any conversation related to automation and artificial intelligence,” Yasseri said.
This paper is also one of the few that studies ‘bot sociology.’ As more and more AI systems come to interact with one another, as well as with humans, it will be very important to understand how those interactions unfold, so that unintended, possibly catastrophic consequences can be avoided.