Wikipedia Is An Ecosystem Full Of Bots And It’s SpongeBob SquarePants’ Fault

Wikipedia is the largest repository of human knowledge ever accumulated. It has editions in hundreds of languages. It is both the savior and the bane of every college student. All of the answers are there, but your professor won't let you use them because Wikipedia is a den of scum and villainy. It's curated by non-experts and vandals with no integrity, right? Nope. It's a giant library that's mostly (okay, maybe not mostly) run by robots!

The more complex articles are also defended by bots against errors and vandalism, and it all started with SpongeBob SquarePants. You see, ten years ago a band of roving trolls turned their eyes to Wikipedia and decided to insert references to Squidward all across the site. Wikipedia's users responded with a simple bot that detected and reverted edits mentioning Squidward. Since then, an entire ecosystem of bots has grown up to work with, against and alongside human users.
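To get a feel for how simple that first line of defense could be, here is a minimal sketch of a Squidward-watcher in Python. This is not the original bot's code; it just polls Wikipedia's public MediaWiki API for recent edits and flags any diff that mentions the offending squid. Actually reverting would require an authenticated bot account, so this version only raises the alarm.

import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_changes(limit=50):
    # Pull the latest edits from Wikipedia's public recent-changes feed.
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|user",
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    return requests.get(API, params=params).json()["query"]["recentchanges"]

def diff_text(old_revid, revid):
    # Ask the API for the rendered diff between two revisions.
    params = {
        "action": "compare",
        "fromrev": old_revid,
        "torev": revid,
        "format": "json",
    }
    return requests.get(API, params=params).json()["compare"]["*"]

for change in recent_changes():
    if change["old_revid"] == 0:
        continue  # brand-new page, nothing to diff against
    if "Squidward" in diff_text(change["old_revid"], change["revid"]):
        # A real bot would revert here; we just print a warning.
        print(f"Suspicious edit to '{change['title']}' by {change['user']}")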

Bots are responsible for a huge share of the edits on the English Wikipedia, and a single one, ClueBot NG, is among the most prolific editors on the site. It constantly scours incoming edits for offensive language and obvious misspellings and reverts or repairs them. There are all kinds of bots running across Wikipedia's database, operated from their owners' own computers, writing, editing and keeping content safe from simple mistakes and malicious attacks.
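The original ClueBot leaned on hand-written heuristics in the spirit of the toy scorer below; ClueBot NG later swapped those rules for a neural network trained on human-classified edits. To be clear, the word lists, patterns and weights here are invented for illustration and are not taken from either bot.

import re

# Hand-written heuristics in the spirit of early anti-vandalism bots.
PROFANITY = re.compile(r"\b(?:poop|stupid|butts?)\b", re.IGNORECASE)
SHOUTING = re.compile(r"[A-Z]{12,}")    # long runs of capital letters
KEYMASH = re.compile(r"(.)\1{6,}")      # "aaaaaaa"-style repetition

def vandalism_score(added_text: str) -> int:
    # Higher score = more suspicious; each signal adds weight.
    score = 0
    if PROFANITY.search(added_text):
        score += 2
    if SHOUTING.search(added_text):
        score += 1
    if KEYMASH.search(added_text):
        score += 1
    return score

def should_revert(added_text: str, threshold: int = 2) -> bool:
    return vandalism_score(added_text) >= threshold

print(should_revert("JIMMY IS STUPID aaaaaaaaa"))   # True
print(should_revert("Added 2023 census figures."))  # False

Scoring against a threshold instead of hard-matching matters: a single rude word inside a direct quotation shouldn't trigger a revert on its own, and real anti-vandalism bots weigh many more signals, including the editor's track record.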

It turns out that no matter what format humanity chooses for gathering its knowledge, that knowledge must still be protected from other humans. There are always barbarians at the gates looking to burn it all to the ground. Our library is now too big, and too easy to access, for people to defend directly, so we built it an immune system of its own. It's a library that defends itself from attack, and it learned to do it because of a grumpy squid with annoying neighbors.

A huge number of articles are even written by bots. Lsjbot writes up to 10,000 articles a day, most of them stubs: the short, dry articles you find when you look up an obscure bug or a little-known city. The bot's contributions account for an amazing 8.5% of all articles on Wikipedia, and many of them may never be touched by human hands. It grabs information from across the web, cross-references it against other approved sources and then puts it all together. This means that not only can these articles be factually accurate, but they are also some of the most densely referenced articles on Wikipedia.
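That assembly line is easier to picture with a toy example. Lsjbot's real pipeline pulled records from taxonomy databases such as the Catalogue of Life; the sketch below only demonstrates the template-filling idea, and the record, field names and template are all made up for illustration.

# A made-up record standing in for one row of a species database.
species = {
    "binomial": "Exemplum exemplum",
    "family": "Carabidae",
    "described_by": "Linnaeus, 1758",
    "source": "Catalogue of Life",
}

# Wikitext skeleton; {{{{...}}}} renders as literal {{...}} after format().
STUB_TEMPLATE = (
    "'''{binomial}''' is a species of beetle in the family "
    "[[{family}]]. It was first described by {described_by}."
    "<ref>{source}</ref>\n\n{{{{Beetle-stub}}}}"
)

def make_stub(record: dict) -> str:
    # Fill the template with one database record to produce an article body.
    return STUB_TEMPLATE.format(**record)

print(make_stub(species))

Run that loop over a million database rows and you get Lsjbot's output: formulaic, yes, but sourced, consistent and instantly available.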

Disparage it all you like, but the most accessible, diverse accumulation of knowledge in history is no mean feat. Wikipedia is not only impressive as a collection of information, but as a technological marvel. It expands, improves and defends itself. We’ve used our knowledge to gather and prepare further knowledge for ourselves, and it’s just gorgeous.

Interested in a deeper look at the workings of these bots? Check out this academic paper on Wikipedia's bespoke code!

