The Wikipedia Robots

August 10, 2012 in Daily Bulletin

The invaluable Wikipedia that, let’s face it, we all rely on owes a large part of its success to the automated robots that scan the encyclopedia, writes Daniel Nasaw:

  • Over 700 bots patrol Wikipedia’s 4 million articles.
  • These bots detect and remove vandalism, categorize posts, fix references, and complete other invaluable tasks.
  • The first robots pulled data from the US census to create articles on small towns. They could produce thousands of articles a day, but the resulting articles were short and formulaic, containing only basic information.
  • Today robots aren’t allowed to create articles, but they help maintain the Wikipedia experience.

To read more, including why human writers don’t have to worry about being replaced just yet, why the bots aren’t like cars, the low rate of false positives, the potential for one of the robots to go haywire, and what would happen if the bots went on strike, click here.

Source: BBC