Wikipedia: Time for a reality check
The reason for Wikipedia's decision to tighten editorial oversight? Too many errors, too much unreliable information, and a growing realization that Wikipedia's position as a trusted information source could be placed at risk unless more controls are placed over its freewheeling editorial process. In a nutshell, the announcement underscores the limitations of serious user-generated content on the Web and reaffirms the need for professional or dedicated curators to ensure the quality of information and to guard against self-dealing or self-interested postings.
Not that the Wikimedia Foundation would admit it had failed. In fact, the Foundation has not even acknowledged this significant change on its own wiki, though it came less than a month after Wikipedia posted its 3 millionth article. But the announcement can be read only as an admission that adult supervision has become necessary to maintain the integrity of Wikipedia -- and, by extension, the integrity of all user-generated content that purports to be authoritative.
Reports of edit wars over Wikipedia entries have been common in the blogosphere for some time now. And in the past year, many critics have begun to question the veracity of information found on Wikipedia. As Simson L. Garfinkel, a noted computer expert, recently wrote in MIT's Technology Review, "With little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word 'truth.'" And he doesn't mean that in a good way. Criticism of Wikipedia's unreliability became prominent and vocal enough that Wikipedia itself posted a page addressing it.
Equally troubling, as this post on Slashdot shows, the media came to rely on Wikipedia so heavily that it unwittingly repeated supposed facts gleaned from the online encyclopedia that turned out to be incorrect. With some of the best-known authors of the day republishing (albeit inadvertently) large chunks of Wikipedia verbatim, one wonders how bad the misinformation could become as the facts degrade into a giant game of telephone.
Less surprising but also disturbing, corporations and interested parties have often edited entries relating to themselves to spin the coverage in a more positive light. In 2005, as this Wired.com article explains, the Diebold Corporation (DBD) allegedly altered and redacted entries covering scandals over security issues with its voting machines. Tools that track who is changing Wikipedia entries have repeatedly revealed other companies and their agents up to similar tricks.
Wikipedia is not the only user-generated content site to suffer this fate. TripAdvisor.com, a website of user-generated travel reviews, has had major problems with hotels rigging votes to inflate their popularity rankings, which in turn pushes those properties much higher in TripAdvisor search results. Dozens of business owners have been the targets of what they claim were defamatory and inaccurate reviews on Yelp.com, the user-generated yellow-pages derivative that features ratings for millions of businesses around the country. Further, some business owners have alleged that Yelp itself offered more favorable treatment, or the removal of negative reviews, in exchange for advertising on the site.
That Wikipedia has maintained so much of its non-hierarchical architecture is perhaps a testament to the strength and rigor of its avid cadre of serious volunteer editors. And, without a doubt, much of what is published on Wikipedia is true and factually accurate. But dreams die hard, and founder Jimmy Wales' vision of a self-policing information-sharing and collation site now appears to have collided with a less-than-ideal reality.