One of the first things that comes to mind when we think of Wikipedia is the collaborative, democratic effort behind the project. Basically, because we like democracy, we like Wikipedia. We like the anonymity and the ability to quickly access millions of articles, in one convenient place, about anything we ever wanted to know. We browse the website for leisure, and we even consult it during class to verify facts (or to seem smarter than we actually are). It has permeated our culture in such a way that it has become a verb, like “to Google.” It’s accessible, common, and we use it. A lot.
However, the democratic nature of the site is actually its biggest fault: people who post and edit may not be technical, unbiased experts on particular subjects, and that leads to a lack of full information. A main highlight of the site is the ability to edit articles, but, when it is unclear who is behind the text, it is uncertain how much we can rely on what the text says. It’s true that there are checks in place for some of these things, such as new software that can more accurately find and correct fallacious information. There are internal and well-known checks as well, such as the familiar call for re-editing or the note about bias, with the familiar broom icon, at the top of a contentious article. However, while Wikipedia does highlight bias and invite re-editing, it often fails to catch mistakes, especially when a reference is cited. Furthermore, it is unclear whether the re-editing will actually be more accurate and solve the initial problem. Even worse, we usually don’t react when we see an article designated as such. We realize that the bias or inaccuracy may be there, but we read it anyway as an initial source of information.
To us, these problems should not matter. We are definitely smart enough to ascertain that a popularly-edited site probably contains mistakes, right? We have always been taught that “Wikipedia is NOT a valid source, but a good starting point for background knowledge,” expressly BECAUSE of this collaborative nature. Unsurprisingly, that lesson fails to stick in many circumstances.
Take, for example, a recent article in the UK’s The Register that discusses just how lazy we are becoming. Apparently, even our journalists, the last bastion of accuracy and doggedness in pursuit of the truth, are relying on Wikipedia as a primary source. Journalists at The Guardian and The Mirror apparently used Wikipedia to write obituaries of Norman Wisdom, the comedian, singer, and actor (yes, I DID just Wikipedia him to figure out exactly who he was). There were several inaccuracies in the entry, and The Guardian still had not corrected the mistake at the time of the article. Even more telling, it was widely known that reliance on Wikipedia caused the errors, yet neither publication has acknowledged that this was, in fact, the case.
Poor Norman Wisdom is not the only person to be misrepresented by Wikipedia’s inaccuracies. Some are a little more devastating to one’s reputation than being mistaken for the author of a song or said to have been nominated for an Oscar. In an interview with NPR, the founding editorial director of USA Today had to insist that he was not, in fact, culpable in the assassinations of JFK or RFK.
In another story, golfer Fuzzy Zoeller sued to find the author of his Wikipedia page, who had defamed him in a number of ways. This brings up a host of legal issues. The anonymity is the selling point, but, when things are inaccurate, how anonymous should contributions to Wikipedia be? What are the future legal implications of this suit? Does the fact that Zoeller sued at all, clearly caring about how he was characterized via this PARTICULAR channel, show our continued dependence on it? Should Wikipedia be treated like any other news source? How far do our First Amendment rights extend to a place like Wikipedia and the internet?
While this is a humorous example, it does highlight the issue of collaboration: anyone can write anything (at least for a time). The process of tracking down these mistakes is slow, and the inaccuracies often go unnoticed, especially if tied to ANY reference (it’s unclear whether the references have to be “reliable” or “expert” sources, although Wikipedia likes to claim it won’t allow unpublished references to contribute to entries). The worst part is that those whom we expect to seek the truth and keep us informed when we can’t do so ourselves are treating it as ironclad truth.
Wikipedia is a great tool, but are we addicted and blinded in such a way by the communitarian nature and the ease of access that we fail to see when something is wholly inaccurate?