One of the first things that comes to mind when we think of Wikipedia is the collaborative, democratic effort of the project. Basically, because we like democracy, we like Wikipedia. We like the anonymity, the ability to quickly access millions of articles about anything we ever wanted to know, all in one convenient place. We browse the site for leisure, and we even consult it during class to check facts (or even to seem smarter than we actually are). It’s permeated our culture in such a way that it has become a verb, like “to Google.” It’s accessible, common, and we use it. A lot.
However, the democratic nature of the site is also its biggest fault: the people who post and edit may not be unbiased experts on a particular subject, and that leads to incomplete information. A main highlight of the site is the ability to edit articles, but when it is unclear who is behind the text, it is uncertain how much we can rely on what the text says. It’s true that there are checks in place for some of these things, such as new software that can more accurately find and correct false information. There are internal and well-known checks as well, such as the familiar call for re-editing or the note about bias, marked by the broom icon at the top of a contentious article. But while Wikipedia does highlight bias and invite re-editing, it often fails to catch mistakes, especially when a reference is cited, and it is unclear that the re-editing will actually be more accurate or solve the initial problem. Even worse, we usually don’t react when we see an article flagged this way. We realize that the bias or inaccuracy may be there, but we read it anyway as an initial source of information.
To us, these problems should not matter. We are definitely smart enough to recognize that a popularly-edited site probably contains mistakes, right? We have always been taught that “Wikipedia is NOT a valid source, but a good starting point for background knowledge,” expressly BECAUSE of this collaborative nature. Unsurprisingly, that caution fails to hold in many circumstances.
Take, for example, a recent article in the UK’s The Register that discusses just how lazy we are becoming. Apparently, even our journalists, that last bastion of accuracy and doggedness in pursuit of the truth, are relying on Wikipedia as a primary source. Journalists at The Guardian and The Mirror apparently used Wikipedia to write the obituaries of Norman Wisdom, a comedian, singer, and actor (yes, I DID just Wikipedia him to figure out exactly who he was). There were several inaccuracies in the entry, and The Guardian still had not corrected the mistake at the time of the article. More telling still, it was widely known that reliance on Wikipedia caused the errors, yet neither publication has acknowledged that this was, in fact, the case.
Poor Norman Wisdom is not the only person to be misrepresented by Wikipedia’s inaccuracies. Some inaccuracies are a little more devastating to one’s reputation than being mistaken for the author of a song or said to have been nominated for an Oscar. In an interview with NPR, the founding editorial director of USA Today had to insist that he was not, in fact, culpable in the assassinations of JFK or RFK.
In another story, golfer Fuzzy Zoeller sued to find the author of his Wikipedia page, who had defamed him in a number of ways. This brings up a host of legal issues. Anonymity is the selling point, but when the content is inaccurate, how anonymous should contributions to Wikipedia be? What are the future legal implications of this suit? Does the fact that Zoeller sued at all, clearly caring about how this PARTICULAR channel characterized him, show our continued dependence on it? Should Wikipedia be treated like any other news source? How far do our First Amendment rights extend to a place like Wikipedia and the internet?
While this is a humorous example, it does highlight the issue of collaboration: anyone can write anything (at least for a time). The process of tracking these mistakes is slow, and the inaccuracies often go unnoticed, especially if tied to ANY reference (it’s unclear whether the references have to be “reliable” or “expert” sources, although Wikipedia likes to claim it won’t allow unpublished references to contribute to entries). The worst part is that those whom we expect to seek the truth and keep us informed when we can’t do so ourselves are treating it as ironclad truth.
Wikipedia is a great tool, but are we so addicted to, and blinded by, its communitarian nature and ease of access that we fail to see when something is wholly inaccurate?
3 thoughts on “Are We Wikiaddicts? – by “Kristin B””
I’m hesitant to agree, even slightly, with the theory that a majority of the million-plus Wikipedia articles are ‘wholly’ inaccurate. Are there mistakes? Yes. But then again, aren’t there mistakes in professionally printed encyclopedias? Yes. Given that, the question becomes: are we being led by whatever written text is in front of us, and do we give Wikipedia, or any source, more power and validity than it deserves? Just because it came from the internet and from peers isn’t a reason to write it off, and likewise, just because it was printed or professionally compiled isn’t a reason to trust it.
I am in no way implying that the majority are inaccurate. In fact, I think we defer to trusting Wikipedia not only because we use it so often but also because we think that the communitarian and collaborative aspect of it makes it more likely to have accurate entries.
In fact, the fewer things that are inaccurate, the harder it is for us to detect them. If we habitually trust something because it is usually correct, we stop checking it ourselves, and errors slip through.
The point was that this can often be damaging if left unchecked, and perhaps we trust it too much. Perhaps a better choice of words would be that some things are “partially inaccurate” or have inaccuracies that we fail to detect.
I think we need to treat ALL sources with the sort of critical eye that we’re told to use on Wikipedia. How do you know a given fact is true? These are the classic epistemic requirements for knowledge:
• The fact must be true
• The fact must be believed by the knower
• The knower must be justified in their belief of the fact
The problem is that the third requirement is really, really hard, and people take too many shortcuts that let in inaccuracies. Some shortcuts are needed: a true skeptic will choke on Cartesian doubt of reality … but use too many and one might be misled.
Wikipedia’s pretty accurate, but like any reference it contains some errors, whether of omission, commission, or style. We need to be wary of potential error, but disregarding the site completely would be a waste of our time.