Are We Wikiaddicts? – by “Kristin B”

One of the first things that comes to mind when we think of Wikipedia is the collaborative, democratic effort of the project. Basically, because we like democracy, we like Wikipedia. We like the anonymity, the ability to access millions of articles in one convenient place, quickly, about anything we ever wanted to know. We browse the website for leisure, and we even consult it during class to verify facts (or to seem smarter than we actually are). It’s permeated our culture in such a way that it has become a verb, like “to Google.” It’s accessible, common, and we use it. A lot.

However, the democratic nature of the site is actually its biggest fault: the people who post and edit may not be technical, unbiased experts on particular subjects, and that leads to a lack of full information. A main highlight of the site is the ability to edit articles, but, when it is unclear who is behind the text, it is uncertain how much we can rely on what the text says. It’s true that there are checks in place for some of these things, such as new software that can more accurately find and correct fallacious information. There are internal and well-known checks as well, such as the familiar call for re-editing or the note about bias, marked with the familiar broom icon, at the top of a contentious article. But while Wikipedia does highlight bias and invite re-editing, it often does not catch mistakes, especially when a reference is cited. Furthermore, it is unclear that the re-editing will actually be more accurate and solve the initial problem. Even worse, we usually don’t react when we see an article designated as such. We realize that the bias or inaccuracy may be there, but we read it anyway as an initial source of information.

To us, these problems should not matter. We are definitely smart enough to recognize that a popularly-edited site probably contains mistakes, right? We have always been taught that “Wikipedia is NOT a valid source, but a good starting point for background knowledge,” expressly BECAUSE of this collaborative nature. Unsurprisingly, that lesson fails to stick in many circumstances.

Take, for example, a recent article in the UK’s The Register that discusses just how lazy we are becoming. Apparently, even our journalists, the last bastion of accuracy and doggedness in finding out the truth, are relying on Wikipedia as a primary source. Journalists at The Guardian and The Mirror apparently used Wikipedia to write the obituaries of Norman Wisdom, who was a comedian, singer, and actor (yes, I DID just Wikipedia him to figure out exactly who he was). There were several inaccuracies in the entry, and The Guardian still had not corrected the resulting mistakes at the time of the article. Additionally telling is the fact that it was widely known that reliance upon Wikipedia caused the errors, yet neither publication has acknowledged that this was, in fact, the case.

Poor Norman Wisdom is not the only person to be misrepresented by the inaccuracies of Wikipedia. Some inaccuracies are a little more devastating to one’s reputation than being mistaken for the author of a song or said to have been nominated for an Oscar. In an interview with NPR, the founding editorial director of USA Today had to insist that he was not, in fact, culpable in the assassinations of JFK or RFK.

In another story, golfer Fuzzy Zoeller sued to find the author of his Wikipedia page, who had defamed him in a number of ways. This brings up a host of legal issues. Anonymity is the site’s selling point, but once the content is inaccurate, how anonymous should things on Wikipedia be? What are the future legal implications of this suit? Does the fact that Zoeller sued at all, clearly caring about a characterization of himself via this PARTICULAR channel, show our continued dependence on it? Should Wikipedia be treated like any other news source? How much of our First Amendment rights extend to a place like Wikipedia and the internet?

While this is a humorous example, it does highlight the issue of collaboration: anyone can write anything (at least for a time). The process of tracking these mistakes is slow, and the inaccuracies often go unnoticed, especially if tied to ANY reference (it’s unclear whether the references have to be “reliable” or “expert” sources, although Wikipedia likes to claim it won’t allow unpublished references to contribute to entries). The worst part is that those whom we expect to seek the truth and keep us informed when we can’t do so ourselves are treating it as ironclad truth.

Wikipedia is a great tool, but are we addicted and blinded in such a way by the communitarian nature and the ease of access that we fail to see when something is wholly inaccurate?

Two love letters to Wikipedia – by “Joel S”

May 2010:

Dear Wikipedia,

As a senior in high school, about to close the book on this remarkable journey, I feel the pressing need to profess my deepest adoration and gratitude for you. Frankly, you have been an indispensable asset, nay, a lifesaver throughout these past four years. I would be remiss if in thinking about the end and all who helped me get here I didn’t acknowledge you.

Seeing as I have no background in technological law, I care not about the legal questions that your services raise. Instead, I concern myself only with what you provide for me – a seemingly endless bounty of information, free of charge and full of knowledge. You are an ostensibly omniscient being, providing information on almost any topic, be it acalculia, a calculator, or calculus. From the extensive to the esoteric, no topic seems too big or too small for your cavernous amphora of genius. Time and time again, topics, theories, and historical figures have bemused me. After I’ve flipped through the book, scoured the Internet, and nearly given up, you are so often the one who helps me find what I’ve been searching for.

You’re also a time saver. Take that assignment in US history, for example; remember all the way back to junior year? We had to identify less well-known Civil War generals whose names were scattered throughout hundreds of pages of text. Rather than sift through the book, I consulted your services and found all of the information that I needed for every last one of the generals – the brigade commanded, the side for which they fought, the battles in which they participated. I even threw in some extra information that the teacher regarded as going “above and beyond” what was asked for in the assignment. Yes, Wikipedia, you are a bastion of efficiency.

Additionally, you satisfy my intense thirst for knowledge. If ever I find myself in a situation in which I desire to know more about a specific subject, you are the source to which I turn. When I wanted to learn what the WHIP statistic measures in baseball, I read your page. When my comparative government teacher discussed Weber’s modernization theory ad nauseam, your page helped make sense of what exactly she was talking about. And, just a few days ago, when I found out what residential college I had been placed into at Yale, your page convinced me that Trumbull is indeed the best college at the school. Thank you, Wikipedia, for providing me, and countless others, with a free and rapidly evolving database of both useful and inane trivia.

Teachers may question the veracity of everything that you say. They are incredulous that a website monitored and maintained by the public can consistently result in fair, unbiased, and useful information. I understand their concerns, but throughout our four-year relationship, you have yet to let me down. I just want to say, in closing, that I love you Wikipedia, and I’m so happy that we get to go to college together.

Sincerely,
Me

October 5, 2010

Dear Wikipedia:

It’s been a little while since we last spoke. I did not mean to neglect you; it’s just that, well, college work is different from high school assignments. Also, I’m taking this intro to law and technology class that is reshaping the way I view the Internet. It’s not that my feelings about you are any different; I still love you. It’s just that, well, the reasons for my loving you have changed.

No longer do I consider you the ‘be-all-end-all’ source of knowledge. The constant refrain of my high school teachers was in fact true: you are a good starting point. For high school assignments, where a rudimentary understanding of basic concepts was normally sufficient, you were all that was needed. That’s all changed now, though, as classes go into greater depth on more specific topics. It turns out you’re not as powerful as I used to think you were.

In a technological sense, though, you are a paradigm-shifting database. A true embodiment of the spirit of the free software movement, your survival relies on the work of countless volunteers. As a manifestation of peer production, your work is a true testament to the power of collaboration, and signals a substantial cadre of people hoping to use the Internet as a communal tool, rather than simply a source from which to access desired information. Upon reflection and further research, I learned that you were created to act as a discussion board for scholars and students alike, working together to create fair and balanced articles on all subjects that merited recognition. Your administrators, editors, and, to a lesser extent, viewers have adopted an esprit de corps founded on curiosity and an investment of trust in Internet users. If nothing else, you serve as an interesting social experiment in the benefits and detriments of increasing the role of the average Internet user in shaping widely read material. In terms of pure technology, your function is basic but noble. You recognize the human desire for immediate updates, and the near instantaneous dissatisfaction that comes with obsolete facts. In essence, you are a program that enables technologically inept users (such as myself) to make a difference in a domain in which they know very little.

Legally speaking, you are also an interesting case study. It makes sense that one of the few instances in which you censor material is when users post material that infringes copyright. You create a culture in which the public decides the reputation of individuals by eschewing any tampering with one’s own page. Still, many reputation-wary individuals (politicians come instantly to mind) ignore this cultural law and tamper with pages to enhance their accomplishments and downplay their pitfalls. Many side businesses have sprung up, contrary to the not-for-profit nature of your endeavors, that consistently attempt to buffer the effects of bad press by editing their clients’ pages. This creates an interesting quandary for you, and I wish I could tell you of some panacea to make it go away. I will say, though, that through it all, I admire your commitment to assuming that all who use your services do so in the best interest of the general public.

Additionally, as Zittrain points out, your editors hold true to a certain ethos when working on your site (http://futureoftheinternet.org/static/ZittrainTheFutureoftheInternet.pdf, p. 142). Your dedication to neutrality means that readers most often find articles devoid of any noticeable bias. While perfect neutrality is a near impossibility, the technical style in which your articles are written comes close to representing a fair account of the subject. Also, your stance on verifiability assures readers that, while they should still be cautious, the majority of the information found in your most frequently read articles is cited. And lastly, as an organization that conducts no original research, you uphold the purpose of an encyclopedia, aggregating the work of many into one convenient, central location.

In closing, I want to thank you again for all of your help, both as a source of information and as a beacon of hope for the future of the Internet. Hopefully one day soon, people will put truth above stature, and care more about the accuracy and fairness of the information on your site than about how best to enhance their own image on your pages.

Until next time,
Me

Conservative collaboration and the Wikipedia model – by “Zachary M”

Below is an interview from The Colbert Report of Andy Schlafly, the founder of Conservapedia, a conservative version of Wikipedia, and more recently the wiki-based Bible translation, the Conservative Bible Project. (I’m not sure the embedding is working; you can view it here.)


First, let’s back up a second and understand what Conservapedia is.  It describes itself as a “conservative, family-friendly Wiki encyclopedia,” “conservative” being defined as someone who “adheres to principles of limited government, personal responsibility and moral values, agreeing with George Washington’s Farewell Address that ‘religion and morality are indispensable supports’ to political prosperity.”  Andy Schlafly, son of Phyllis Schlafly (best known for her opposition to the Equal Rights Amendment and feminism in general), founded it as a response to the perceived “liberal bias” of Wikipedia.

The articles have such blatant bias that they almost seem comedic most of the time.  For example, the article Barack Hussein Obama (note the inclusion of the middle name) contains an entire section on evidence that Obama is a Muslim, and the central policies are called the Conservapedia Commandments.  When I show Conservapedia to friends unfamiliar with it, they usually think it’s a joke like Encyclopedia Dramatica or Uncyclopedia.

The general encyclopedic part notwithstanding, Schlafly’s Conservative Bible Project (CBP) (hosted through Conservapedia) sounds just plain bizarre (as Colbert puts it: “We already have that; it’s called The Bible.”)  It claims to be correcting for the following “errors in conveying Biblical meaning”:

  • lack of precision in the original language, such as terms underdeveloped to convey new concepts introduced by Christ;
  • lack of precision in modern language;
  • translation bias, mainly of the liberal kind, in converting the original language to the modern one.

The first claim seems to question the original scriptures, which would violate the purported belief that the Bible is the inspired, inerrant word of God; it suggests that divine revelation is unsatisfactory.  The second is linguistically inaccurate: the first thing you learn in any linguistics course is that all languages and dialects are equally valid; they just use different strategies to express the same things.  The third is what the rest of the article tries to establish, citing a handful of examples of varying validity.  Schlafly’s general argument is that all of our views should be informed by our religion, largely meaning the Bible, and that this is the source of his conservatism.  To then alter the supposed source of conservatism to make it more conservative makes the belief system circular. (Full disclosure here: I’m a committed Christian myself and consider the CBP to be disturbing.)

Andy Schlafly is a Princeton alum.
The Conservative Bible Project Page

Despite referring to itself as a “translation,” the project page doesn’t once suggest that contributors refer to the original Greek or Hebrew, though there is one link to a Greek text at the bottom of the page.  Its professed desire to fix translation inaccuracies is belied by a particularly ironic passage, Revelation 22:18-19:

I warn every man who hears the words of the prophecy of this book: If any man adds to these things, God will add to him the plagues that are written in this book. And if any man subtracts from the words of the book of this prophecy, God will subtract his portion out of the Book of Life, and out of the holy city, and from the things written in this book.

The word “Book” here should be “Tree.”  The error results from the fact that the Conservative Bible is based on the King James Version, which for the end of Revelation was translated from Greek to Latin back to Greek and then to English.  The Latin words for book and tree are similar, so that’s probably where the error came from.  This is actually theologically significant, since the “Tree of Life” recalls the Garden of Eden, while the “Book of Life” creates a new concept, something like “God has a list of people going to heaven in a book,” which I’m pretty sure I heard once or twice in Sunday School growing up.  However, the CBP editors clearly didn’t care about a more accurate translation: when an error could not be corrected to make the passage more conservative, it was ignored.  It also seems that they didn’t read this passage at all, considering it promises them some significant divine punishment.

OK, so the CBP is inherently contradictory as a concept.  But what can we learn about collaboration from it?  Andy Schlafly makes some interesting assertions in his interview with Colbert:

  1. Isaac Newton claimed that work translating the Bible was responsible for his other insights and those of his contemporaries.  Thus, opening this process up to the general public is a major public service.
  2. This Conservative Bible is produced by the “best of the public,” which is better than experts. (“There are no definitive experts.”)
  3. The objective truth “becomes clear with time” through the work of the community.

If this claim about Newton is true, the first point is perhaps actually a justification for the project.  However, I doubt Newton was translating the Bible with any agenda other than understanding its meaning, and I’m pretty sure he would have worked from the original texts.  The other two, however, are much more general points about collaboration.  In essence, Nos. 2 and 3 are similar to the concepts governing Wikipedia.  Schlafly’s wording just happens to reduce the concept almost ad absurdum.  The Conservapedia Constitution opens with the statement: “Editing on Conservapedia is open to the best of the public – and that includes you.”  It does not say “and that could be you”: everyone is the best of the public, which renders the term meaningless.

The Conservapedia article Best of the public goes on to list “examples” of the concept, featuring many amateurs who rose to prominence, including New Testament authors, Ronald Reagan, and one-hit wonders.  Though the selection is perhaps tailored to a conservative audience (except for examples like “Ice Ice Baby”), this is actually one of the most important sentiments in Internet culture.  As the “best of the public” article notes, any amateur can write a blog and dispense important information.  Wikipedia also depends on amateurs to synthesize information in an encyclopedic fashion, “encyclopedic” being defined by a myriad of policies, policies which were themselves written by these amateurs.

This leads back to the fact that “best of the public” is presented in an absurd way on Conservapedia, showing an underlying tautology in collaborative web communities.  What is reliable information?  That which the established members of the community achieve consensus on.  Who gets to be an established member of the community?  Someone who provides reliable information.  Colbert exposed this by having his fans edit him into the Conservative Bible: they created a clearly false consensus, and to overcome it, Conservapedia leaders had to violate the tenets of consensus.  An analogous situation would be issues of repeated vandalism on Wikipedia; articles prone to biased editing and vandalism, like “Christianity” and “George W. Bush,” tend to be semi-protected, meaning only established users can edit them.

If, however, a large group of people were to register Wikipedia accounts and assert on a discussion page that something patently false was in accordance with Wikipedia policy, the community would be hard-pressed to go against the consensus.  This generally doesn’t happen, since there are tons of Wikipedia users with a contrary opinion (who probably know Wikipedia policies well enough to cite them by abbreviation, like WP:FU and WP:NOR and WP:NOTPAPER; as you can see, I’ve been inside this process).  This is actually why Conservapedia formed in the first place: people with extreme conservative views found themselves quickly barred entry by an already-existing community.  We can only hope that the community is “right,” since such a gigantic status quo is hard to shift; the policies themselves are built around it.  Conservapedia, therefore, is no different from Wikipedia in that regard: an established status quo bars edits that violate the beliefs of the community.  It’s just that Wikipedia seems intuitively much more rational to most of us.

So now, all of the concepts behind Internet collaboration are tautological.  Where does that leave us?  Thankfully, there has been some review from outside of the system to help gauge whether it’s working.  A study in Nature found that Wikipedia is about as accurate as Encyclopedia Britannica.  That can give us some comfort that the system is doing its job and that any community-based inertia isn’t necessarily bad.  I don’t think any study has been conducted of Conservapedia or the Conservative Bible.

But I’m sure if a study did find Conservapedia to be less than accurate, Conservapedia would happily point out its liberal bias.

UPDATE: I just remembered that Andy Schlafly’s daughter Phyllis, who goes to Princeton, posted this on PrincetonFML: “My dad is the founder of Conservapedia. MLIG” An interesting discussion resulted, raising some of the points I raised here. (The OP is indeed his daughter; she posted about it on Facebook.)

Wikipedia: The Next Political Battleground? – by “Magic M”

Paul Krugman notes an interesting phenomenon in his Sunday NYT article here: every major contender (save Mitt Romney) for the 2012 Republican nomination who doesn’t currently hold a political office is a paid contributor to Fox News.  There’s undeniably a connection between the network (and its parent News Corporation’s other holdings, like the Wall Street Journal) and the Republican establishment, and there has been for years, but the much more worrying phenomenon is their ability as of late to craft a symbiotic relationship with the populism manifested by the Tea Party movement.  Politicians like Sarah Palin and Christine O’Donnell and personalities like Rush Limbaugh or Glenn Beck have developed an almost messianic aura, and their followers often display cultish devotion to ideas that have only the most tenuous grasp on sanity, chief among them the belief apparently held by a fifth of this country that “Obama is a Muslim!” (forget for a moment the implicit bigotry conveyed by the tone with which such statements are uttered).

These leaders aren’t just far-right nutcases; they’re far-right nutcases with rather serious and powerful backers.  In return for supporting policies that probably end up hurting the middle-class Americans they claim to work for but benefit the war-hawk and business-tycoon types (hopscotch from Afghanistan to Iraq to Pakistan to maybe-soon-Iran, anyone?), they get massive amounts of network coverage, making their election (and implementation of said policies) all the more likely.  Some commentators have suggested this populist movement is a temporary quirk, a function of the economic situation that will blow over quickly after the 2012 election.  I disagree.  I’m worried it has a little bit more staying power than that, and I’m worried it could be the end of democracy as we know it.

Okay, so maybe that’s a little bit melodramatic.  But a culture war is surely coming, and the next battleground may well be Wikipedia.

Constituents in this country rarely reward the candidates who take nuanced and charitable positions on topics.  The televised presidential debates rarely explore the intricacies of the topics they cover; candidates instead turn to pre-prepared statements and catch-phrases, all in a ceaseless kowtow to the 24-hour news cycle.  The public simply loves to deal in absolutes.  Either we should go into foreign countries and spread democracy by force in every case because, damn it, freedom and justice and apple pie demand it, or our last administration was full of sadistic torturers and Christian zealots plucked right out of the Inquisition or the Crusades.  By and large, it’s the academics who flesh out arguments for or against these policies more thoroughly in research papers – but it isn’t the academics who govern Wikipedia.  It’s the mob.  And there’s no reason to think they won’t turn to Wikipedia to serve their political interests.

Wikipedia has already seen attempts at manipulation by self-serving interests, of course.  Zittrain’s “The Future of the Internet and How to Stop It” discusses MyWikiBiz, a company devoted to polishing other companies’ public image on Wikipedia by editing articles.  Similarly, politicians have an obvious incentive to make themselves look better by tweaking articles before elections, and some have.  Thankfully, it’s fairly easy to stop that kind of thing: there are not many of these people, and it is easy to spot them.

But we’re not talking about these aberrations or random vandalism here – we’re talking about concerted nationwide efforts to change the entries surrounding political events and people, to subtly influence the perception of everyone who ever reads those articles.  If 20% of the country thinks that Obama is a Muslim and even 1% of those people are committed to influencing Wikipedia, that’s still over 600,000 people who might be willing to edit the Obama article once a day.  People can of course change it back, but I’m not talking about the kind of thing that is blatantly obvious – I’m not suggesting that these people will successfully permanently convert (get it?) the “religion” box on the page to Muslim.  But there are other ways to impact perceptions.  What if people worked together to get the structure of the page changed so that greater emphasis was devoted to the speech in Cairo, statements condemning Israel for a variety of policies, and bowing to the king of Saudi Arabia?  The way you present facts is just as important as the facts themselves for the conclusions people draw.  The neutral point of view policy can be invoked, certainly, but it won’t hold in all instances, only the most egregious changes; similarly, no original research means you merely have to turn to one writer or pundit or another on the Republican payroll to provide your backing.  They have scientists who proclaim that global warming doesn’t exist, remember.

I realize I’ve created something of an apparent contradiction by suggesting that political stances are polarized but that Wiki edits will be subtle.  My resolution is that the polarized political stances provide the necessary motivation to make tedious and subtle Wikipedia edits, edits that can nevertheless add up to a definite political advantage in a world where Wikipedia is increasingly viewed as an authoritative institution of information.  Readers beware.

How Wikipedia will save politics – by “Olga M”

Like many ambitious young Yalies before me, I spent my summer working for a political organization in Washington, DC. Jokes about the tech-cluelessness of politicians aside, I was amazed by the serious lack of reliable information on Capitol Hill. Congressmen got nearly all of their facts from lobbyists, either in direct conversations or in pseudo-objective policy papers written by those same interest groups. The voices of true experts and average citizens were completely drowned out in the cross-shouting of lobbyists and extremists.

That’s why I was so intrigued when my think tank started working on Progressive Map, a Wiki designed to provide Congressional staff with reliable (if left-leaning) information on issues, organizations, and people. The project follows the trend of political Wikis, like Conservapedia, Liberapedia, and RationalWiki, in creating collaborative information-gathering projects dealing with political issues. Progressive Map differs in hoping that average citizens will be able to bypass lobbyists and the money-buying-access problem to tell their Congressmen the full truth about the people and policies they are dealing with.

Sounds like a pipe dream, right?

At first glance, politics seems like the least likely field in which a Wiki format could work. For starters, the self-conscious norms of objectivity and consensus that make Wikipedia work are completely absent from politics. While Wikipedia relies on the basic notion of trusting your neighbor, politics encourages people to form adversarial groups bent on proving that their particular viewpoint is correct. Where the average Wikipedia writer comes to engage in a common enterprise with other users, the average participant in politics just wants his policy to win, and consensus created on the Discussion page is not the way to do that.

Furthermore, the basic rules of Wikipedia are particularly hard to apply to politics. The no-original-research policy is tricky in a field that deals primarily with people’s conjectures and expectations. For example, to state that many Americans think the war in Afghanistan is hopeless would require polling data specifying the demographic that holds that view, or else a link to a prominent commentator making that claim.

Meanwhile, the requirement of verifiability will inevitably run into disputes over credibility and representation. For example, is it appropriate to say that Republicans question the existence of global warming just because conservative Christian fundamentalists don’t believe in it? Political groups are by nature heterogeneous, and it is very hard (and potentially very anger-provoking) to generalize about their views.

Finally, the neutral point of view standard would require hard choices about what constitutes due and undue weight, as many very prominent political groups (e.g., LGBT groups) have small numbers of clear members and a much wider, undefined support network.

Most damning, however, is the problem of editing by vested interests. Wikipedia works in part because few people care whether zucchini is a fruit or a vegetable, but questions of politics have lives and livelihoods at stake. The incentive to try to cheat the system is thus incredibly great.

Already, instances of interference abound on Wikipedia. Marty Meehan raised a public outcry when he edited his own entry to delete a reference to a campaign promise, and staffers for many Congressmen admit to doing the same. The entry for President Obama had to be blocked from further editing after too many birthers edited the page to question Obama’s birth certificate, sparking edit wars. Meanwhile, some blogs claim that there is already a conservative slant to Wikipedia because right-wing advocates are more willing to devote time to promoting causes.

These problems are not exceptions. A political Wiki would have to deal with more than just bored teenagers; it would have to face people who bomb abortion clinics, donate millions of dollars, and spend countless hours demonstrating for the sake of getting their viewpoint out there.

And yet for all of the obvious challenges, there is a glimmer of possibility that enough individuals who care about accountability and bipartisanship will join in on the project and make it work. The requirement for some final result may create a new culture in which users agree to represent others’ views fairly if they get the same treatment. Groups wishing to push their views will simply add a sentence that a particular person believes and advocates for a particular policy, while leaving the debate over the correctness of the beliefs to other spheres. Given how many voters seem disgusted with partisanship, the Wiki should have plenty of users who have a desire to preserve neutrality.

So how could we make Progressive Map work?

Wikipedia’s current format is clearly too trusting and open to prevent sabotage. An alternative could come from a system like Slashdot’s, which selects moderators from among logged-in, long-term users. These moderators would enforce a system of karma, in which comments are ranked from “Most Fair” to “Disruptively Biased.” Additionally, users could rate other users as “Disruptive,” alerting the moderators that a user is abusing the openness of the system. After a user makes a maximum number of disruptive changes, he or she would be blocked from the website.
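The blocking rule described here is simple enough to sketch in code. The following is a minimal illustration only, not Slashdot’s actual system: the three-strike threshold and the class names are assumptions chosen for the example.

```python
# Minimal sketch of the proposed moderation rule: a user who accumulates
# a maximum number of "disruptive" marks loses editing privileges.
# The threshold of 3 is a hypothetical value, not from any real system.

MAX_DISRUPTIVE = 3

class Editor:
    def __init__(self, name):
        self.name = name
        self.disruptive_marks = 0
        self.blocked = False

    def flag_disruptive(self):
        """A moderator marks one of this user's edits as disruptive."""
        self.disruptive_marks += 1
        if self.disruptive_marks >= MAX_DISRUPTIVE:
            self.blocked = True

    def can_edit(self):
        return not self.blocked

# Usage: after three disruptive edits, the account is blocked.
u = Editor("troll42")
for _ in range(3):
    u.flag_disruptive()
print(u.can_edit())  # False
```

The point of the sketch is that the policy is mechanical; the hard part, as the post notes, is choosing trustworthy moderators in the first place.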

Progressive Map has the chance to work, but it requires rules and sufficient participation to succeed.

Free and open source is not always the answer – by “Emily Y”

A recent New York Times article spotlighted a key issue in the world of open source software: businesses using others’ open source code to develop their own products and then failing to comply with the terms of the open source licenses that cover it. Open source issues such as these can be difficult – if not impossible – to overcome in the business sphere.  How can we expect programmers to put years of hard work into a quality program, and then just give it away for free?  But as its benefits to technological growth become increasingly obvious, open source software is becoming more common and accessible.  Is it possible for the future of software to be completely free and open source?

In the past, I’d been skeptical of free software.  With respect to quality, the free programs I’d downloaded and tested were fine, but never matched the caliber of (pricey) proprietary software.  More important, however, was that the phrase “open source” sounded like something that might appeal to a computer programmer, but not to me, a generally-technologically-capable-but-coding-oblivious student.  Why would I care whether or not I could access software code?

I’m not sure that I will ever have the desire to look at the code behind my software.  What it comes down to is that some software on the market just doesn’t fit in a college student’s budget.  Moreover, most of this software is packed with features I’d never use, for techniques I’ll never understand.  When it comes to computers, I’m just a hobbyist.  For example, in working with newspaper design, I love toying around with the abundance of features that Adobe InDesign, Photoshop and Illustrator offer.  However, purchasing Adobe Creative Suite CS5 would set me back anywhere from $300 to $900 – and that’s the Student pricing.  Thinking about all of this led me to go the free route and test out a couple of “alternative” open source programs: GIMP and Inkscape, two major open source rivals to Adobe’s Creative Suite.

GIMP

GIMP is a free graphics manipulation program, with offerings similar to those of Adobe Photoshop.  In several ways, it lacks the power and usability of Photoshop.  Yet there are offshoots of GIMP that come awfully close to reproducing Photoshop: the creator of GimPhoto, for instance, took GIMP and modified it with features and a UI that rival Photoshop’s.  One major issue I had with GIMP was its inability to simply batch process a group of photos (automatically execute the same adjustments on several photos simultaneously).  According to the GIMP Wiki, in order to do this, the user must write a script, such as the one below:

;; Load an image, run unsharp mask on its active layer,
;; save the result back to the same file, and free the image.
(define (simple-unsharp-mask filename radius amount threshold)
  (let* ((image (car (gimp-file-load RUN-NONINTERACTIVE filename filename)))
         (drawable (car (gimp-image-get-active-layer image))))
    (plug-in-unsharp-mask RUN-NONINTERACTIVE
                          image drawable radius amount threshold)
    (gimp-file-save RUN-NONINTERACTIVE image drawable filename filename)
    (gimp-image-delete image)))

Which, to me, is closer to gibberish than a true “command.”
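And the script is only half the story: a function like the one above then has to be driven from GIMP’s command-line batch mode. The sketch below only prints the commands it would run so you can inspect them first; the filenames and parameter values are placeholders, and actually executing the printed lines assumes GIMP is installed with the script registered in its scripts directory.

```shell
# Print a GIMP batch-mode command for each JPEG in the current directory.
# -i runs GIMP without a UI; each -b passes one Script-Fu expression.
# This loop only echoes the commands; nothing is executed or modified.
for f in ./*.jpg; do
  echo "gimp -i -b '(simple-unsharp-mask \"$f\" 5.0 0.5 0)' -b '(gimp-quit 0)'"
done
```

In other words, even “batch processing” a folder of photos means writing Scheme and shell plumbing by hand, which is exactly the usability gap the paragraph above complains about.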

But still, the bottom line is that most of what GIMP doesn’t have, I wouldn’t use anyway.

Inkscape

Inkscape is to Adobe Illustrator what GIMP is to Photoshop.  On most levels, Inkscape and Illustrator are identical when it comes to features.  A very small number of Illustrator features are missing in Inkscape, but again, the people who make use of those features are just a tiny fraction of the software’s users.   And the reverse is also true: Inkscape includes a number of useful features that are unavailable in Illustrator.  In fact, I found the Inkscape UI to be slightly more intuitive than Illustrator’s.   Thus, in my opinion, user interface and personal preference are what should influence one’s decision in this case.  Brand names are irrelevant.

A question often posed on the topic of open source software is whether computer programmers would continue to produce quality software if there were no profit incentive.  But there is actually a great deal of profit to be made via free, open source software.  Where does the money come from?   Large companies such as Microsoft, Apple, and Google pay licensing fees to use open source software more freely.  For example, MySQL (an open source database) is free to use under its open license; however, anyone who distributes software built on it must share their improvements under the same terms.  For individual users, this is generally a non-issue, but businesses rely on keeping the rights to their work to generate profit.  Thus, they pay for commercial licenses instead, and those fees can add up to significant profits for the original creator.

So why (or why not) open source?   There’s incredible room for enhancements in software, and with the increased freedom and flexibility of open source, the possibilities are endless.  Yet at the end of the day, I can’t use OpenOffice (sorry, Maria!).  I like the power of Microsoft Office, and I’m much more comfortable using it — no matter how annoyed I get with its UI changes.  Therefore, I understand if you can’t bear to make the switch to GIMP.  While we shouldn’t let our lives be controlled by proprietary software, we also shouldn’t impose limits on ourselves solely to promote open source.

In the end, a healthy balance between the two is really what we need.  There still is – and, I believe, always will be – a market for proprietary software.  Yet the major advancements made by open source software in the past decade are proof that open source is changing the way we create and use computer software.  The age of assuming “expensive software is better software” has passed; we are realizing that free software is no less advanced than proprietary software, while once-seemingly-impossible barriers to open source are gradually being overcome.

“A lot of people talk about open-source versus commercial, but they’re not mutually exclusive. Don’t view it as an all-or-nothing prospect.” — Steve Gerdt, program manager for open-source strategy at IBM.

Open Handset Alliance far from open by GNU standards – by “Bill T”

Like the previous blogger, I too am in love with my Droid. He is a Droid X. His name is Yeste (after the most famous swordmaker in all of Florin), and he runs Verizon/Motorola’s official OTA release of Froyo (Android 2.2) with Motorola’s MotoBlur skin on top of it. Motorola is a proud member of the “Open Handset Alliance,” a group of 78 tech companies that seek to propagate Google’s open-source mobile operating system, Android. Some of its members are wireless carriers seeking wider access to smart phones; others are phone manufacturers looking to decrease their costs; others are developers excited about a popular mobile platform with a low bar for entry. All of them are in the business of technological advancement. All of them are in the business of making money. Many of them are competitors.

Google has set a tone of openness not entirely unlike that in GNU’s copyleft standards, but that tone ends at the conveyance of Android. As the leading producers of Android handsets, Motorola and HTC are the most capable of upholding the attitude of openness begun by Google. Motorola and HTC add the custom skins “MotoBlur” and “Sense UI” respectively  to Google’s stock form of Android, a practice Google adamantly defends, and one clearly aligned with GNU’s policy of allowing modification and redistribution (not that GNU’s rules apply to Android).

HTC has been moderately good about maintaining openness when conveying Android. Though they’ve added Sense, it’s possible to turn off most of its features and return to stock Android. Users seeking superuser access will still need to “root” their phones in order to load new firmware, but HTC hasn’t done much to prevent that. In fact, rooting HTC’s phones has become as easy as downloading an app.

Motorola, on the other hand, has shown a proclivity towards limiting this openness. In order to remove MotoBlur, one must root one’s Motorola phone.  While rooting the Droid X and Droid 2 is possible, it is very difficult in comparison to other Android phones due to Motorola’s inclusion of a “Locked Bootloader” which, though it doesn’t “brick” the phone, takes very strong measures to prevent rooting. This ardent anti-circumvention measure would unquestionably violate copyleft standards, if they applied, and as a result it lowers the bar for openness among members of the Open Handset Alliance.

So what accounts for the difference between Motorola and believers in copyleft? Yes, Motorola is in the business of making money, but profit is not something the GPL disdains; indeed, the GPL embraces it by clarifying that “free” refers to freedom (which MotoBlur lacks) rather than price (which Motorola is happy to include). Nor can it be that Motorola, a leading manufacturer of handsets, lacks interest in technological progress. Indeed, many consider Motorola’s Droid to be the first real “iPhone killer.”

Perhaps it’s the desire to beat the competition on either of these fronts that drives Motorola to lock things down. While GNU supports profiting from modifications of open software, it doesn’t appear to support competitive enterprising. While GNU supports technological progress, that is not its primary tenet. Motorola’s desire to lead the market, rather than primarily to contribute to the customization of the Android platform, is what pushes it so far from the copyleft standard. Motorola doesn’t seem to want us to truly own the software on our phones.

I’m very happy with my MotoBlur-running Droid X, and even when given the warranty-preserving options of downloading MotoBlur-replacing apps like Launcher Pro or Handcent SMS, I’ve stuck with Motorola’s stock apps. I may not be better off for that, but I’m happy with those functions as they are. I don’t really need free tethering or mobile hotspot capabilities. With the ability to tether via Bluetooth to my MacBook Pro which can use its Airport as a hotspot, I’m satisfied. I don’t plan to root any time soon. Having said that, every once in a while I come across a cool app that says “requires root,” and wish that that wasn’t necessary. None of the apps have been worth voiding my warranty or taking the chance that I’ll brick my phone by screwing up the complicated process of circumventing eFuse, nor have they even been worth remembering. But as members of an Open Handset Alliance, perhaps Motorola should still consider democratizing superuser access.

After all, is there any good to the consumer from such a locked-down device?

Apple’s struggle for closure – by “Ben S”

I’m in love with my Droid.  I ordered it the first day they were available, even putting up with my unusable Samsung Juke and its shattered screen for an extra week, just so I could get my hands on what I saw as the first tenable challenge to the iPhone–one of the main draws, for me, at least, was the fact that anyone could design apps and put them on the “Android Market” without putting them up for review before some ominous Comité de salut app.

One of the biggest shortcomings, however, was the lack of Adobe Flash, which, Android owners kept being promised, was “coming.” After months of waiting, the news in June that Flash 10.1 had finally been released and was Android-compatible was somewhat muted by the fact that it was actually compatible with Android 2.2, whereas all Droid users were still stuck with 2.1 for the foreseeable future.

[Image via 2.bp.blogspot.com – caption: “Finally accurate”]

So while at this point I could have manually installed a rooted “Froyo” update to my phone and used Flash to my heart’s content, a call to Verizon confirmed what I had suspected: any manual installation of the new OS from a source other than Motorola or Verizon would void my warranty (a real consideration, given that buying a phone with a plan gets you the phone at a fifth of the non-plan cost).

And so I waited, patiently, until late August to get my update, download Flash, and then find out Hulu was blocking all mobile phones from viewing videos anyway.

So, although iPhone users all over the world are likely still more than a little heady about the Librarian of Congress’ clarification to the DMCA allowing for, among other things, “jailbreaking” the device, it is certainly worth noting that, while Apple can no longer use the threat of legal action to keep all its devotees in line, it still has a plethora of tools at its disposal to discourage users from straying from the Way of Apple, including, yes, voiding your warranty:

Apple’s goal has always been to insure that our customers have a great experience with their iPhone and we know that jailbreaking can severely degrade the experience. As we’ve said before, the vast majority of customers do not jailbreak their iPhones as this can violate the warranty and can cause the iPhone to become unstable and not work reliably. [Emphasis added]

[Image via cultofmac.com – caption: “It’s not that they’re controlling, it’s just that they know what’s best for you!”]

To be fair, they have a point–there have been instances of jailbroken phones being exposed to vulnerabilities, especially those that use SSH and don’t change the password from the default.  Naturally, the more open a technology is, the more risk there is for malicious attacks–and when the openness is not officially sanctioned, Apple has little reason to fortify the rogue phones against attacks.

Indeed, though there is absolutely no indication that they plan to do this, Apple could even, if they so chose, develop viruses themselves that specifically target jailbroken phones – or, more ambiguously from a legal standpoint, introduce some internal fuse designed to detect modification and, if any such modification occurs, melt the phone.  Not, of course, that Apple would ever deliberately introduce defects into their products.

But back to the probable: Apple has absolutely no incentive to provide any sort of support for those who use the phone in ways that Apple has said it should not be used–and while communities of jailbroken iPhone users will certainly continue to grow and evolve, coming up with patches and fixes themselves, what, ultimately, is the point in taking technology from one of the most closed consumer technology companies in existence and trying to make it open?  Why not just get technology that is open in the first place?

So, in short, if you want an open, generative phone, then buy an open, generative phone (one that you can also hold any way you like).  Don’t be a putz.



Leasing Ourselves Away – by “Sabina M”

Using and demanding more DRM-free services like recently launched UrFilez or Ovi Music will make you not only 300% cooler, but a responsible citizen.

Imagine this: sometime during the night, half your books have been pillaged. Not by vikings, but by Barnes & Noble.

Imagine this: law enforcement shows up at your doorstep. You have attempted to glue Lego to your science fair project (or hair) – but Lego has very strict ideas about how, where, and for how long Lego can be used.

Imagine this: you have taken apart your CD player to figure out how it works. You do – and you even figure out a way to make it sound better, and maybe be used for time travel. Naturally, you show all your friends how to repeat this miraculous feat. Shouldn’t have done that: get ready to drop the soap.

Maybe the examples are a bit hyperbolic. Or maybe they are all too realistic, if used as an analogy for how the products we purchase digitally are protected by both copyright law and DRM (digital rights management) technologies. Last year, DRM “protection” was the backdoor that made possible the Amazon deletion of eBooks from customers’ Kindles (because the “digital age” is an ironic one, it had to be 1984 – so funny that it’s not). DRM is being used to prevent you from playing movies, music, and games or using software on just any machine or number of machines: on just any operating system (read: anything beyond Apple or Microsoft); in any geographical locality or for any amount of time.

When it comes to digital goods, we have implicitly come to accept the idea that we cannot do just whatever we want with the products we have purchased – and, perhaps even more worrying, that we can never truly own digital media. We have accepted, perhaps without being aware of it, the fact that we are only renting it, and so have to submit to whatever conditions the provider specifies, including the possibility of having our product deleted or made less functional at a whim.

Don’t even think about tinkering with your new copy of, say, Microsoft Office Word – and if you do, do not share your discoveries with anybody. Although tinkerers – or people unwilling to be held hostage to a specific service provider – recently won a small battle this past summer, when such alteration was granted legal permissibility (if not permanent protection) when it comes to phones, the fundamental issue remains. Legal bright spots aside, DRM technology explicitly aims at making the cracking of the proverbial CD player near impossible, even at the cost of practicality. Imagine the CD player, telephone, or Lego blocks of your childhood being 20 pounds heavier just so you couldn’t use them in weird ways: imagine your CD player working less well, or ceasing to work altogether, if it suspected you were using it in non-approved ways (and then imagine it did this anyway: see the Spore case), all as a trade-off in the name of being more difficult to tamper with.

We (us nerds, anyway) instinctively find something unnerving about the idea of someone stealing or blacking out large parts of our books, of the CD player company preventing us from tinkering with our bought property and using the police as its proxy – basically, of someone watching over our shoulder when it comes to what we do with the things that are our own. In contrast, the response seems much less visceral, and much more confused, when we talk of DRM. And there are some perhaps justifiable reasons for this ambivalence – but mainly a terribly bad one, namely the idea that digital products are, and should be treated as, fundamentally different. For whom? For the companies.

After all, is it not the right of developers to keep their code a secret? In this lies part of the crux. With software, the ideas and design are the product. Furthermore, many of these “new” types of products – that is, digital media – are increasingly being couched within a larger framework of a continuous and larger service (see iTunes, Amazon’s Video on Demand, Blizzard’s online RPG).

One way to think of this dilemma and why it came to be so dilemm-ish is this: you could take apart a CD player, sure, knock yourself out – but you could not, in practice, by yourself, replicate the finished product and so displace the monopoly the production company had on designing, manufacturing, and delivering that product to you. If that had been possible, tampering with the interior workings of tech products would have become an issue far earlier. With digital products, it is all too easy to threaten the profitability of a product by making the company obsolete as a supplier (or sole developer, as is sometimes the case when protected software is cracked in order to be enhanced and, inevitably, spread).

There is undoubtedly truthiness to these facts. Yet surely we can all agree that there are concessions that are unacceptable, even in the name of protecting the economic viability of software companies, when those concessions concern the basic rights of being human – of being curious and inventive – and the basic rights we associate with democracy, namely free speech and perhaps private property. These questions must be asked regardless of how unpleasant the answer might be to commercial interests. (And with DRM, it might not be unpleasant at all: it is unclear just how beneficial DRM protection has been for companies, all things considered.)

As someone with an unhealthy relationship with the Internet and nerd culture, I am squarely on the side of copyleft, open source, creative commons, and so on. But I do not want to ignore the fact that companies are inhabiting a very peculiar space when it comes to purely digital products. If before it was no biggie to lend my SNES game to my neighbor, today it is – because my neighbor next door has suddenly become the entire internet-browsing public. Free speech in the sense of spreading an idea, lending a creative work, instructing others in how things work even when companies would rather we not, discovering and tampering with code (which I would argue rightly deserves to be defined as speech) – yes, free speech and tinkering have become complicated for everyone involved. But this does not mean we must compromise them to the tune of private corporations too worried about their short-term profit to realize the long-term consequences of the laws and practices they have begun to implement and entrench in society as a de facto necessity.

My heart weeps for these producers, or at least sniffles because my head tells it to, but it recoils at the idea of DRM and its supporting social, legal, and economic structures evolving further in the direction they have. What do I mean by social and economic structures? I mean the slow transition we are witnessing from physical to digital products; our thinking of digital products as not-quite-goods; the lack of uproar over how the key cultural and other products of our age are coming to be accepted as simply services – things we use on a lease and with a leash.

The heavy media giants – companies like Sony, Apple, Microsoft, Amazon – have begun to construe many of their products as services, period. Services are subject to change. Services can be terminated. And so it comes to be that only the bookstore of a fascist state can enter your house and steal your books in the middle of the night, while Amazon can do so in broad daylight. As Amazon customers that had their Kindles “bricked” (in an ongoing debacle separate from the 1984 deletions) can testify, simply purchasing an eBook is no guarantee for keeping it.

I own an impressive amount of useless TV shows through Amazon on Demand. Yet, if I want to watch them, I have to make sure not to leave American soil: licensing issues. Hopefully Europe will sort those out some ten years down the road, but the point is, Amazon has without warning taken away my right to use these purchases of mine, because it deems that I do not have the – apparently far more important – right to watch them digitally beyond the U.S. So what then? Do I purchase every episode all over again, but on DVD? Why are digital copies of an episode more acceptable to control, in content and presentation, than physical ones? Mere practicality is not the reason (and one might well wonder when DVDs will begin to have automatic IP blocking and the like, too).

The issue seems to lie more in the aesthetic feeling of digital goods somehow being fundamentally different in every single way: it is almost as if a digital product is not real. “That’s ridiculous, nobody can take back my purchased DVD” versus “Hey, Amazon is blocking me from re-watching the Battlestar Galactica season finale, I guess that’s just how it works”.

It only works this way because we let it – and my point is, we shouldn’t. We can’t allow the law to codify our digital goods, especially expressive ones, as second-class expressions or property. Yes, we can buy a hard copy of a DVD. But twenty – ten? five? – years from now, will hard copies still be there? Probably not. But we would still be stuck with legislation that presupposes a digital book does not deserve the same protections against theft and censorship the “real” equivalent does. This is what DRM is: anti-license to do whatever you want with your goods, but also anti-protection against what the private company you got them from can do to them in turn – remove, censor, alter. And the problem stretches beyond purely digital or software goods: Sony has remotely removed features (like prior support for Linux) from already-purchased PS3 consoles. With the internet, the boundaries between hardware and software are thinning.

Take a look at Sony’s license agreement:

Some content may be provided automatically without notice when you sign in. Such content may include automatic updates or upgrades which may change your current operating system, cause a loss of data or content or cause a loss of functionalities or utilities

[…]

You may not sell, rent, sublicense, modify, adapt, translate, reverse engineer, decompile, or disassemble any portion of the Property. Except as stated in this Agreement or otherwise expressly permitted by SCEA in writing, you may not reproduce or transfer any portion of the Property. You may not create any derivative works, attempt to create the source code from the object code, or download or use any Property for any purpose other than as expressly permitted. You may not bypass, disable, or circumvent any encryption, security, digital rights management or authentication mechanism in connection with Sony Online Services or any of the content or service offered through Sony Online Services.

http://us.playstation.com/support/termsofuse/

Do things look this cyberpunkly bleak only to the people that really, really care about full control of their software and media? While not all of us may feel this to be a great threat to democracy, it very well might become one if we do not begin debating this issue on a level more profound than “WTF $ONY DONT L3T ME HAVE LINUX…. >>” or “APPEL WHY DOES MY CELINE DION MP3 NOT WOERK SOMETIME”. Our society has changed. We cannot just shrug it away and assume the free market will take care of everything, that we will end up with well-functioning, reasonable DRM and copyright policies. Companies are, by definition, for-profit entities that have no incentive to think ahead and take principles of democracy into consideration. They want to make money, and keep making money. Why should we allow private corporations to dictate the terms of our future relationship to the culture and technology we come into contact with?

Don’t click away your rights. You are human, or possibly a transhumanist. The only thing that makes man better than a monkey, even when the monkey is cuter, as is the case with many nerds, is our curiosity: our ability to learn, then take the knowledge we just gained and build upon it. So the next time you’re skipping through a license agreement, take the time to read it: because we need to have the sanctity of the bookshelf, we need the freedom to tinker, and we should never compromise away our right to share knowledge, however threatening to commercial interests.

When will it end? – by “Jeffrey Z”

I remember when I first discovered BitTorrent. It was just too good of a deal to pass up. All I had to do was go on Mininova, find whatever video game I wanted, and click on the tracker link. That’s it. No hassle, no waiting (except for the often horrendously slow download rates when people don’t seed!), and most importantly, no money for titles that would retail for over $50. Unzip the file, mount the disc image using Daemon Tools, and within 10 minutes of the file finishing downloading, I was playing Warhammer 40,000: Dawn of War. Just like that.  But even with increasing public focus on media piracy, piracy still remains, to some degree, unfettered – especially in the realm of video games.

Video game developers, unlike the music industry or the film industry, lack a protective headline institution like the RIAA or MPAA. They don’t often take the time or resources to file lawsuits against simple copyright infringement, aggressively pursuing action only when the infringement could disastrously hurt their income.  So many video game developers, rather than litigating, have turned towards more sophisticated ways of preventing piracy.

Many developers, corporate and indie alike, have turned towards online integration as a way to ensure everyone playing has a unique copy of the game. Blizzard announced, not without much anger and resentment from gamers, that Starcraft 2 would not have LAN (Local Area Network) support, forcing all players to play online so that Blizzard could verify unique CD keys. Indie developer Notch, responding to the piracy of his popular indie game, Minecraft, says that “instead of just relying on guilt tripping pirates into buying, or wasting time and money trying to stop them, I can offer online-only services that actually add to the game experience.”

But with each generation of increasingly complex DRM, there has been a just-as-fervent response on the pirate side. Almost immediately upon release, hackers set to work on methods of cracking new DRM, a process not too different from jailbreaking the newest iPod firmware.  It’s almost a call-and-response, with each new generation of DRM treated as a challenge.

I guess the real question soon becomes apparent. How far can this go? How long can developers keep on developing technologies to dissuade piracy? When will it end?

At some point, a balance needs to be struck.  Video game developers cannot be expected to produce quality products while constantly shoveling money towards developing stronger piracy protections.  Will video game developers begin turning towards methods like those the RIAA uses against copyright infringement?  With the growing acceptance of video games as a serious market influence, it’s become a definite possibility.