If there’s one term tech writers can’t seem to get enough of, it’s Web 2.0. It has spawned design conferences, a host of new applications, and even Time Magazine’s Person of the Year. The idea behind the buzzword is that where the average user once went online merely to consume content produced by others, this new incarnation of the internet would be democratized. Instead of content being dictated by the few who knew how to code and could afford hosting, any user who wanted to could now produce websites, blogs and videos and share them with the world. This supposedly marked a great shift in how people communicate and would help realize the web’s true potential.
At the same time, there has been a significant change not only in how online content is produced, but in how it is accessed. Where internet users were once chained to personal computers (themselves bound by the limitations of wired access and WiFi), they can now go online via mobile devices, whether smartphones like the iPhone or the Droid, or tablets like the iPad (because, let’s be honest, no one is actually going to buy the Samsung Galaxy).
If all of this is true, then a question needs asking: why is Apple, one of the largest and most influential makers of mobile devices, standing in the way of electronic populism?
Apple has long been known for its draconian policies on any number of subjects, and it has made abundantly clear that content on its mobile devices will be no exception. The App Store has produced nearly $200 million in profits for Apple (though it accounts for only about 1% of gross profits), but more importantly, the broad range of applications available there has fueled sales of the iPhone, driving up its market share.
This diversity of applications has emerged in spite of Apple’s edict that it will remove any app whose metadata mentions the name of another computer platform, misspells the name of an Apple product, or simply uses icons too similar to Apple’s own (and, lest any of this be easy for developers, an interface that is too complicated is also grounds for removal). In short: if you want your application on the iPhone, you had better follow Apple’s rules, no matter how ridiculous (and don’t think you can get away with complaining publicly about how absurd some of those regulations are, either).
So, what does all of this have to do with Web 2.0? Simply put, the fundamental idea behind Web 2.0 is that users dictate the content they find online with relatively little interference from the electronic “elite.” Apple’s actions, however, push the newest online frontier in the opposite direction. Instead of an electronic forum where ideas and innovation flourish free from censorship, Steve Jobs would impose a world where the unwashed masses are kept away from “undesirable” content in the name of Apple “trying to do the right thing for its users,” rather than those users deciding what is right for themselves.
While more open platforms like Google’s Android have been gaining steam in recent years, users of iPhones and iPads are left with little recourse against the arbitrary governance of their closed platforms, except, perhaps, someone playing them a sad, sad song on the world’s tiniest open-source violin.