The greatest experiment in the history of our species is being conducted beneath the border between Switzerland and France. In the 17-mile ring of the Large Hadron Collider at CERN, human beings are recreating the conditions from the start of our universe. Protons are smashed together at velocities approaching the speed of light. It’s a truly awe-inspiring endeavour.
What’s equally awe-inspiring is the level of cooperation required to accomplish this Apollonian feat. The legal framework for the (literally) groundbreaking work of the LHC was established in the CERN Convention of 1954. Twelve nations initially signed up, later expanding to twenty. Together they would run CERN as a stateless entity devoted entirely to pure science. The only return on investment that was expected was in the currency of knowledge.
This groundwork allowed CERN to become a very special place. Many of the usual hindrances to cooperation have been removed: national boundaries, economics, social hierarchies. Instead, things get done like it’s one giant hack day. Propose an experiment, find out who else is interested in helping you out, and away you go. Nobel prize-winning physicists and students on summer internships work together.
It was in this atmosphere of collaboration that Tim Berners-Lee created the world wide web, aided and abetted by his colleague Robert Cailliau. Today we think of the world wide web as one of the greatest inventions in the history of communications, but to the scientists at CERN, it is merely a byproduct. When you’re dealing in cosmological timescales and investigating the very building blocks of reality itself, the timeline of mankind’s relationship with technology is little more than a rounding error.
Sir Tim’s hypertext system was designed to help the scientists at CERN collaborate more efficiently. It wasn’t the only hypertext system around, and it certainly wasn’t the best. On the web, using the laughably primitive vocabulary of HTML, you could link to any URL regardless of who “owned” it. If that URL were to later disappear, tough luck. Now you’ve got a broken link.
It was a fragile, incomplete system that didn’t come anywhere near solving all the challenges of hyperlinking data. It was, of course, a huge success: its simplicity turned out to be its strength. Although HTML was consumed by computers, it could also be read by humans. Crucially, it could also be easily written by humans. Anyone with access to a text editor could create a new hyperlink-filled HTML document.
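That low barrier to entry is easy to illustrate. A complete, hyperlink-filled document from the web’s early days could be as small as the sketch below, typed into any text editor (the page title and link text are invented for illustration; info.cern.ch really was the address of the first website):

```html
<!-- A complete early-web HTML document: no build step, no tooling,
     just text. The anchor element links to any URL, no permission asked. -->
<html>
  <head>
    <title>My first page</title>
  </head>
  <body>
    <p>
      Read about the web at
      <a href="http://info.cern.ch/">the first website</a>.
    </p>
  </body>
</html>
```

If the target URL later vanishes, the link simply breaks — there is no registry enforcing that both ends exist, which is exactly the fragility (and the simplicity) described above.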
It’s easy for us to look back now and see the web’s strengths inscribed into its founding architecture, but its success was by no means assured. Many of the features of the web that we take for granted today came about by accident. The fact that browsers display URLs — and allow you to enter URLs — was initially a power-user feature that seemed unlikely to be popular. The fact that people started writing documents in HTML took Tim Berners-Lee by surprise: his markup language was really only intended for index pages that pointed to the real content in other formats.
But there was one key factor in the web’s success that was not an accident. On April 30th, 1993, Tim Berners-Lee and Robert Cailliau placed the web into the public domain. This was by no means a fait accompli — the temptation to monetize this burgeoning hypertext system must have been hard to resist. But, perhaps inspired by the selfless spirit of cooperation and collaboration at CERN, they gave their gift to the world and asked for nothing in return.
Almost twenty years later at the 2012 Summer Olympic games in London, Sir Tim Berners-Lee was lauded in the opening ceremony. Watched by a global audience, he passed on one message regarding the world wide web:
This is for everyone.
If I want to write an HTML document, I don’t need to ask for permission. If I want to publish that HTML document at a URL, I don’t need to ask for permission. If I want to link from that HTML document to any other URL anywhere on the web, I don’t need to ask for permission.
Steve Jobs once said, “You don’t need anyone’s permission to be awesome,” which is somewhat ironic, because to publish something in Apple’s App Store, you definitely need permission.
If you want to publish a fart app, that will probably be allowed. If you want to publish a location-based social networking gamification platform, I’m sure that will be fine. But if you want to publish an app that simply shows on a map the locations of the latest drone strikes in Afghanistan and Pakistan (without even showing the results of those strikes), then your app will be rejected.
The terms and conditions for Apple’s App Store make it very clear that this is not for everyone:
We view apps different than books or songs. If you want to criticise a religion, write a book. If you want to describe sex, write a book or a song.
The web is an uncontrolled mess where anyone can link to anyone else. The App Store is a cultivated walled garden where it isn’t even possible to link between apps. All of the individual apps available from the gatekeepers of the App Store are fenced inside their own little plot of land.
Still, the experience of using these pre-filtered apps can at times feel superior to the experience of using an old-fashioned website. Native apps can be “richer” and more immersive than anything that can be experienced through a web browser. Perhaps they even portend the death of the web as we know it.
When I first started making websites back in the ’90s, there was a rival technology that was “richer” and more immersive than the web: CD-ROMs. Microsoft’s Encarta was an encyclopaedia on a disc, complete with images, video, and audio. But CD-ROMs were also isolated islands. While the experience of using any individual CD-ROM was easily greater than the experience of using any individual website, all the CD-ROMs in the world couldn’t collectively compete with the experience of using the wild lawless world wide web. So it is with the App Store. Native apps will no more destroy the web than swimming pools will destroy the ocean.
At this point, you may have dismissed my opinions as those of a Luddite afraid of change. Here I am criticising the shiny new App Store while defending a clunky world wide web that’s more than two decades old. But make no mistake: I’m criticising walled gardens like the App Store precisely because they are not a step forward. They are attempting to turn the clock back and return us to the world before the web.
The world before the web was a world of atoms, not bits. Unlike bits, atoms take up space and there’s only so much space to go around. That’s why we needed tastemakers and gatekeepers to decide which atoms would be placed on which shelves. In that world of consumers and producers, record companies, publishing houses, and film studios decided what would be published. These organisations existed in order to tell us what we would consume — which books we would read, which movies we would watch, which songs we would listen to.
Those are the very same organisations that greeted the App Store with open arms, not because it offers something new, but precisely because the walled garden promises a return to a world of producers and consumers. Here’s an opportunity to put the genie of the web back in its bottle.
I don’t believe it will work. The spirit of the web — that of free and open access and sharing — has already infected the world.
The web was born in an environment of openness, sharing, and collaboration. The spirit of CERN influenced Tim Berners-Lee’s work. In giving away that work for free, Sir Tim showed the world that the old permission-based value systems would no longer define the culture of our society.
Steve Jobs was right: you don’t need anyone’s permission to be awesome…if you’re publishing on the world wide web. Or, as Andy Baio put it:
The ability to link to any web page without permission is part of what makes the web great. Anyone who says otherwise is a poopy pants.