
100 words 054

In between publishing the Whole Earth Catalog and spinning up the Long Now Foundation, Stewart Brand wrote an article in Rolling Stone magazine about one of the earliest video games, Spacewar.

Except it isn’t really about Spacewar at all. It’s about the oncoming age of the personal computer.

The article was published in 1972. At the end, there’s an appendix listing some communal places where “one can step in off the street and compute.” One of those places—with 16 terminals available—was run by a certain Bob Kahn.

Together with Vint Cerf he created the Internet’s Transmission Control Protocol.

Hope

Cennydd points to an article by Ev Williams about the pendulum swing between open and closed technology stacks, and how that pendulum doesn’t always swing back towards openness. Cennydd writes:

We often hear the idea that “open platforms always win in the end”. I’d like that: the implicit values of the web speak to my own. But I don’t see clear evidence of this inevitable supremacy, only beliefs and proclamations.

It’s true. I catch myself saying things like “I believe the open web will win out.” Statements like that worry my inner empiricist. Faith-based outlooks scare me, and rightly so. I like being able to back up my claims with data.

Only time will tell what data emerges about the eventual fate of the web, open or closed. But we can look to previous technologies and draw comparisons. That’s exactly what Tim Wu did in his book The Master Switch and Jonathan Zittrain did in The Future Of The Internet—And How To Stop It. Both make for uncomfortable reading because they challenge my belief. Wu points to radio and television as examples of systems that began as egalitarian decentralised tools that became locked down over time in ever-constricting cycles. Cennydd adds:

I’d argue this becomes something of a one-way valve: once systems become closed, profit potential tends to grow, and profit is a heavy entropy to reverse.

Of course there is always the possibility that this time is different. It may well be that fundamental architectural decisions in the design of the internet and the workings of the web mean that this particular technology has an inherent bias towards openness. There is some data to support this (and it’s an appealing thought), but again, only time will tell. For now it’s just one more supposition.

The real question—when confronted with uncomfortable ideas that challenge what you’d like to believe is true—is what do you do about it? Do you look for evidence to support your beliefs or do you discard your beliefs entirely? That second option looks like the most logical course of action, and it’s certainly one that I would endorse if there were proven facts to be acknowledged (like gravity, evolution, or vaccination). But I worry about mistaking an argument that is still being discussed for an argument that has already been decided.

When I wrote about the dangers of apparently self-evident truisms, I said:

These statements aren’t true. But they are repeated so often, as if they were truisms, that we run the risk of believing them and thus, fulfilling their promise.

That’s my fear. Only time will tell whether the closed or open forces will win the battle for the soul of the internet. But if we believe that centralised, proprietary, capitalistic forces are inherently unstoppable, then our belief will help make them so.

I hope that openness will prevail. Hope sounds like such a wishy-washy word, like “faith” or “belief”, but it carries with it a seed of resistance. Hope, faith, and belief all carry connotations of optimism, but where faith and belief sound passive, even downright complacent, hope carries the promise of action.

Margaret Atwood was asked about the futility of having hope in the face of climate change. She responded:

If we abandon hope, we’re cooked. If we rely on nothing but hope, we’re cooked. So I would say judicious hope is necessary.

Judicious hope. I like that. It feels like a good phrase to balance empiricism with optimism; data with faith.

The alternative is to give up. And if we give up too soon, we bring into being the very endgame we feared.

Cennydd finishes:

Ultimately, I vote for whichever technology most enriches humanity. If that’s the web, great. A closed OS? Sure, so long as it’s a fair value exchange, genuinely beneficial to company and user alike.

This is where we differ. Today’s fair value exchange is tomorrow’s monopoly, just as today’s revolutionary is tomorrow’s tyrant. I will fight against that future.

To side with whatever’s best for the end user sounds like an eminently sensible metric by which to judge a technology. But I’ve written before about where that mindset can lead us. I can easily imagine Asimov’s three laws of robotics rewritten to reflect the ethos of user-centred design, especially that first and most important principle:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

…rephrased as:

A product or interface may not injure a user or, through inaction, allow a user to come to harm.

Whether the technology driving the system behind that interface is open or closed doesn’t come into it. What matters is the interaction.

But in his later years Asimov revealed the zeroth law, overriding even the first:

A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

It may sound grandiose to apply this thinking to the trivial interfaces we’re building with today’s technologies, but I think it’s important to keep drilling down and asking uncomfortable questions (even if they challenge our beliefs).

That’s why I think openness matters. It isn’t enough to use whatever technology works right now to deliver the best user experience. If that short-term gain comes with a long-term price tag for our society, it’s not worth it.

I would much rather have an imperfect open system than a perfect proprietary one.

I have hope in an open web …judicious hope.

Forgetting again

In an article entitled The future of loneliness Olivia Laing writes about the promises and disappointments provided by the internet as a means of sharing and communicating. This isn’t particularly new ground and she readily acknowledges the work of Sherry Turkle in this area. The article is the vanguard of a forthcoming book called The Lonely City. I’m hopeful that the book won’t be just another baseless luddite reactionary moral panic as exemplified by the likes of Andrew Keen and Susan Greenfield.

But there’s one section of the article where Laing stops providing any data (or even anecdotal evidence) and presents a supposition as though it were unquestionably fact:

With this has come the slowly dawning realisation that our digital traces will long outlive us.

Citation needed.

I recently wrote a short list of three things that are not true, but are constantly presented as if they were beyond question:

  1. Personal publishing is dead.
  2. JavaScript is ubiquitous.
  3. Privacy is dead.

But I didn’t include the most pernicious and widespread lie of all:

The internet never forgets.

This truism is so pervasive that it can be presented as a fait accompli, without any data to back it up. If you were to seek out the data to back up the claim, you would find that the opposite is true—the internet is in a constant state of forgetting.

Laing writes:

Faced with the knowledge that nothing we say, no matter how trivial or silly, will ever be completely erased, we find it hard to take the risks that togetherness entails.

Really? Suppose I said my trivial and silly thing on FriendFeed. Everything that was ever posted to FriendFeed disappeared three days ago:

You will be able to view your posts, messages, and photos until April 9th. On April 9th, we’ll be shutting down FriendFeed and it will no longer be available.

What if I shared on Posterous? Or Vox (back when that domain name was a social network hosting 6 million URLs)? What about Pownce? Geocities?

These aren’t the exceptions—this is routine. And yet somehow, despite all the evidence to the contrary, we still keep a completely straight face and say “Be careful what you post online; it’ll be there forever!”

The problem here is a mismatch of expectations. We expect everything that we post online, no matter how trivial or silly, to remain forever. When instead it is callously destroyed, our expectation—which was fed by the “knowledge” that the internet never forgets—is turned upside down. That’s where the anger comes from: the mismatch between expected behaviour and the reality of this digital dark age.

Being frightened of an internet that never forgets is like being frightened of zombies or vampires. These things do indeed sound frightening, and there’s something within us that readily responds to them, but they bear no resemblance to reality.

If you want to imagine a truly frightening scenario, imagine an entire world in which people entrust their thoughts, their work, and pictures of their family to online services in the mistaken belief that the internet never forgets. Imagine the devastation when all of those trivial, silly, precious moments are wiped out. For some reason we have a hard time imagining that dystopia even though it has already played out time and time again.

I am far more frightened by an internet that never remembers than I am by an internet that never forgets.

And worst of all, by propagating the myth that the internet never forgets, we are encouraging people to focus on exactly the wrong area. Nobody worries about preserving what they put online. Why should they? They’re constantly being told that it will be there forever. The result is that their history is taken from them:

If we lose the past, we will live in an Orwellian world of the perpetual present, where anybody that controls what’s currently being put out there will be able to say what is true and what is not. This is a dreadful world. We don’t want to live in this world.

Brewster Kahle

Cerf rocks

After I wrote about digital preservation and the need to save everything, not just the so-called “important” stuff, Jason wrote a lovely piece with his own thoughts on the matter:

In order to write a history, you need evidence of what happened. When we talk about preserving the stuff we make on the web, it isn’t because we think a Facebook status update, or those GeoCities sites have such significance now. It’s because we can’t know.

In a timely coincidence, Vint Cerf also spoke about the importance of digital preservation:

When you think about the quantity of documentation from our daily lives that is captured in digital form, like our interactions by email, people’s tweets, and all of the world wide web, it’s clear that we stand to lose an awful lot of our history.

He warns of the dangers of rapidly obsolescing file formats:

We are nonchalantly throwing all of our data into what could become an information black hole without realising it. We digitise things because we think we will preserve them, but what we don’t understand is that unless we take other steps, those digital versions may not be any better, and may even be worse, than the artefacts that we digitised.

It was a little weird that the Guardian headline refers to Vint Cerf as “Google boss”. On the BBC he’s labelled as “Google’s Vint Cerf”. Considering he’s one of the creators of the internet itself, it’s a bit like referring to Neil Armstrong as a NASA employee.

I have to say, I just love listening to him talk. He’s so smooth. I’m sure that the character of The Architect from The Matrix Reloaded is modelled on him.

Vint Cerf knows a thing or two about long-term thinking when it comes to data formats. He has written many RFCs for the IETF (my favourite being RFC 2468). Back in 1969, he wrote RFC 20, proposing the ASCII format for network interchange. If you’ve ever used the keypress event in JavaScript and wondered why, for example, the number 13 corresponds to a carriage return, this is where all those numbers come from.
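For instance, here’s a quick sketch you can paste into a browser console (purely illustrative):

document.addEventListener('keypress', function (event) {
 // 13 is the ASCII code for a carriage return, exactly as tabulated in RFC 20
 if (event.which === 13) {
  console.log('Carriage return: character code 13, straight out of 1969');
 }
});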

Last month, over 45 years after the RFC’s original publication, it became an official standard.

So when Vint Cerf warns about the dangers of digitising into file formats that could become unreadable, I think we should pay attention to him.

9,125 days later

The World Wide Web turned 25 last week. Happy birthday!

As is so often the case when web history is being discussed, there is much conflating of “the web” and “the internet” in some mainstream media outlets. The internet—the network of networks that allows computers to talk to each other across the globe—is older than 25 years. The web—a messy collection of HTML files linked via URLs and delivered with the Hypertext Transfer Protocol (HTTP)—is just one of the many types of information services that use the pipes of the internet (yes, pipes …or tubes, if you prefer—anything but “cloud”).

Now, some will counter that although the internet and the web are technically different things, for most people they are practically the same, because the web is by far the most common use-case for the internet in everyday life. But I’m not so sure that’s true. Email is a massive part of the everyday life of many people—for some poor souls, email usage outweighs web usage. Then there are streaming video services like Netflix, and voice-over-IP services like Skype. These sorts of proprietary protocols make up an enormous chunk of the internet’s traffic.

The reason I’m making this pedantic distinction is that there’s been a lot of talk in the past year about keeping the web open. I’m certainly in agreement on that front. But if you dig deeper, it turns out that most of the attack vectors are at the level of the internet, not the web.

Net neutrality is hugely important for the web …but it’s hugely important for every other kind of traffic on the internet too.

The Snowden revelations have shown just how shockingly damaging the activities of the NSA and GCHQ are …to the internet. But most of the communication protocols they’re intercepting are not web-based. The big exception is SSL, and the fact that they even thought it would be desirable to attempt to break it shows just how badly they need to be stopped—that’s the mindset of a criminal organisation, pure and simple.

So, yes, we are under attack, but let’s be clear about where those attacks are targeted. The internet is under attack, not the web. Not that that’s a very comforting thought; without a free and open internet, there can be no World Wide Web.

But by and large, the web trundles along, making incremental improvements to itself: expanding the vocabulary of HTML, updating the capabilities of HTTP, clarifying the documentation of URLs. Forgive my anthropomorphism. The web, of course, does nothing to itself; people are improving the web. But the web always has been—and always will be—people.

For some time now, my primary concern for the web has centred around what I see as its killer feature—the potential for long-term storage of knowledge. Yes, the web can be (and is) used for real-time planet-spanning communication, but there are plenty of other internet technologies that can do that. But the ability to place a resource at a URL and then to access that same resource at that same URL after many years have passed …that’s astounding!

Using any web browser on any internet-enabled device, you can instantly reach the first web page ever published. 23 years on, it’s still accessible. That really is something special. Digital information is not usually so long-lived.

On the 25th anniversary of the web, I was up in London with the rest of the Clearleft gang. Some of us were lucky enough to get a behind-the-scenes peek at the digital preservation work being done at the British Library:

In a small, unassuming office, entire hard drives, CD-ROMs and floppy disks are archived, with each item meticulously photographed to ensure any handwritten notes are retained. The wonderfully named ‘ancestral computing’ corner of the office contains an array of different computer drives, including 8-inch, 5 1⁄4-inch, and 3 1⁄2-inch floppy disks.

Most of the data that they’re dealing with isn’t much older than the web, but it’s an order of magnitude more difficult to access; trapped in old proprietary word-processing formats, stuck on dying storage media, readable only by specialised hardware.

Standing there looking at how much work it takes to rescue our cultural heritage from its proprietary digital shackles, I was struck once again by the potential power of the web. With such simple components—HTML, HTTP, and URLs—we have the opportunity to take full advantage of the planet-spanning reach of the internet, without sacrificing long-term access.

As long as we don’t screw it up.

Right now, we’re screwing it up all the time. The simplest way that we screw it up is by taking it for granted. Every time we mindlessly repeat the fallacy that “the internet never forgets,” we are screwing it up. Every time we trust some profit-motivated third-party service to be custodian of our writings, our images, our hopes, our fears, our dreams, we are screwing it up.

The evening after the 25th birthday of the web, I was up in London again. I managed to briefly make it along to the 100th edition of Pub Standards. It was a long time coming. In fact, there was a listing on Upcoming.org for the event. The listing was posted on February 5th, 2007.

Of course, you can’t see the original URL of that listing. Upcoming.org was “sunsetted” by Yahoo, the same company that “sunsetted” Geocities in much the same way that the Enola Gay sunsetted Hiroshima. But here’s a copy of that listing.

Fittingly, there was an auction held at Pub Standards 100 in aid of the Internet Archive. The schwag of many a “sunsetted” startup was sold off to the highest bidder. I threw some of my old T-shirts into the ring and managed to raise around £80 for Brewster Kahle’s excellent endeavour. My old Twitter shirt went for a pretty penny.

I was originally planning to bring my old Pownce T-shirt along too. But at the last minute, I decided I couldn’t part with it. The pain is still too fresh. Also, it serves as a nice reminder for me. Trusting any third-party service—even one as lovely as Pownce—inevitably leads to destruction and disappointment.

That’s another killer feature of the web: you don’t need anyone else. You can publish to this world-changing creation without asking anyone for permission. I wish it were easier for people to do this: entrusting your heritage to the Yahoos and Pownces of the world is seductively simple …but only in the short term.

In 25 years’ time, I want to be able to access these words at this URL. I’m going to work to make that happen.

Dealing with IE again

People have been linking to—and saying nice things about—my musings on dealing with Internet Explorer. Thank you. You’re very kind. But I think I should clarify a few things.

If you’re describing the techniques I showed (using Sass and conditional comments) as practical or useful, I appreciate the sentiment but personally I wouldn’t describe them as either. Jake’s technique is genuinely useful and practical.

I wasn’t really trying to provide any practical “take-aways”. I was just thinking out loud. The only real point to my ramblings was at the end:

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

My point is that we really shouldn’t have to do this. And, in fact, we don’t have to do this. We choose to do this.

Take the particular situation I was describing with a user of The Session who was using IE8 on Windows XP with a monitor set to 800x600 pixels. A lot of people picked up on this observation:

As a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

But here’s the thing: that person only started to experience problems when I chose to try to “help” IE8 users. If I had correctly treated IE8 as the legacy browser that it is, those users would have received the baseline experience …which was absolutely fine. Not great, but fine. But instead, I decided to jump in with my hacks, my preprocessor, my conditional comments, and worst of all, my assumptions about the viewport size.

In this case, I only have myself to blame. This is a personal project so I’m the client. I decided that I wanted to give IE8 and IE7 users the same kind of desktop navigation that more modern browsers were getting. All the subsequent pain for me as the developer, and for the particular user who had problems, is entirely my fault. If you’re working at a company where your boss or your client insists on parity for IE8 or IE7, I guess you can point the finger at them.

My point is: all the problems and workarounds that I talked about in that post were the result of me trying to crowbar modern features into a legacy browser. Now, don’t get me wrong—I’m not suggesting that IE8 or IE7 should be shut out or get a crap experience: “baseline” doesn’t mean “crap”. There’s absolutely nothing wrong with serving up a baseline experience to a legacy browser as long as your baseline experience is pretty good …and it should be.

So, please, don’t think that my post was a hands-on, practical example of how to give IE8 and IE7 users a similar experience to modern browsers. If anything, it was a cautionary tale about why trying to do that is probably a mistake.

Dealing with IE

Laura asked a question on Twitter the other day about dealing with older versions of Internet Explorer when you’ve got your layout styles nested within media queries (that older versions of IE don’t understand):

It’s a fair question. It also raises another question: how do you define “dealing with” Internet Explorer 8 or 7?

You could justifiably argue that IE7 users should upgrade their damn browser. But that same argument doesn’t really hold for IE8 if the user is on Windows XP: IE8 is as high as they can go. Asking users to upgrade their browser is one thing. Asking them to upgrade their operating system feels different.

But this is the web and websites do not need to look the same in every browser. Is it acceptable to simply give Internet Explorer 8 the same baseline experience that any other old out-of-date browser would get? In other words, is it even a problem that older versions of Internet Explorer won’t parse media queries? If you’re building in a mobile-first way, they’ll get linearised content with baseline styles applied.

That’s the approach that Alex advocates in the Q&A after his excellent closing keynote at Fronteers. That’s what I’m doing here on adactio.com. Users of IE8 get the linearised layout and that’s just fine. One of the advantages of this approach is that you are then freed up to use all sorts of fancy CSS within your media query blocks without having to worry about older versions of IE crapping themselves.
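As a minimal sketch (the selector and breakpoint are purely illustrative), a mobile-first stylesheet is structured something like this:

body {
 /* baseline styles: every browser gets these, including IE8 and IE7 */
 font: 100%/1.5 sans-serif;
}

@media all and (min-width: 30em) {
 /* layout styles: only browsers that understand media queries apply these */
 .navigation {
  float: left;
  width: 25%;
 }
}

Browsers that don’t understand media queries skip the whole block, so they never even attempt to apply the layout styles inside it.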

On other sites, like Huffduffer, I make an assumption (always a dangerous thing to do) that IE7 and IE8 users are using a desktop or laptop computer and so they could get some layout styles. I outlined that technique in a post about Windows mobile media queries. Using that technique, I end up splitting my CSS into two files:

<link rel="stylesheet" href="/css/global.css" media="all">
<link rel="stylesheet" href="/css/layout.css" media="all and (min-width: 30em)">
<!--[if (lt IE 9) & (!IEMobile)]>
<link rel="stylesheet" href="/css/layout.css" media="all">
<![endif]-->

The downside to this technique is that now there are two HTTP requests for the CSS …even for users of modern browsers. The alternative is to maintain one stylesheet for modern browsers and a separate stylesheet for older versions of Internet Explorer. That sounds like a maintenance nightmare.

Pre-processors to the rescue. Using Sass or LESS you can write your CSS in separate files (e.g. one file for basic styles and another for layout styles) and then use the preprocessor to combine those files in two different ways: one with media queries (for modern browsers) and another without media queries (for older versions of Internet Explorer). Or, if you don’t want to have your media query styles all grouped together, you can use Jake’s excellent method.

When I relaunched The Session last month, I initially just gave Internet Explorer 8 and lower the linearised content—the same layout that small-screen browsers would get. For example, the navigation is situated at the bottom of each page and you get to it by clicking an internal link at the top of each page. It all worked fine and nobody complained.

But I thought that it was a bit of a shame that users of IE8 and IE7 weren’t getting the same navigation that users of other desktop browsers were getting. So I decided to use a preprocessor (Sass in this case) to spit out an extra stylesheet for IE8 and IE7.

So let’s say I’ve got .scss files like this:

  • base.scss
  • medium.scss
  • wide.scss

Then in my standard .scss file that’s going to generate the CSS for all browsers (called global.css), I can write:

@import "base.scss";
@media all and (min-width: 30em) {
 @import "medium";
}
@media all and (min-width: 50em) {
 @import "wide";
}

But I can also generate a stylesheet for IE8 and IE7 (called legacy.css) that calls in those layout styles without the media query blocks:

@import "medium";
@import "wide";

IE8 and IE7 will be downloading some styles twice (all the styles within media queries) but in this particular case, that doesn’t amount to too much. Oh, and you’ll notice that I’m not even going to try to let IE6 parse those styles: it would do more harm than good.

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/legacy.css">
<![endif]-->

So I did that (although I don’t really have .scss files named “medium” or “wide”—they’re actually given names like “navigation” or “columns” that more accurately describe what they do). I thought I was doing a good deed for any users of The Session who were still using Internet Explorer 8.

But then I read this. It turned out that someone was not only using IE8 on Windows XP, but they had their desktop’s resolution set to 800x600. That’s an entirely reasonable thing to do if your eyesight isn’t great. And, like I said, I can’t really ask him to upgrade his browser because that would mean upgrading the whole operating system.

Now there’s a temptation here to dismiss this particular combination of old browser + old OS + narrow resolution as an edge case. It’s probably just one person. But that one person is a prolific contributor to the site. This situation nicely highlights the problem of playing the numbers game: as a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

The root of the problem lay in my assumption that IE8 or IE7 users would be using desktop or laptop computers with a screen width of at least 1024 pixels. Serves me right for making assumptions.

So what could I do? I could remove the conditional comments and the IE-specific stylesheet and go back to just serving the linearised content. Or I could serve up just the medium-width styles to IE8 and IE7.

That’s what I ended up doing but I also introduced a little bit of JavaScript in the conditional comments to serve up the widescreen styles if the browser width is above a certain size:

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/medium.css">
<script>
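// Only pull in the widescreen styles when the viewport is wider than 800 pixels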
if (document.documentElement.clientWidth > 800) {
 document.write('<link rel="stylesheet" href="/css/wide.css">');
}
</script>
<![endif]-->

It works …I guess. It’s not optimal but at least users of IE8 and IE7 are no longer just getting the small-screen styles. It’s a hack, and not a particularly clever one.

Was it worth it? Is it an improvement?

I think this is something to remember when we’re coming up with solutions to “dealing with” older versions of Internet Explorer: whether it’s a dumb solution like mine or a clever solution like Jake’s, we shouldn’t have to do this. We shouldn’t have to worry about IE7 just like we don’t have to worry about Netscape 4 or Mosaic or Lynx; we should be free to build according to the principles of progressive enhancement safe in the knowledge that older, less capable browsers won’t get all the bells and whistles, but they will be able to access our content. Instead we’re spending time coming up with hacks and polyfills to deal with one particular family of older, less capable browsers simply because of their disproportionate market share.

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

Update: I’ve written a follow-up post to clarify what I’m talking about here.

Things

Put the kettle on, make yourself a cup of tea, and settle down to read a couple of thought-provoking pieces about networked devices.

First up, Scott Jenson writes Of Bears, Bats, and Bees: Making Sense of the Internet of Things:

The Internet of Things is a growing, changing meme. Originally it was meant to invoke a giant swarm of cheap computation across the globe but recently has been morphing and blending, even insinuating, into established product concepts.

Secondly, Charles Stross has published an abridged version of a talk he gave back in June called How low (power) can you go?:

The logical end-point of Moore’s Law and Koomey’s Law is a computer for every square metre of land area on this planet — within our lifetimes. And, speaking as a science fiction writer, trying to get my head around the implications of this technology for our lives is giving me a headache. We’ve lived through the personal computing revolution, and the internet, and now the advent of convergent wireless devices — smartphones and tablets. Ubiquitous programmable sensors will, I think, be the next big step, and I wouldn’t be surprised if their impact is as big as all the earlier computing technologies combined.

And I’ll take this opportunity to once again point to one of my favourite pieces on the “Internet of Things” by Russell Davies:

The problem, though, with the Internet Of Things is that it falls apart when it starts to think about people. When big company Internet Of Things thinkers get involved they tend to spawn creepy videos about sleek people in sleek homes living optimised lives full of smart objects. These videos seem to radiate the belief that the purpose of a well-lived life is efficiency. There’s no magic or joy or silliness in it. Just an optimised, efficient existence. Perhaps that’s why the industry persists in inventing the Internet Fridge. It’s top-down design, not based on what people might fancy, but on what technologies companies are already selling.

Fortunately, though, there’s another group of people thinking about the Internet of Things - enthusiasts and inventors who are building their own internet connected things, adding connectivity and intelligence to the world in their own ways.

You can read it on your networked device or you can listen to it on your networked device …while you’re having your cup of tea …in a non-networked cup …with water from a non-networked kettle.

BBC - Podcasts - Four Thought: Russell M. Davies 21 Sept 2011 on Huffduffer

Cables

After speaking at Go Beyond Pixels in St. John’s, I had some time to explore Newfoundland a little bit. Geri was kind enough to drive me to a place I really wanted to visit: the cable station at Heart’s Content.

Heart's Content Cable Station

I’ve wanted to visit Heart’s Content (and Porthcurno in Cornwall) ever since reading The Victorian Internet, a magnificent book by Tom Standage that conveys the truly world-changing nature of the telegraph. Heart’s Content plays a pivotal role in the story: the landing site of the transatlantic cable, spooled out by the Brunel-designed Great Eastern, the largest ship in the world at the time.

Recently I was sent an advance reading copy of Tubes by Andrew Blum. It makes a great companion piece to Standage’s book as Blum explores the geography of the internet:

For all the talk of the placelessness of our digital age, the Internet is as fixed in real, physical places as any railroad or telephone system ever was.

There’s an interview with Andrew Blum on PopTech, a review of Tubes on Brain Pickings, and I’ve huffduffed a recent talk by Andrew Blum in Philadelphia.

Andrew Blum | Tubes: A Journey to the Center of the Internet - Free Library Podcast on Huffduffer

Now there are more places I want to visit: the nexus points on TeleGeography’s Submarine Cable Map; the hubs of Hibernia Atlantic, whose about page reads like a viral marketing campaign for some soon-to-be-released near-future Hollywood cyberpunk thriller.

I’ve got the kind of travel bug described by Neal Stephenson in his classic 1996 Wired piece Mother Earth Mother Board:

In which the hacker tourist ventures forth across the wide and wondrous meatspace of three continents, acquainting himself with the customs and dialects of the exotic Manhole Villagers of Thailand, the U-Turn Tunnelers of the Nile Delta, the Cable Nomads of Lan tao Island, the Slack Control Wizards of Chelmsford, the Subterranean Ex-Telegraphers of Cornwall, and other previously unknown and unchronicled folk; also, biographical sketches of the two long-dead Supreme Ninja Hacker Mage Lords of global telecommunications, and other material pertaining to the business and technology of Undersea Fiber-Optic Cables, as well as an account of the laying of the longest wire on Earth, which should not be without interest to the readers of Wired.

Maybe one day I’ll get to visit the places being designed by Sheehan Partners, currently only inhabited by render ghosts on their website (which feels like it’s part of the same subversive viral marketing campaign as the Hibernia Atlantic site).

Perhaps I can find a reason to stop off in Ashburn, Virginia or The Dalles, Oregon, once infamous as the site of a cult-induced piece of lo-tech bioterrorism, now the site of Google’s Project 02. Not that there’s much chance of being allowed in, given Google’s condescending attitude when it comes to what they do with our data: “we know what’s best, don’t you trouble your little head about it.”

It’s that same attitude that lurks behind that most poisonous of bullshit marketing terms…

The cloud.

What a crock of shit.

The cloud is a lie

Whereas other bullshit marketing terms once had a defined meaning that has eroded over time due to repeated use and abuse—Ajax, Web 2.0, HTML5, UX—“the cloud” is a term that sets out to deceive from the outset, imbued with the same Lakoffian toxicity as “downsizing” or “friendly fire.” It is the internet equivalent of miasma theory.

Death to the cloud! Long live the New Flesh of servers, routers, wires and cables.

Baran

The first Event Apart of the year has just kicked off here in Seattle. Every Event Apart is excellent, but the Seattle instantiation has two extra things going for it:

  1. a great venue and
  2. a really great hotel with some colourful history.

Jeffrey opened the proceedings with a long-zoom stroll down memory lane, giving us a history lesson of technology, the internet, the web and web standards.

Reflecting on the history of the internet today seems especially poignant with the recent passing of Paul Baran. Which reminds me…

Ten years ago, the Zelig-like Stewart Brand conducted an interview with Paul Baran. You can read the transcript on Wired.

It’s fantastic! A mixture of cold-war history and eerie emergent network effects:

It didn’t take very long before we started seeing all sorts of wonderful properties in this model. The network would learn where everybody was. You could chop up the network and within half a second of real-world time it would be routing traffic again. Then we had the realization that if there’s an overload in one place, traffic will move around it. So it’s a lot more efficient than conventional communications. If somebody tries to hog the network, the traffic routes away from them. Packet switching had all these wonderful properties that weren’t invented — they were discovered.

Shape shifting

I’ve been with the same ISP for years: Eclipse Internet. I never had any cause to complain until recently. I’ve been finding that certain types of requests simply weren’t completing: file uploads, some forms, Ajax requests…

I started googling for any similar reports. I found quite a few forum posts, all of them expressing the same sentiment: that Eclipse used to be good but that since getting bought out by Kingston, service had really gone downhill. Most alarmingly of all, I read reports of traffic shaping.

I called up technical support. They didn’t deny it. Instead, they tried to upsell me on a package that would give me an admin panel to allow more control over exactly how my packets were throttled.

Screw. That.

I started shopping around for a new ISP. When I twittered what I was doing, I received some good recommendations. Eventually, I was able to narrow my search down to two providers who both sounded good: Zen and IDNet. With little else to distinguish between the two companies, their respective websites became the deciding factor. That settled it. IDNet was the clear winner. Not only is their site prettier, it also validates and even uses hCard.

To cancel my Eclipse account, I needed to call them to request a MAC: a migration authorisation code. Of course the reason why they make you call is so that they can try to persuade you not to leave. Sure enough, the guy who took my call asked And can you tell me why you’re moving away from Eclipse?

Traffic shaping, I responded.

Ah right, he said. Technical reasons.

I corrected him. Moral reasons, actually.

I got the info I needed. I ordered my new broadband service. Today I switched over.

If you want to find out if your broadband provider is a filthy traffic-shaper, check to see if it’s named on one of the lists of known traffic-shapers.

If you find yourself changing to a more ethical ISP, you too will probably be asked to explain why you’re jumping ship. Just tell them what you want:

Fat Pipe, Always On, Get Out of the Way.