Thursday, May 21st, 2015

100 words 060

I spent the day in Greenwich, where there were two different web conferences happening simultaneously—Clearleft’s own UX London, and the annual Talk Web Design conference for web students at the University of Greenwich.

I was bouncing between both events, which meant I never really got immersed in either one. But that’s okay. I managed to meet up with plenty of people at both.

There was one unmissable talk today: Charlotte’s public speaking debut, opening up Talk Web Design with a presentation about her transition from student life to working at Clearleft. It was great. I knew it would be.

Wednesday, May 20th, 2015

100 words 059

Today was the first day of UX London. I was planning to attend. I decided I’d skip the first couple of talks—because that would entail rising at the crack of dawn—but I was aiming to get to the venue by the time the first break rolled around.

No plan survives contact with the enemy, and today the enemy was the rail infrastructure between Brighton and London. Due to “unforeseen engineering works”, there were scenes of mild-mannered chaos when I arrived at the station.

I decided—wisely, in retrospect—to abandon my plan. Here’s hoping it’s better by tomorrow.

Instantiation

When I give talks or workshops, I sometimes get a bit ranty. One of the richest seams of rantiness comes from me complaining about how we web designers and developers are responsible for making the web a hostile place. “Stop getting the web wrong!” I might shout, like an old man yelling at a cloud. I point to services like Instapaper and Readability and describe their existence as a damning indictment of our work.

Don’t get me wrong—I really like Instapaper, Readability, RSS readers, and any other tools that allow people to read what they want when they want it. But think about their fundamental selling point: get to the content you want without having to wade through the cruft. That cruft was put there by us.

So-called modern web design and development is damage that people have to route around.

(Ooh, I can feel myself coming over all ranty and angry again! Calm down, Jeremy, calm down!)

And. Breathe.

Now there’s a new tool to add to the list: Facebook Instant. Again, I think it’s actually pretty great that this service exists. But once again, it should make us ashamed of the work we’re collectively producing.

In this case, the service is—somewhat ironically—explicitly touting the performance benefits of not going to a website to read an article. Quite right.

PPK points to tools as the source of the problem and Marco Arment agrees:

The entire culture dominant among web developers today is bizarrely framework-heavy, with seemingly no thought given to minimizing dependencies and page weight.

But I think it’s a bit more subtle than that. As John Gruber says:

Business development deals have created problems that no web developer can solve. There’s no way to make a web page with a full-screen content-obscuring ad anything other than a shitty experience.

Now you might be saying to yourself “Well, I’ve never made a bloated web page!” or “I’ve never slapped loads of intrusive crap over the content!” I’d certainly like to think that I can look at my track record and hold my head up reasonably high. But that doesn’t matter. If the overall perception is that going to a URL to read an article is a pain in the ass, it hurts all of us.

Take this article from M.G. Siegler:

Not only is the web not fast enough for apps, it’s not fast enough for text either. …on mobile, the web browser just isn’t cutting it. … Native apps provide a better user experience on mobile than a web browser.

On the face of it, this is kind of a bizarre claim. After all, there’s nothing inherent in web browsers that makes them slow at rendering text—quite the opposite! And native apps still use HTTP (and often HTML) to fetch content; the network doesn’t suddenly get magically faster just because the piece of software requesting a resource doesn’t happen to be a web browser.

But this conflation of slow websites and slow web browsers is perfectly understandable. If it looks like a slow duck, and it quacks like a slow duck, then why not conclude that ducks are slow, even though we know there’s nothing inherently slow about making web pages?

My hope is that Facebook Instant will shake things up a bit. M.G. Siegler again:

At the very least, Facebook has put everyone else on notice. Your content better load fast or you’re screwed. Publication websites have become an absolutely bloated mess. They range from beautiful (The Verge) to atrocious (Bloomberg) to unusable (Forbes). The common denominator: they’re all way too slow.

There needs to be a cultural change in how we approach building for the web. Yes, some of the tools we choose are part of the problem, but the bigger problem is that performance still isn’t being recognised as the most important factor in how people feel about websites (and by extension, the web). This isn’t just a developer issue. It’s a design issue. It’s a UX issue. It’s a business issue. Performance is everybody’s collective responsibility.

I’d better stop now before I start getting all ranty again.

I’ll leave you with some other writings on this topic…

Tim Kadlec talks about choosing performance:

It’s not because of any sort of technical limitations. No, if a website is slow it’s because performance was not prioritized. It’s because when push came to shove, time and resources were spent on other features of a site and not on making sure that site loads quickly.

Jim Ray points out that “we learned the wrong lesson from the rise of mobile and the app ecosystem”:

We’ve spent far too long trying to compete with native experiences by making our websites look and behave like apps. This includes not just thousands of lines of JavaScript to mimic native app swipes and scrolling but even the lower overhead aesthetics of fixed position headers and persistent navigation.

(*cough*Flipboard*cough*)

Finally, Baldur Bjarnason has written a terrific piece:

The web doesn’t suck. Your websites suck.

All of your websites suck.

You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken. You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.

The lousy performance of your websites becomes a defensive moat around Facebook.

Go read the whole thing—it’s well worth your time:

This is a long-standing debate. Except it’s only long-standing among web developers. Columnists, managers, pundits, and journalists seem to have no interest in understanding the technical foundation of their livelihoods. Instead they are content with assuming that Facebook can somehow magically render HTML over HTTP faster than anybody else and there is nothing anybody can do to make their crap scroll-jacking websites faster. They buy into the myth that the web is incapable of delivering on its core capabilities: delivering hypertext and images quickly to a diverse and connected readership.

Tuesday, May 19th, 2015

100 words 058

PPK writes of modern web development:

Tools don’t solve problems any more, they have become the problem.

I think he’s mostly correct, but some clarification is required.

Web development tools fall into two broad categories:

  1. Local tools like preprocessors, task runners, and version control systems that help the developer output their own HTML, CSS, and JavaScript.
  2. Tools written in HTML, CSS, and JavaScript that the end user has to download for the developer to gain benefit.

It’s that second category that imposes a tax on the end user. Stop solving problems you don’t yet have.
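To put a rough number on that tax, here’s a minimal sketch, purely by way of illustration: pasted into a browser’s console on any page, it uses the Resource Timing API to add up the script bytes the page shipped to its visitor.

```javascript
// Rough sketch: tally the bytes of JavaScript this page asked the visitor to download.
// transferSize reports 0 for cross-origin scripts served without a Timing-Allow-Origin
// header, so treat the total as a lower bound.
const scriptBytes = performance
  .getEntriesByType('resource')
  .filter((entry) => entry.initiatorType === 'script')
  .reduce((total, entry) => total + (entry.transferSize || 0), 0);

console.log(`Script payload shipped to this visitor: ${(scriptBytes / 1024).toFixed(1)} KB`);
```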

Monday, May 18th, 2015

100 words 057

It’s UX London week. That’s always a crazy busy time at Clearleft. But it’s also an opportunity. We have this sneaky tactic of kidnapping a speaker from UX London and making them give a workshop just for us. We did it a few years ago with Dave Gray and we got a fantastic few days of sketching out of it.

This time we grabbed Jeff Patton. He spent this afternoon locked in the auditorium at 68 Middle Street teaching us all about user story mapping. ‘Twas most enlightening and really helped validate some of the stuff we’ve been doing lately.

Sunday, May 17th, 2015

100 words 056

I did not take any photographs today. There was a moment when I thought about it. Standing in the back garden, looking up through the leaves and branches of an overhanging tree, I almost reached for my phone.

The sky was a rich clear cerulean blue. The leaves of the tree were a deep maroon colour. The sunlight shining through the leaves showed a branching system of vein-like lines.

If I had taken a photograph, I probably would’ve pointed the camera lens straight up, filling most of the frame with pure blue, and the purple leaves encroaching into the picture.

Saturday, May 16th, 2015

100 words 055

Yesterday I wrote about a tenuous serendipitous connection between Spacewar and the creation of the internet. In the appendix to Stewart Brand’s 1972 Rolling Stone article I spotted a reference to the one and only Bob Kahn.

Except it turns out there is more than one Bob Kahn. A kindly email from Jack Dietz set me straight: there’s Robert Kahn who demoed ARPANET and then there’s Robert Kahn who advocated public access to computers.

This has taught me two important lessons:

  1. Names are not the best unique identifiers, and
  2. The best way to get feedback is to publish.

Friday, May 15th, 2015

This is for everyone with a certificate

Mozilla—like Google before them—have announced their plans for deprecating HTTP in favour of HTTPS. I’m all in favour of moving to HTTPS. I’ve done it myself here on adactio.com, on thesession.org, and on huffduffer.com. I have some concerns about the potential linkrot involved in the move to TLS everywhere—as outlined by Tim Berners-Lee—but still, anything that makes the work of GCHQ and the NSA more difficult is alright by me.

But I have a big, big problem with Mozilla’s plan to “encourage” the move to HTTPS:

Gradually phasing out access to browser features.

Requiring HTTPS for certain browser features makes total sense, given the security implications. Service Workers, for example, are quite correctly only available over HTTPS. Any API that has access to a device sensor—or that could be used for fingerprinting in any way—should only be available over HTTPS. In retrospect, Geolocation should have been HTTPS-only from the beginning.
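By way of illustration, here’s a minimal sketch of what that restriction looks like from the developer’s side (the /sw.js path is just a placeholder): outside a secure context the service worker API simply isn’t exposed, so ordinary feature detection doubles as an HTTPS check.

```javascript
// Minimal sketch: register a service worker only where it's actually available.
// Browsers expose navigator.serviceWorker in secure contexts (HTTPS, or localhost
// during development); '/sw.js' is a placeholder path, not a real file here.
if (window.isSecureContext && 'serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then((registration) => console.log('Service worker registered for', registration.scope))
    .catch((error) => console.error('Service worker registration failed:', error));
} else {
  console.log('Service workers unavailable: needs HTTPS and a supporting browser.');
}
```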

But to deny access to APIs where there are no security concerns, where it is merely a stick to beat people with …that’s just wrong.

This is for everyone. Not just those smart enough to figure out how to add HTTPS to their site. And yes, I know, the theory is that it’s going to get easier and easier, but so far the steps towards making HTTPS easier are just vapourware. That makes Mozilla’s plan look like something drafted by underwear gnomes.

The issue here is timing. Let’s make HTTPS easy first. Then we can start to talk about ways of encouraging adoption. Hopefully we can figure out a way that doesn’t require Mozilla or Google as gatekeepers.

Sven Slootweg outlines the problems with Mozilla’s forced SSL. I highly recommend reading Yoav’s post on deprecating HTTP too. Ben Klemens has written about HTTPS: the end of an era …that era being the one in which anyone could make a website without having to ask permission from an app store, a certificate authority, or a browser manufacturer.

On the other hand, Eric Mill wrote We’re Deprecating HTTP And It’s Going To Be Okay. It makes for an extremely infuriating read because it outlines all the ways in which HTTPS is a good thing (all of which I agree with) without once addressing the issue at hand—a browser that deliberately cripples its feature set for political reasons.

100 words 054

In between publishing the Whole Earth Catalog and spinning up the Long Now Foundation, Stewart Brand wrote an article in Rolling Stone magazine about one of the earliest video games, Spacewar.

Except it isn’t really about Spacewar at all. It’s about the oncoming age of the personal computer.

The article was published in 1972. At the end, there’s an appendix listing some communal places where “one can step in off the street and compute.” One of those places—with 16 terminals available—was run by a certain Bob Kahn.

Together with Vint Cerf, he created the Internet’s Transmission Control Protocol.

Thursday, May 14th, 2015

100 words 053

When I got back from Bletchley Park yesterday, I immediately started huffduffing more stories about cryptography and code-breaking.

One of the stories I found was an episode of Ockham’s Razor featuring Professor Mark Dodgson. He talks about the organisational structure at Bletchley Park:

The important point was the organization emphasised team-working and open knowledge sharing where it was needed, and demarcation and specialisation where it was most appropriate.

This reminds me of another extraordinary place, also displaying remarkable levels of collaboration, that has an unusual lack of traditional hierarchies and structure: CERN.

Bletchley Park produced the computer. CERN produced the web.