A classic example of the holy grail of web performance and robustness—start with regular HTML sent from the server, enhance once it’s in the browser …if the browser is capable of it. In this case, it’s using JavaScript (React) on both the server and the browser.
My friend Jeffrey has been writing on his website for twenty years. There are very few things on the web that last that long. I’m very, very glad that his website is one of them.
I remember finding Zeldman.com—and Ask Dr. Web, and the Ad Graveyard—back when I was first “going online.” I remember being so grateful for his generosity, but I also remember that what really struck me was the warmth and humility in the writing.
My own website will turn twenty in another few years. I never would have started it if it weren’t for Jeffrey.
I’ve already made mention of an exercise that Charlotte and I came up with to help developers think in terms of granular components: using a humble pair of scissors to cut up screenshots or mockups into their constituent parts.
Recently we repeated and added to this exercise. Once the groups of components are gathered together—buttons, form elements, icons, whatever—we go through each group. Everyone writes on a post-it what name they would give this component—button, formField, icon, etc. Then everyone slaps down their post-it notes at the same time. See any overlap? That’s your class name.
Griddled asparagus and courgette with roasted tomatoes and basil oil.
A straight-faced Jake talks us through the step-by-step iterations for turning a JavaScript-required web thang into a progressively enhanced zippy experience, supercharged with Service Worker.
The system makes the website. Don’t blame the web developer, blame the organisation. A web developer embedded in a large system isn’t the one making the websites.
To make a progressively enhanced website that performs well and loads quickly even on slow connections, you need to first make an organisation that values those qualities over others.
This article first appeared in Fast Company almost twenty years ago. It’s a fascinating look into the culture and process that created and maintained the software for the space shuttle. It’s the opposite of Silicon Valley’s “move fast and break things.”
To be this good, the on-board shuttle group has to be very different — the antithesis of the up-all-night, pizza-and-roller-hockey software coders who have captured the public imagination. To be this good, the on-board shuttle group has to be very ordinary — indistinguishable from any focused, disciplined, and methodically managed creative enterprise.
Susan points out some uncomfortable truths. It’s all very well for us to try and create a culture of performance amongst designers and developers, but it will all be for nought if we can’t change the minds of people higher up the chain …who currently just don’t care.
I think she’s spot on when she points to this possible solution:
I think what I’m asking is, who will be the game changer in this conversation? Who will be the large, bulky site that will work towards performance and make it happen and then we will all point to them and say, see they did it. It seems to me that that is what it takes. Much like we pointed to ESPN and being able to use CSS for layout or The Boston Globe and being able to do responsive at a large scale, who will we point to for the performance overhaul?
Today was a bittersweet day at Clearleft. It was Sophie’s last day. She’s moving on to pastures new, where I have no doubt she will kick ass as finely as she has done for the past few years here at our little agency.
She guided us through quite a tricky time; our move into our new building was a scary transition that coincided with some uncertain financial waters. If it hadn’t been for Sophie, I’m not sure how we would have coped.
And so tonight we broke bread together and toasted her time with us. I’ll miss having her around.
The slides from Paul’s talk-in-progress on design principles for building responsive sites. He gave us a sneak peek at Clearleft earlier this week. ‘Sgood.
Hillary, legendary for being the first to scale Mount Everest with teammate Tenzing Norgay, was on board, and Armstrong was, too, saying he was curious to see what the North Pole looked like from ground level, as he’d only seen it from the moon. Astronaut problems.
Apart from the best practices that can often be automated, there are many human decisions that have an impact on page speed. A way to make page speed part of the conversation, and optimising it part of a website’s requirements, is to set a performance budget.
There’s something to be said for focusing on one particular kind of work. There’s a mastery that only comes with repeated practice.
On the other hand, I have a tendency to get bored of doing the same thing. Repetition is good for skill-building but it has the downside of being …repetitive.
I just can’t get excited about the prospect of building something for any particular operating system, be it desktop or mobile. I think about the potential lifespan of what would be built and end up asking myself “why bother?” If something isn’t on the web—and of the web—I find it hard to get excited about it. I’m somewhat jealous of people who can get equally excited about the web, native, hardware, print …in my mind, if it hasn’t got a URL, it’s missing some vital spark.
I know that this is a problem, but I can’t help it. At the very least, I have enough presence of mind to recognise it as being my problem.
Given these unreasonable feelings of attachment towards the web, you might expect me to wish it to become the one technology to rule them all. But I’ve never felt that any such victory condition would make sense. If anything, I’ve always been grateful for alternative avenues of experimentation and expression.
When Flash was a thriving ecosystem for artists to push the boundaries of what was possible to deliver to a web browser, I never felt threatened by it. I never wished for web technologies to emulate those creations. Don’t get me wrong: I’m happy that we’ve got nice smooth animations in CSS, but I never thought the lack of animation was crippling the web’s potential.
Now we have native technologies that can do more than the web can do. iOS and Android apps can access device APIs that web browsers can’t (yet). And, once again, while I look forward to the day that websites will be able to do all the things that native apps can do today, I don’t think that the lack of those capabilities is dooming the web to irrelevance.
There will always be some alternative that is technologically more advanced than the web. First there were CD-ROMs. Then we had Flash. Now we have native apps. Each one of those platforms offered more power and functionality than you could get from a web browser. And yet the web persists. That’s because none of the individual creations made with those technologies could compete with the collective power of all of the web, hyperlinked together. A single native app will “beat” a single website every time …but an app store pales when compared to the incredible reach and scope of the entire World Wide Web.
The web will always be lagging behind some other technology. I’m okay with that. If anything, I see these other technologies as the research and development arm of the web. CD-ROMs, Flash, and now native apps show us what authors want to be able to do on the web. Slowly but surely, those abilities start becoming available in web browsers.
The pace of this standardisation can seem infuriatingly slow. Sometimes it is too slow. But it’s important that we get it right—the web should hold itself to a higher standard. And so the web plays the tortoise while other technologies race ahead as the hare.
Like I said, I’m okay with that. I’m okay with the web not being as advanced as some other technology stack at any particular moment. I can wait.
In fact, as PPK points out, we could do real damage to the web by attempting to make it mimic some platform that’s currently in the ascendant. I disagree with his framing of it as a battle—rather than conceding defeat, I see it more like waiting out a siege—but I agree completely with this assessment:
The web cannot emulate native perfectly, and it never will.
If we accept that, then we can play to the web’s strengths (while at the same time, playing a slow game of catch-up behind the scenes). The danger comes when we try to emulate the capabilities of something that isn’t the web:
Emulating native leads to bad UX (or, at least, a UX that’s clearly a sub-optimal copy of native UX).
Whenever a website tries to emulate something from an operating system—be it desktop or mobile—the result is invariably something that gets really, really close …but falls just a little bit short. It feels like entering an uncanny valley of interaction design.
I think you make what I call “bicycle bear websites.” Why? Because my response to both is the same.
“Listen bub,” I say, “it is very impressive that you can teach a bear to ride a bicycle, and it is fascinating and novel. But perhaps it’s cruel? Because that’s not what bears are supposed to do. And look, pal, that bear will never actually be good at riding a bicycle.”
This is how I feel about so many of the fancy websites I see. “It is fascinating that you can do that, but it’s really not what a website is supposed to do.”
It’s time to recognise that this is the wrong approach. We shouldn’t try to compete with native apps in terms set by the native apps. Instead, we should concentrate on the unique web selling points: its reach, which, more or less by definition, encompasses all native platforms, URLs, which are fantastically useful and don’t work in a native environment, and its hassle-free quality.
This is something that Cennydd talked about recently on an episode of the Design Details podcast. The web, he argues, is great for the sharing of information, but not so great for applications.
I think PPK, Cennydd, and I are all in broad agreement, but we almost certainly differ in the details. PPK, for example, argues that maybe news sites should be native apps instead, but for me, those are exactly the kind of sites that benefit from belonging to no particular platform. And when Cennydd talks about applications on the web, it raises the whole issue of what constitutes a web app anyway. If we’re talking about having access to device APIs—cameras, microphones, accelerometers—then yes, native is the way to go. But if we’re talking about interface elements and motion design, then I think the web can hold its own …sometimes.
Of course not every web browser can match the capabilities of a native app—that’s why it’s so important to approach web development through the lens of progressive enhancement rather than treating it like software development no different than that of native platforms. The web is not a platform—that’s the whole point of the web; it’s cross-platform. As Baldur put it:
Treating the web like another app platform makes sense if app platforms are all you’re used to. But doing so means losing the reach, adaptability, and flexibility that makes the web peerless in both the modern media and software industries.
The price we pay for that incredible cross-platform reach is that features on the web will always be lagging behind, and even when they do arrive, they won’t be available in all web browsers.
To paraphrase William Gibson: capabilities on the web will always be here, but they will never be evenly distributed.
But let’s take a step back from the surface-level differences between web and native. Just as happened with CD-ROMs and Flash, the web is catching up with native when it comes to motion design, visual feedback, and gestures like swiping and dragging. I don’t think those are where the fundamental differences lie. I don’t even think the fundamental differences lie in accessing device APIs like cameras, microphones, and offline storage—the web is (slowly) catching up in those areas too.
What if the fundamental differences lie deeper than the technical implementation? What if the web is suited to some things more than others, not because of technical limitations, but because of philosophical mismatches?
The web was born at CERN, an amazing environment that’s free of many of the economic and hierarchical pressures that shape technology decisions elsewhere. The web’s heritage as a hypertext document sharing system for pure scientific research is often treated as a handicap, something that must be overcome in this age of applications and monetisation. But I see this heritage as a feature, not a bug. It promotes ideals of universal access above individual convenience, creation above consumption, and sharing above financial gain.
For web development to grow as a craft and as an industry, we have to follow the money. Without money the craft becomes a hobby and unmaintained software begins to rot.
But I think there’s a danger here. If we allow the web to be led by money-making, we may end up changing the fundamental nature of the web, and not for the better.
Now, personally, I believe that it’s entirely possible to run a profitable business on the web. There are plenty of them out there. But suppose we allow that other avenues are more profitable. Let’s assume that there’s more friction in making money on the web than there is in, say, making money on iOS (or Android, or Facebook, or some other monolithic stack). If that were the case …would that be so bad?
Suppose, to use PPK’s phrase, we “concede defeat” to Apple, Google, Microsoft, and Facebook. When you think about it, it makes sense that platforms born of profit-driven companies are going to be better at generating profit than something created by a bunch of idealistic scientists trying to improve the knowledge of the human race. Suppose we acknowledged that the web isn’t that well-suited to capitalism.
I think I’d be okay with that.
Would the web become little more than a hobbyist’s playground? A place for amateurs rather than professional businesses?
Maybe.
I’d be okay with that too.
Y’see, what attracted me to the web—to the point where I have this blind spot—wasn’t the opportunity to make money. What attracted me to the web was its remarkable ability to allow anyone to share anything, not just for the here and now, but for the future too.
If you’ve been reading my journal or following my links for any time, you’ll be aware that two of my biggest interests are progressive enhancement and digital preservation. In my mind, these two things are closely intertwingled.
For me, progressive enhancement is a means of practicing universal design, a way of providing access to as many people as possible. That includes access across time, hence the crossover with digital preservation. I’ve noticed again and again that what’s good for accessibility is also good for longevity, and vice versa.
Whenever the ephemerality of the web is mentioned, two opposing responses tend to surface. Some people see the web as a conversational medium, and consider ephemerality to be a virtue. And some people see the web as a publication medium, and want to build a “permanent web” where nothing can ever disappear.
I don’t want a web where “nothing can ever disappear” but I also don’t want the default lifespan of a resource on the web to be ephemeral. I think that whoever published that resource should get to decide how long or short its lifespan is. The problem, as Maciej points out, is in the mismatch of expectations:
I’ve come to believe that a lot of what’s wrong with the Internet has to do with memory. The Internet somehow contrives to remember too much and too little at the same time, and it maps poorly on our concepts of how memory should work.
I completely agree with Bret’s woeful assessment of the web when it comes to link rot:
It is this common record of public thought — the “great conversation” — whose stability and persistence is crucial, both for us alive today and for those who will come after.
I believe we can and should do better. But I completely and utterly disagree with him when he says:
Photos from your friend’s party are not part of the common record.
Nor are most casual conversations. Nor are search histories, commercial transactions, “friend networks”, or most things that might be labeled “personal data”. These are not deliberate publications like a bound book; they are not intended to be lasting contributions to the public discourse.
We can agree when it comes to search histories and commercial transactions, but it makes no sense to lump those in with the ordinary plenty that I’ve written about before:
My words might not be as important as the great works of print that have survived thus far, but because they are digital, and because they are online, they can and should be preserved …along with all the millions of other words by millions of other historical nobodies like me out there on the web.
For me, this lies at the heart of what the web does. The web removes the need for tastemakers who get to decide what gets published. The web removes the need for gatekeepers who get to decide what gets saved.
Other avenues of expression will always be more powerful than the web in the short term: CD-ROMs, Flash, and now native. But they all come with gatekeepers. The collective output of the human race—from the most important scholarly papers to the most trivial blog post—is too important to put in the hands of the gatekeepers of today who may not even be around tomorrow: Apple, Google, Microsoft, et al.
The web has no gatekeepers. The web has no quality control. The web is a mess. The web is for everyone.
Today, as part of a crack Clearleft team, I travelled to Leamington Spa. That’s Royal Leamington Spa to you.
This seems like a perfectly pleasant town. Fortunately for us, our visit coincides with a pub quiz down at the local hipster bar—the one serving Mexican food with a Cajun twist. Naturally we joined in the quizzing fun.
We thought we were being sensible by jokering the “science and nature” round, but it turned out we should’ve jokered “puppets and dummies” or “musicians in the movies”—a clean sweep! Who could’ve foretold that Andy Budd’s favourite film, Freejack, would feature?
As a conference organiser, it’s easy to see yourself as being in a position of weakness. You’re hustling hard to put on a great event, but you are a victim to the whims of the ticket-buying public. So you might well be tempted to make whatever compromises are necessary just to break even.
But the truth is that, as a conference organiser, you are in a position of power. You decide which voices will be amplified. You might think that your conference line-up needs to reflect the current state of the world. But it could also highlight a better world.
There’s just a few more weeks to go until the third and final Responsive Day Out and I can’t wait! It’s going to be unmissable so, like, don’t miss it. If you haven’t already got your ticket, it’s not too late. And remember: it’s a measly £80.
On June 19th, follow the trail of eager geeks to the Corn Exchange at the Brighton Dome, a short walk from the train station. We’ll be using the main Dome entrance on Church Street and registration starts at 9am, with the first talk at 10am.
Now, what with it being a measly £80, don’t expect much in the way of swag. In fact, don’t expect anything in the way of swag. You won’t even get a lanyard; just a sticker. There won’t be any after-party; we can all just wander off to the nearby pubs and cafés instead. And lunch won’t be provided. But that’s okay, because Street Diner will be happening just up the road that day, and I’ve already confirmed that The Troll’s Pantry will be present—best burgers in Brighton (or anywhere else for that matter).
It’s going to be such a great day! Like I said …unmissable.
Much of the web’s early cultural and design history is at risk, despite efforts by the Internet Archive and renegade archivists. One of our realizations after 20 years on the web is that our responsibility isn’t just to the new; we also need to preserve what’s been built in the past.
Jessica and I went to see Mad Max: Fury Road at the Dukes At Komedia last week. We both thoroughly enjoyed it. There’s the instant thrill of being immersed in a rollicking good action movie but this film also stayed with me long after leaving the cinema.
This isn’t really Max’s movie at all—it’s Furiosa’s. And oh, what a wonderful protagonist she is.
Max’s role in this movie is to be an ally. And for that reason, I see him as a role model—one who offers a shoulder, not to cry on, but to steady a rifle’s aim.
I believe that Mozilla can make progress in privacy, but leadership needs to recognize that current advertising practices that enable “free” content are in direct conflict with security, privacy, stability, and performance concerns — and that Firefox is first and foremost a user-agent, not an industry-agent.
Brad is visiting Brighton this weekend after his stint at UX London. I’ve been showing him around town, introducing him to the finest coffee, burgers, and beers that Brighton has to offer.
Yesterday Ireland held a referendum to amend its constitution in order to provide equal rights to gay couples who want to get married. Today it’s clear that the “yes” vote is going to carry the day.
This is amazing.
I left Ireland in the early ’90s. When I told people abroad about the medieval legal situation in Ireland on contraception, divorce, and homosexuality …well, people just wouldn’t believe me. Combined with the nationalistic political situation, I got used to a sort of permanent miasma of embarrassment towards the country I came from.
Grant, like Emma, has recently started blogging again. This makes me very, very happy. And he’s doing it for what I consider to be all the right reasons:
But this is mostly a place for me to capture my thoughts, and an excuse to consider them, and an opportunity to understand them more fully.
I may have just ordered a spaceprob.es T-shirt for myself.
I had band practice with Salter Cane today. It’s been ages since the last rehearsal. Our drummer, Emily, has been recovering from surgery on her foot, hence the hiatus.
I was sure that this practice would be a hard slog. Not only had we not played together for a long time, but we’re trying out a new rehearsal space too. Sure enough, there were plenty of technical difficulties that arose from trying to get things working in the new space. But I was pleasantly surprised by how the songs sounded. We were pretty tight. One might even say we rocked.
On just about every client project that I work on, the subject of browser support comes up. Rightly so. It’s an important issue on which to get mutual understanding and agreement. But all too often, this important question is framed in a binary, true/false, go/no-go way: “Which browsers do we/don’t we support?”
Really, the first thing to get agreement on is not a list of browsers, but what we mean by the word “support”. In my mind, that word implies that a user of a particular browser should be able to accomplish the primary tasks on the website, whether that’s reading an article, booking a ticket, or buying a product. That doesn’t mean that the task must be experienced in pixel-perfect fidelity to an ideal visual design.
But to others, that’s exactly what “support” means. Personally, I’d call that optimisation. As Brad puts it:
There is a difference between support and optimization.
So to put it in glib terms, I support every browser …but I optimise for none.
Alright, fine. But I still need to get to some mutual understanding with a client about which browsers will get the optimised experience and which browsers will simply be supported.
Personally, I like the Filament Group’s approach of discussing this in terms of features rather than browsers. It makes sense to me to say the browsers that support geolocation will get the geolocation features, or the browsers that support offline caching will get the offline caching features. There’s no need to produce a list of what those browsers are for each feature, and in any case, the list would be constantly changing and updating with each new browser release.
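In practice, that kind of feature-based thinking often boils down to a simple conditional. Here’s a minimal sketch (the function name is a made-up stand-in, not anything from the Filament Group’s actual code):

// only wire up the enhancement in browsers that can actually deliver it
if ('geolocation' in navigator) {
    navigator.geolocation.getCurrentPosition(function (position) {
        // showNearbyResults is a hypothetical function for illustration
        showNearbyResults(position.coords.latitude, position.coords.longitude);
    });
}

Browsers without geolocation simply never run that code, and the core functionality carries on regardless.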
But—and this is a big but—nine times out of ten, when the issue of browser support comes up, it isn’t about functionality; it’s about branding. What clients generally want to know is which browsers will get the ideal visual design. Obviously the newer versions of Chrome and Firefox are going to get all the lovely layouts, rounded corners, gradients, transparencies, and animations …but what about older versions of Internet Explorer? Even if users of IE8 and IE7 can accomplish their tasks, will the “degraded” visual presentation hurt their experience?
My hypothesis is that it won’t. Users of older versions of Internet Explorer aren’t doing a side-by-side comparison of the same website opened up in the latest Chrome nightly. Considering what their daily usage must be like—unable to use Facebook, unable to use Google services—I suspect that they are happy just to be able to complete their task, regardless of the site’s visual fidelity.
There’s another viewpoint—one that I’ve heard expressed by clients—that even users of older browsers should still get the ideal, pixel-perfect visual design. The hypothesis here is that, by allowing someone to experience anything less than the perfect presentation, the client’s brand will be damaged in the mind of that person.
Like I said, this is something that comes up on most client projects, and this is the point at which we’d have to come to an agreement about which hypothesis we’re going to go with. Of course I’m going to argue in favour of the first hypothesis, but I’ve come to realise that arguing in favour of either hypothesis is the wrong approach. We shouldn’t be debating this …we should be testing it.
We have two competing hypotheses about a group of users. Instead of trying to read their minds, why not test with that group of users to find out which hypothesis is correct? No matter what the results of the test, they will be valuable either way.
Think about the amount of work that’s going to go in to optimising for older browser versions—it’s going to take quite a bit of time and money. It makes sense to ensure that this time and money isn’t being spent on little more than a hunch that pixel-perfection is important to those users. On the other hand, if the test reveals that actually those users really will have a lesser opinion of a brand unless they get pixel-perfect parity with newer browsers, then you’ll know that the time and money spent making that happen isn’t wasted.
Or, in longer terms: if more people appreciated how one day of user research can save weeks of coding, I think they would do it more. It is remarkable what you decide to not build after talking to a few people closely.
When it comes to decisions around browser support/optimisation, I think that even a little bit of up-front research and testing could potentially save a lot of time, money, and heartache. I’m not sure exactly what form the testing should take, but I’m interested in figuring it out.
More thoughts on the lack of a performance culture, prompted by the existence of Facebook Instant:
In my experience, the biggest barrier to a high-performance web is this: the means of production are far removed from the means of delivery. It’s hard to feel the performance impact of your decisions when you’re sitting on a T3 line in front of a 30 inch monitor. And even if you test on real devices (as you should), you’re probably doing it on a fast wifi network, not a spotty 3G connection. For most of us, even the ones I would describe as pro-performance, everything in the contemporary web design production pipeline works against the very focus required to keep the web fast.
The Indieweb approach has a lot in common with Ev’s ideas for Medium, but the key difference is that we are doing it in a way that works across websites, not just within one.
I spent the day in Greenwich, where there were two different web conferences happening simultaneously—Clearleft’s own UX London, and the annual Talk Web Design conference for web students at the University of Greenwich.
I was bouncing between both events, which meant I never really got immersed in either one. But that’s okay. I managed to meet up with plenty of people at both.
There was one unmissable talk today: Charlotte’s public speaking debut, opening up Talk Web Design with a presentation about her transition from student life to working at Clearleft. It was great. I knew it would be.
Today was the first day of UX London. I was planning to attend. I decided I’d skip the first couple of talks—because that would entail rising at the crack of dawn—but I was aiming to get to the venue by the time the first break rolled around.
No plan survives contact with the enemy and today the enemy was the rail infrastructure between Brighton and London. Due to “unforeseen engineering works”, there were scenes of mild-mannered chaos when I arrived at the station.
I decided—wisely, in retrospect—to abandon my plan. Here’s hoping it’s better by tomorrow.
OH: “Could any more crap pop up on my screen‽ Just let me see Duran Duran!” —@WordRidden choosing her new jam.
Bistecca a la fiorentina.
Jersey royals, asparagus, and green pepper, topped with chives from the garden.
Katie, Divya, and the other great designers and developers at Sparkbox run workshops on HTML and CSS for girl scouts. They’ve shared their resources and I might just borrow some of them for Codebar.
I’m disenchanted with desktop. That conviction runs so deep, I groan when I see a desktop layout JPEG.
All too often we talk the talk about taking a mobile first approach, but we rarely walk the walk. Most designers and developers still think of the small-screen viewport as the exception, not the norm.
Zeldman looks back at Stewart Butterfield’s brilliant 5K contest. We need more of that kind of thinking today:
As one group of web makers embraces performance budgets and the eternal principles of progressive enhancement, while another (the majority) worships at the altar of bigger, fatter, slower, the 5K contest reminds us that a byte saved is a follower earned.
When I give talks or workshops, I sometimes get a bit ranty. One of the richest seams of rantiness comes from me complaining about how we web designers and developers are responsible for making the web a hostile place. “Stop getting the web wrong!” I might shout, like an old man yelling at a cloud. I point to services like Instapaper and Readability and describe their existence as a damning indictment of our work.
Don’t get me wrong—I really like Instapaper, Readability, RSS readers, or any other tools that allow people to read what they want when they want it. But think about their fundamental selling point: get to the content you want without having to wade through the cruft. That cruft was put there by us.
(Ooh, I can feel myself coming over all ranty and angry again! Calm down, Jeremy, calm down!)
And. Breathe.
Now there’s a new tool to add to the list: Facebook Instant. Again, I think it’s actually pretty great that this service exists. But once again, it should make us ashamed of the work we’re collectively producing.
In this case, the service is—somewhat ironically—explicitly touting the performance benefits of not going to a website to read an article. Quite right.
The entire culture dominant among web developers today is bizarrely framework-heavy, with seemingly no thought given to minimizing dependencies and page weight.
Business development deals have created problems that no web developer can solve. There’s no way to make a web page with a full-screen content-obscuring ad anything other than a shitty experience.
My least favorite online game these days: finding the “X” that closes the nearly ubiquitous website popup for newsletter signup or video ad.
Now you might be saying to yourself “Well, I’ve never made a bloated web page!” or “I’ve never slapped loads of intrusive crap over the content!” I’d certainly like to think that I can look at my track record and hold my head up reasonably high. But that doesn’t matter. If the overall perception is that going to a URL to read an article is a pain in the ass, it hurts all of us.
Not only is the web not fast enough for apps, it’s not fast enough for text either. …on mobile, the web browser just isn’t cutting it. … Native apps provide a better user experience on mobile than a web browser.
On the face of it, this is kind of a bizarre claim. After all, there’s nothing inherent in web browsers that makes them slow at rendering text—quite the opposite! And native apps still use HTTP (and often HTML) to fetch content; the network doesn’t suddenly get magically faster just because the piece of software requesting a resource doesn’t happen to be a web browser.
But this conflation of slow websites and slow web browsers is perfectly understandable. If it looks like a slow duck, and it quacks like a slow duck, then why not conclude that ducks are slow? Even if we know that there’s nothing inherently slow about making web pages:
You don’t need Facebook to deliver your text faster than you can. Remove all unnecessary cruft and make your site blazing fast.
My hope is that Facebook Instant will shake things up a bit. M.G. Siegler again:
At the very least, Facebook has put everyone else on notice. Your content better load fast or you’re screwed. Publication websites have become an absolutely bloated mess. They range from beautiful (The Verge) to atrocious (Bloomberg) to unusable (Forbes). The common denominator: they’re all way too slow.
There needs to be a cultural change in how we approach building for the web. Yes, some of the tools we choose are part of the problem, but the bigger problem is that performance still isn’t being recognised as the most important factor in how people feel about websites (and by extension, the web). This isn’t just a developer issue. It’s a design issue. It’s a UX issue. It’s a business issue. Performance is everybody’s collective responsibility.
I’d better stop now before I start getting all ranty again.
I’ll leave you with some other writings on this topic…
It’s not because of any sort of technical limitations. No, if a website is slow it’s because performance was not prioritized. It’s because when push came to shove, time and resources were spent on other features of a site and not on making sure that site loads quickly.
We’ve spent far too long trying to compete with native experiences by making our websites look and behave like apps. This includes not just thousands of lines of JavaScript to mimic native app swipes and scrolling but even the lower overhead aesthetics of fixed position headers and persistent navigation.
You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken. You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.
The lousy performance of your websites becomes a defensive moat around Facebook.
This is a long-standing debate. Except it’s only long-standing among web developers. Columnists, managers, pundits, and journalists seem to have no interest in understanding the technical foundation of their livelihoods. Instead they are content with assuming that Facebook can somehow magically render HTML over HTTP faster than anybody else and there is nothing anybody can do to make their crap scroll-jacking websites faster. They buy into the myth that the web is incapable of delivering on its core capabilities: delivering hypertext and images quickly to a diverse and connected readership.
For a minute there, I didn’t know what to read on the internet.
The controversial hamburger icon goes mainstream with this story on the BBC News site.
It still amazes me that, despite clear data, many designers cling to the belief that the icon by itself is understandable (or that users will “figure it out eventually”). Why the aversion to having a label for the icon?
Progressive Enhancement remains the best option for solving web development issues such as wide-ranging browser support, maintenance and future-proofing your application.
It’s UX London week. That’s always a crazy busy time at Clearleft. But it’s also an opportunity. We have this sneaky tactic of kidnapping a speaker from UX London and making them give a workshop just for us. We did it a few years ago with Dave Gray and we got a fantastic few days of sketching out of it.
This time we grabbed Jeff Patton. He spent this afternoon locked in the auditorium at 68 Middle Street teaching us all about user story mapping. ‘Twas most enlightening and really helped validate some of the stuff we’ve been doing lately.
I did not take any photographs today. There was a moment when I thought about it. Standing in the back garden, looking up through the leaves and branches of an overhanging tree, I almost reached for my phone.
The sky was a rich clear cerulean blue. The leaves of the tree were a deep maroon colour. The sunlight shining through the leaves showed a branching system of vein-like lines.
If I had taken a photograph, I probably would’ve pointed the camera lens straight up, filling most of the frame with pure blue, and the purple leaves encroaching into the picture.
Mozilla—like Google before them—have announced their plans for deprecating HTTP in favour of HTTPS. I’m all in favour of moving to HTTPS. I’ve done it myself here on adactio.com, on thesession.org, and on huffduffer.com. I have some concerns about the potential linkrot involved in the move to TLS everywhere—as outlined by Tim Berners-Lee—but still, anything that makes the work of GCHQ and the NSA more difficult is alright by me.
But I have a big, big problem with Mozilla’s plan to “encourage” the move to HTTPS:
Gradually phasing out access to browser features.
Requiring HTTPS for certain browser features makes total sense, given the security implications. Service Workers, for example, are quite correctly only available over HTTPS. Any API that has access to a device sensor—or that could be used for fingerprinting in any way—should only be available over HTTPS. In retrospect, Geolocation should have been HTTPS-only from the beginning.
But to deny access to APIs where there are no security concerns, where it is merely a stick to beat people with …that’s just wrong.
This is for everyone. Not just those smart enough to figure out how to add HTTPS to their site. And yes, I know, the theory is that it’s going to get easier and easier, but so far the steps towards making HTTPS easier are just vapourware. That makes Mozilla’s plan look like something drafted by underwear gnomes.
The issue here is timing. Let’s make HTTPS easy first. Then we can start to talk about ways of encouraging adoption. Hopefully we can figure out a way that doesn’t require Mozilla or Google as gatekeepers.
On the other hand, Eric Mill wrote We’re Deprecating HTTP And It’s Going To Be Okay. It makes for an extremely infuriating read because it outlines all the ways in which HTTPS is a good thing (all of which I agree with) without once addressing the issue at hand—a browser that deliberately cripples its feature set for political reasons.
Except it isn’t really about Spacewar at all. It’s about the oncoming age of the personal computer.
The article was published in 1972. At the end, there’s an appendix listing some communal places where “one can step in off the street and compute.” One of those places—with 16 terminals available—was run by a certain Bob Kahn.
The important point was the organization emphasised team-working and open knowledge sharing where it was needed, and demarcation and specialisation where it was most appropriate.
This looks like it’ll be brilliant! Nat is running a prototyping workshop the day before Responsive Day Out:
This workshop is for designers with no coding experience — if you’re an absolute beginner who wants to find out whether coding can help you with your job, this is for you!
Presenting @qwertykate with a souvenir of Düsseldorf.
Instead of coming up with all these new tools and JavaScript frameworks, shouldn’t we try to emphasize the importance of learning the underlying fundamentals of the web? Teach those who are just stepping to this medium and starting their careers. By not making our stack more and more complex, but by telling about the best practices that should guide our work and the importance of basic things.
There was a Clearleft outing to Bletchley Park today. I can’t believe I hadn’t been before. It was nerdvana—crypto, history, and science combined in one very English location.
Alan Turing’s work at Station X is rightly lauded, but I can’t help feeling a bit uncomfortable with the way we make heroes of those who work in the shadows. After the war, England’s fictional hero was James Bond, the creation of former Bletchley worker Ian Fleming. And now we have GCHQ spying on its own citizens.
Righteousness in the past doesn’t earn a country a free pass for the future.
It was fascinating at Indie Web Camp Germany to see how much could be accomplished by taking some pre-existing small things and loosely joining them.
For example, there are already webmention and micropub plug-ins for quite a few CMSs. If you’re using WordPress or Jekyll, you can get pretty far pretty quickly by making use of what people have already provided. And after that Indie Web Camp, you can add Drupal and Kirby to the list of CMSs with readily-available components.
I was somewhat surprised—and very pleased—that people made use of some little PHP snippets that I had posted as gists. I deliberately posted them as gists to show how minimal and barebones the code could be—no need for a whole project, or installers, or dockering the node to yeoman the gulp, or whatever it is the cool kids do these days.
This modular approach also worked well for interface elements. Glenn and Aaron worked on separate projects to create small JavaScript enhancements for posting interfaces. Assemble enough of these enhancements together and before you know it, you’ve got something approaching Medium.
By the end of the second day, I was amazed to see how much progress people had made. Like Johannes says:
I was pretty impressed by how much people got done. At the final demo session, everyone had something he or she had done to update their website – although I’m pretty sure that the end of this event will not be the end of their efforts to try and own their stuff online.
It was quite inspiring. In fact, I think I’ve been inspired to have an Indie Web Camp in Brighton. I’m thinking we could have it at the same time as Indie Web Camp Portland, which is on July 11th and 12th.
Bastian sums up his experience of attending Indie Web Camp:
But this weekend brought a new motivational high that I didn’t expect to go that far. I attended the Indie Web Camp in Düsseldorf, Germany and I’m simply blown away.
On the last leg of the journey from Bletchley back to Brighton.
Alas, this leg is standing.
Lorenz.
Please knit now.
Valves.
Enigma.
Cracking the code.
Reservoir nerds.
The bombe.
Standing before Colossus.
This picture was taken with a miniaturised descendant of that machine.
Making a @Clearleft pilgrimage to Bletchley Park. It feels like we’re journeying to the enchanted valley of Imladris.
You can think of each part of a selector as a condition:
condition { }
That translates to code like:
if (condition) { }
So if you have a CSS selector like this:
condition1 condition2 condition3 { }
…it translates to code like this:
if (condition1) {
    if (condition2) {
        if (condition3) {
        }
    }
}
That doesn’t feel very elegant, even in its simpler form:
if (condition1 && condition2 && condition3) { }
I like Harry’s rule of thumb:
Think of your selectors as mini programs: Every time you nest or qualify, you are adding an if statement; read these ifs out loud to yourself to try and keep your selectors sane.
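To make that concrete, take a made-up selector (not one of Harry’s examples). A rule like .main-nav ul li a { } is really saying:

if (isAnchor) {
    if (insideListItem) {
        if (insideList) {
            if (insideMainNav) {
            }
        }
    }
}

Four conditions, read aloud, just to style a link. A single class on the link itself would usually do the same job with one.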
I spoke at the Beyond Tellerrand conference today. I wasn’t expecting to speak at the Beyond Tellerrand conference today.
Marc asked me just a few days ago if I might be able to step into the breach. I was going to be attending the conference today anyway—my flight back to Brighton was in the evening—so I said sure, why not?
It was fun. Except for the moment when my throat decided it didn’t want to cooperate with this whole public speaking thing and just closed up for a minute or so. That was just a little bit disconcerting.
Asparagus.
Pancetta-wrapped pork filet on rhubarb.
Good to come home to.
I love lamp.
Desperately seeking @ScottJehl at Beyond Tellerrand so that he can sign my copy of his book.
I made a little improvement to the links section of my site. Now every time I link to something, I check to see if it accepts webmentions and if it does, I ping it to let you know that I’ve linked to it.
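For anyone curious about the mechanics, here’s a rough sketch of the two steps involved: discover the endpoint, then ping it. It’s simplified JavaScript for illustration only (not the code running on this site), and a proper implementation would also check the HTTP Link header and handle attribute order:

// sketch of sending a webmention: illustration only, not production code
async function sendWebmention(source, target) {
    // step one: fetch the target and look for an advertised webmention endpoint
    const response = await fetch(target);
    const html = await response.text();
    const match = html.match(/rel=["']?webmention["']?[^>]*href=["']([^"']+)["']/i);
    if (!match) {
        return; // the target doesn't accept webmentions
    }
    const endpoint = new URL(match[1], target);
    // step two: POST the source and target URLs to that endpoint
    await fetch(endpoint, {
        method: 'POST',
        body: new URLSearchParams({ source, target })
    });
}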
Ben Fino-Radin describes how the MoMA’s archivematica “analyzes all digital collections materials as they arrive, and records the results in an obsolescence-proof text format that is packaged and stored with the materials themselves.”
François is here at Indie Web Camp Germany helping out anyone who wants to get their site running on https. He wrote this great post to get people started.
Today was the first day of Indie Web Camp Germany here in Düsseldorf. The environment couldn’t have been better—the swank sipgate building has plenty of room, fantastic food, and ridiculously friendly people on hand to make sure that everything goes smoothly.
Day one is the discussion day. The topics fortuitously formed a great narrative starting with the simple building blocks of microformats, leading into webmentions, then authentication, and finally micropub and posting interfaces.
My brain is full after talking through these technologies in increasing order of complexity. Enough talking. Now I’m ready to start coding. Bring on day two.
Indie Web Camp Germany.
Showing a quick demo of posting to adactio.com from a different site (using IndieAuth and Micropub).
One of the great pleasures of travelling is partaking of the local cuisine. Today I travelled to Düsseldorf. As soon as I arrived, I went out for ramen.
Wait, what?
You might be thinking that I should really be making the most of the pork and potato dishes that Germany is famed for, but the fact is that the ramen here is really good.
I grew up in Cobh—pronounced “cove”—Cork, Ireland. There’s a statue in the middle of town; an angel presiding over the figures of local fishermen who lost their lives 100 years ago when a German U-boat torpedoed and sank The Lusitania off the Old Head of Kinsale. They were attempting to rescue survivors.
On the outskirts of town there’s an old cemetery where a mass grave was dug for the bodies of the Lusitania victims.
Cobh’s history is filled with ill-fated ships. It was the last stop of The Titanic. The ships are now memorialised as pub names.
For forty four days I wrote and published 100 words every day. I was cutting it close sometimes, getting a post in just before midnight, but I always managed it.
Until yesterday.
Yesterday was a very busy day. I worked hard and I played hard (that’s an explanation, not an excuse).
I’ve been working with a lovely team of designers and developers from John Lewis. They came down to Brighton yesterday and it was very productive.
Then we went for a curry, then karaoke, then a few more drinks. It was past midnight when I got home. No 100 words.
Last day of commuting to London to collaborate with the lovely people at John Lewis. It’s been a fun project.
Going to exercise my hard-won and fought-for right to vote.
Taking my hard-won and fought-for hangover with me.
It was Clearleft’s turn to host Codebar again this evening. As always, it was great. I did my best to introduce some people to HTML and CSS, which was challenging, rewarding, and fun.
In the run-up to the event, I did a little spring cleaning of Clearleft’s bookshelves. I took some books on HTML, CSS, and JavaScript that weren’t being used any more and offered them to Codebar students for the taking.
I was also able to offer some more contemporary books thanks to the generosity of A Book Apart who kindly donated some of their fine volumes to Codebar.
It wasn’t that long ago I mentioned a new book about learning HTML and CSS from scratch, published online for free. Well, now there’s another one. This time the subject is typography on the web.
The content is loaded into the page using a low-level transport mechanism called HTTP (the great thing about using this protocol is that you get URL routing for free). I bucked the trend and decided not to encode the content in JSON. Instead it’s contained in a text format called HTML.
There is some asynchronous loading involved for the rich media; that’s accomplished using a feature of HTML known as the img element.
I’m pretty pleased with the results. The whole thing is scrolling smoothly at sixty frames per second.
I’m going to be taking part in a discussion upstairs in The Eagle in Brighton on May 14th, all about digital preservation. I think it’s going to be really fun. It’s free—you should come along.
At Clearleft, we’ve been running internships for quite a while now (paid internships, of course; there’s no justification for unpaid internships—don’t let anyone tell you otherwise). In that time we’ve been incredibly fortunate in getting fantastic people to jump on board our agency train for a few months.
The newest member of our pantheon is James Madson. He took time out of his globetrotting travels to become a temporary Clearlefty. It was great having him around. He’s a smart, humble, talented guy. I’m very glad that he enjoyed his time with us.
Apps must run on specific platforms for specific devices. The app space, while large, isn’t universal.
Websites:
Websites can be viewed by anyone with a web browser.
And that doesn’t mean foregoing modern features:
A web browser must only understand HTML. Further, newer HTML (like HTML 5) is still supported because the browser is built to ignore HTML it doesn’t understand. As a result, my site can run on the oldest browsers all the way to the newest ones. Got Lynx? No problem. You’ll still be able to find matches nearby. Got the latest smartphone and plentiful data? It’ll work there, too, and take advantage of its features.
This is why progressive enhancement is so powerful.
My site will take advantage of newer technologies like geolocation and local storage. However, the service will not be dependent on them.
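In practice, that “take advantage, but don’t depend” approach can be as simple as wrapping the enhancement in a check. This is a hypothetical snippet, not taken from the article:

// remember the last search, but only in browsers that offer storage
let lastSearch = '';
try {
    if ('localStorage' in window) {
        lastSearch = localStorage.getItem('lastSearch') || '';
    }
} catch (error) {
    // storage disabled or full: the page still works, just without the memory
}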
Yesterday was April 30th. On April 30th in 1993, the world changed. But this world-changing event was marked by the simplest of actions—a couple of signatures and some rubber stamps.
When I was at CERN a few years ago with my fellow hackers, Robert Cailliau produced his copy of this document. It passed around the table. When it came to me, I held it like a magic scroll.
“Be careful—there are only two copies of that,” he said. “And CERN have misplaced theirs.”