Journal


Monday, March 9th, 2020

41 hours in Galway

It was my birthday recently. I’m a firm believer in the idea that birthday celebrations should last for more than 24 hours. A week is the absolute minimum.

For the day itself, I did indeed indulge in a most luxurious evening out with Jessica at The Little Fish Market in Hove (on the street where we used to live!). The chef, Duncan Ray, is an absolute genius and his love for all things fish-related shines through in his magnificent dishes.

But to keep the celebrations going, we also went on a weekend away to Galway, where I used to live decades ago. It was a quick trip but we packed in a lot. I joked at one point that it felt like one of those travel articles headlined with “36 hours in someplace.” I ran the numbers and it turned out we were in Galway for 41 hours, but I still thought it would be fun to recount events in the imperative style of one of those articles…

A surprisingly sunny day in Galway.

Saturday, February 29th

The 3:30pm train from Dublin will get you into Galway just before 6pm. The train station is right on the doorstep of Loam, the Michelin-starred restaurant where you’ve made your reservation. Enjoy a seven course menu of local and seasonal produce. Despite the quality of the dishes, you may find the overall experience is a little cool, and the service a touch over-rehearsed.

You’ll be released sometime between 8:30pm and 9pm. Stroll through Eyre Square and down Shop Street to the Jury’s Inn, your hotel. It’s nothing luxurious but it’s functional and the location is perfect. It’s close to everything without being in the middle of the noisy weekend action. The only noise you should hear is the rushing of the incredibly fast Corrib river outside your window.

Around 9:30pm, pop ‘round to Dominick Street to enjoy a cocktail in the America Village Apothecary. It’s only open two nights a week, and it’s a showcase of botanicals gathered in Connemara. Have them make you a tasty concoction and then spend time playing guess-that-smell with their specimen jars.

By 10:30pm you should be on your way round the corner to The Crane Bar on Sea Road. Go in the side entrance and head straight upstairs where the music session will be just getting started. Marvel at how chilled out it is for a Saturday night, order a pint, and sit and listen to some lovely jigs’n’reels. Don’t forget to occasionally pester one of the musicians by asking “What was that last tune called? Lovely set!”

Checked in at The Crane Bar. Great tunes! 🎻🎶 — with Jessica

Sunday, March 1st

Skip the hotel breakfast. Instead, get your day started with a coffee from Coffeewerk + Press. Get that coffee to go and walk over to Ard Bia at Nimmos, right at the Spanish Arch. Get there before it opens at 10am. There will already be a line. Once you’re in, order one of the grand brunch options and a nice big pot of tea. The black pudding hash will set you up nicely.

Checked in at Ard Bia at Nimmos. Black pudding hash and a pot of tea — with Jessica

The weather will be far clearer and sunnier than you were expecting, so take the opportunity to walk off that hearty brunch with a stroll along the sea front. That’ll blow out the cobwebs.

Galway bay. Galway bay.

When the cold gets too much, head back towards town and duck into Charlie Byrne’s, the independent bookshop. Spend some time in there browsing the shelves and don’t leave without buying something to remember it by.

By 1pm or so, it’s time for some lunch. This is the perfect opportunity to try the sushi at Wa Cafe near the harbour. They have an extensive range of irresistible nigiri, so just go ahead and get one of everything. The standouts are the local oyster, mackerel, and salmon.

Checked in at wa cafe. Sushi — with Jessica

From there, head to Tigh Cóilí for the 2pm session. Have a Guinness and enjoy the tunes.

Checked in at Tigh Cóilí. Afternoon session — with Jessica

Spend the rest of the afternoon strolling around town. You can walk through the market at St. Nicholas Church, and check out the little Claddagh ring museum at Thomas Dillon’s—the place where you got your wedding rings at the close of the twentieth century.

Return the ring from whence it came!

If you need a pick-me-up, get another coffee from Coffeewerk + Press, but this time grab a spot at the window upstairs so you can watch the world go by outside.

By 6pm, you’ll have a hankering for some more seafood. Head over to Hooked on Henry Street. Order a plate of oysters, and a cup of seafood chowder. If they’ve got ceviche, try that too.

Checked in at Hooked. — with Jessica

Walk back along the canal and stop in to The Salt House to sample a flight of beers from Galway Bay Brewery. There’ll probably be some live music.

Checked in at The Salt House. 🍺 — with Jessica

With your appetite suitably whetted, head on over to Cava Bodega for some classic tapas. Be sure to have the scallops with black pudding.

Checked in at Cava Bodega. Scallops with black pudding — with Jessica

The evening session at Tigh Cóilí starts at 8:30pm on a Sunday so you can probably still catch it. You’ll hear some top-class playing from the likes of Mick Conneely and friends.

Checked in at Tigh Cóilí. 🎶🎻 — with Jessica

And when that’s done, there’s still time to catch the session over at The Crane.

Checked in at The Crane Bar. 🎶🎻 — with Jessica

Monday, March 2nd

After a nice lie-in, check out of the hotel and head to McCambridge’s on Shop Street for some breakfast upstairs. A nice bowl of porridge will set you up nicely for the journey back to Dublin.

If you catch the 11am train, you’ll arrive in Dublin by 1:30pm—just enough time to stop off in The Winding Stair for some excellent lunch before heading on to the airport.

Checked in at The Winding Stair. Lunch in Dublin — with Jessica

Getting there

Aer Lingus flies daily from Gatwick to Dublin. Dublin’s Heuston Station has multiple trains per day going to Galway.

Tuesday, March 3rd, 2020

Abolish Silicon Valley by Wendy Liu

I got an email a little while back from Michael at Repeater Books asking me if I wanted an advance copy of Abolish Silicon Valley: How to Liberate Technology From Capitalism by Wendy Liu. Never one to look a gift horse in the mouth, I said “Sure!”

I’m happy to say that the book is most excellent …or at least mostly excellent.

Contrary to what the book title—or its blurb—might tell you, this is a memoir first and foremost. It’s a terrific memoir. It’s utterly absorbing.

Just as the most personal songs can have the most universal appeal, this story feels deeply personal while being entirely accessible. You don’t have to be a computer nerd to sympathise with the struggles of a twenty-something in a start-up trying to make sense of the world. This well-crafted narrative will resonate with any human. It calls to mind Ellen Ullman’s excellent memoir, Close to the Machine—not a comparison I make lightly.

But as you might have gathered from the book’s title, Abolish Silicon Valley isn’t being marketed as a memoir:

Abolish Silicon Valley is both a heartfelt personal story about the wasteful inequality of Silicon Valley, and a rallying call to engage in the radical politics needed to upend the status quo.

It’s true that the book finishes with a political manifesto but that’s only in the final chapter or two. The majority of the book is the personal story, and just as well. Those last few chapters really don’t work in this setting. They feel tonally out of place.

Don’t get me wrong, the contents of those final chapters are right up my alley—they’re preaching to the converted here. But I think they would be better placed in their own publication. The heavily-researched academic style jars with the preceding personal narrative.

Abolish Silicon Valley is 80% memoir and 20% manifesto. I worry that the marketing isn’t making that clear. It would be a shame if this great book didn’t find its audience.

The book will be released on April 14th. It’s available to pre-order now. I highly recommend doing just that. I think you’ll really enjoy it. But if you get mired down in the final few chapters, know that you can safely skip them.

Insecure

Universal access is at the heart of the World Wide Web. It’s also something I value when I’m building anything on the web. Whatever I’m building, I want you to be able to visit using whatever browser or device that you choose.

Just to be clear, that doesn’t mean that you’re going to have the same experience in an old browser as you are in the latest version of Firefox or Chrome. Far from it. Not only is that not feasible, I don’t believe it’s desirable either. But if you’re using an old browser, while you might not get to enjoy the newest CSS or JavaScript, you should still be able to access a website.

Applying the principle of progressive enhancement makes this eminently doable. As long as I build in a layered way, everyone gets access to the barebones HTML, even if they can’t experience newer features. Crucially, as long as I’m doing some feature detection, those newer features don’t harm older browsers.
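
To make that layering concrete, here’s a minimal sketch of CSS feature detection using a feature query. The .content class and the grid values are placeholders for illustration, not code from any particular site:

```css
/* Every browser gets the readable, single-column baseline. */
.content {
  max-width: 40em;
  margin: 0 auto;
}

/* Only browsers that understand both feature queries and grid get
   the multi-column layout. Older browsers skip this block entirely,
   so the enhancement can't harm them. */
@supports (display: grid) {
  .content {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(15em, 1fr));
    grid-gap: 1em;
  }
}
```

The baseline works everywhere; the enhancement is opt-in by capability rather than by browser version.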

But there’s one area where maintaining backward compatibility might well have an adverse effect on modern browsers: security.

I don’t just mean whether or not you’re serving sites over HTTPS. Even if you’re using TLS—Transport Layer Security—not all security is created equal.

Take a look at Mozilla’s very handy SSL Configuration Generator. You get to choose from three options:

  1. Modern. Services with clients that support TLS 1.3 and don’t need backward compatibility.
  2. Intermediate. General-purpose servers with a variety of clients, recommended for almost all systems.
  3. Old. Compatible with a number of very old clients, and should be used only as a last resort.

Because I value universal access, I should really go for the “old” setting. That ensures my site is accessible all the way back to Android 2.3 and Safari 1. But if I do that, I will be supporting TLS 1.0. That’s not good. My site is potentially vulnerable.

Alright then, I’ll go for “intermediate”—that’s the recommended level anyway. Now I’m no longer providing TLS 1.0 support. But that means some older browsers can no longer access my site.
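
In server configuration terms, the difference between those two settings comes down to which protocol versions you allow. Here’s a paraphrased nginx sketch, not the generator’s full output (which also covers cipher suites, certificate settings, and more):

```nginx
# "Old": reaches very old clients, but enables the vulnerable TLS 1.0
ssl_protocols TLSv1 TLSv1.1 TLSv1.2 TLSv1.3;

# "Intermediate": drops TLS 1.0 and 1.1, and with them some older browsers
ssl_protocols TLSv1.2 TLSv1.3;
```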

This is exactly the situation I found myself in with The Session. I had a score of A+ from SSL Labs. I was feeling downright smug. Then I got emails from actual users. One had picked up an old Samsung tablet second hand. Another was using an older version of Safari. Neither could access the site.

Sure enough, if you cut off TLS 1.0, you cut off Safari below version six.

Alright, then. Can’t they just upgrade? Well …no. Apple has tied Safari to OS X. If you can’t upgrade your operating system, you can’t upgrade your browser. So if you’re using OS X Mountain Lion, you’re stuck with an insecure version of Safari.

Fortunately, you can use a different browser. It’s possible to install, say, Firefox 37 which supports TLS 1.2.

On desktop, that is. If you’re using an older iPhone or iPad and you can’t upgrade to a recent version of iOS, you’re screwed.

This isn’t an edge case. This is exactly the kind of usage that iPads excel at: you got the device a few years back just to do some web browsing and not much else. It still seems to work fine, and you have no incentive to buy a brand new iPad. And nor should you have to.

In that situation, you’re stuck using an insecure browser.

As a site owner, I can either make security my top priority, which means you’ll no longer be able to access my site. Or I can provide you access, which makes my site less secure for everyone. (That’s what I’ve done on The Session and now my score is capped at B.)

What I can’t do is tell you to install a different browser, because you literally can’t. Sure, technically you can install something called Firefox from the App Store, or you can install something called Chrome. But neither have anything to do with their desktop counterparts. They’re differently skinned versions of Safari.

Apple refuses to allow browsers with any other rendering engine to be installed. Their reasoning?

Security.

Telling the story of performance

At Clearleft, we’ve worked with quite a few clients on site redesigns. It’s always a fascinating process, particularly in the discovery phase. There’s that excitement of figuring out what’s currently working, what’s not working, and what’s missing completely.

The bulk of this early research phase is spent diving into the current offering. But it’s also the perfect time to do some competitor analysis—especially if we want some answers to the “what’s missing?” question.

It’s not all about missing features though. Execution is equally important. Our clients want to know how their users’ experience shapes up compared to the competition. And when it comes to user experience, performance is a huge factor. As Andy says, performance is a UX problem.

There’s no shortage of great tools out there for measuring (and monitoring) performance metrics, but they’re mostly aimed at developers. Quite rightly. Developers are the ones who can solve most performance issues. But that does make the tools somewhat impenetrable if you don’t speak the language of “time to first byte” and “first contentful paint”.

When we’re trying to show our clients the performance of their site—or their competitors—we need to tell a story.

Web Page Test is a terrific tool for measuring performance. It can also be used as a story-telling tool.

You can go to webpagetest.org/easy if you don’t need to tweak settings much beyond the typical site visit (slow 3G on mobile). Pop in your client’s URL and, when the test is done, you get a valuable but impenetrable waterfall chart. It’s not exactly the kind of thing I’d want to present to a client.

Fortunately there’s an attention-grabbing output from each test: video. Download the video of your client’s site loading. Then repeat the test with the URL of a competitor. Download that video too. Repeat for as many competitor URLs as you think appropriate.

Now take those videos and play them side by side. Presentation software like Keynote is perfect for showing multiple videos like this.

This is so much more effective than showing a table of numbers! Clients get to really feel the performance difference between their site and their competitors.

Running all those tests can take time though. But there are some other tools out there that can give a quick dose of performance information.

SpeedCurve recently unveiled Page Speed Benchmarks. You can compare the performance of sites within a particular sector like travel, retail, or finance. By default, you’ll get a filmstrip view of all the sites loading side by side. Click through on each one and you can get the video too. It might take a little while to gather all those videos, but it’s quicker than using Web Page Test directly. And it might be that the filmstrip view is impactful enough for telling your performance story.

If, during your discovery phase, you find that performance is being badly affected by third-party scripts, you’ll need some way to communicate that. Request Map Generator is fantastic for telling that story in a striking visual way. Pop the URL in there and then take a screenshot of the resulting visualisation.

The beginning of a redesign project is also the time to take stock of current performance metrics so that you can compare the numbers after your redesign launches. Crux.run is really great for tracking performance over time. You won’t get any videos but you will get some very appealing charts and graphs.

Web Page Test, Page Speed Benchmarks, and Request Map Generator are great for telling the story of what’s happening with performance right now. Crux.run balances that with the story of performance over time.

Measuring performance is important. Communicating the story of performance is equally important.

Tuesday, February 18th, 2020

Utopia

Trys and James recently unveiled their Utopia project. They’ve been tinkering away at it behind the scenes for quite a while now.

You can check out the website and read the blog to get the details of how it accomplishes its goal:

Elegantly scale type and space without breakpoints.

I may well be biased, but I really like this project. I’ve been asking myself why I find it so appealing. Here are a few of the attributes of Utopia that strike a chord with me…

It’s collaborative

Collaboration is at the heart of Clearleft’s work. I know everyone says that, but we’ve definitely seen a direct correlation: projects with high levels of collaboration are invariably more successful than projects where people are siloed.

The genesis for Utopia came about after Trys and James worked together on a few different projects. It’s all too easy to let design and development splinter off into their own caves, but on these projects, Trys and James were working (literally) side by side. This meant that they could easily articulate frustrations to one another, and more important, they could easily share their excitement.

The end result of their collaboration is some very clever code. There’s an irony here. This code could be used to discourage collaboration! After all, why would designers and developers sit down together if they can just pass these numbers back and forth?

But I don’t think that Utopia will appeal to designers and developers who work in that way. Born in the spirit of collaboration, I suspect that it will mostly benefit people who value collaboration.

It’s intrinsic

If you’re a control freak, you may not like Utopia. The idea is that you specify the boundaries of what you’re trying to accomplish—minimum/maximum font sizes, minimum/maximum screen sizes, and some modular scales. Then you let the code—and the browser—do all the work.

On the one hand, this feels like surrendering control. But on the other hand, because the underlying system is so robust, it’s a way of guaranteeing quality, even in situations you haven’t accounted for.

If someone asks you, “What size will the body copy be when the viewport is 850 pixels wide?”, your answer would have to be “I don’t know …but I do know that it will be appropriate.”

This feels like a very declarative way of designing. It reminds me of the ethos behind Andy and Heydon’s site, Every Layout. They call it algorithmic layout design:

Employing algorithmic layout design means doing away with @media breakpoints, “magic numbers”, and other hacks, to create context-independent layout components. Your future design systems will be more consistent, terser in code, and more malleable in the hands of your users and their devices.

See how breakpoints are mentioned as being a very top-down approach to layout? Remember the tagline for Utopia, which aims for fluid responsive design?

Elegantly scale type and space without breakpoints.

Unsurprisingly, Andy really likes Utopia:

As the co-author of Every Layout, my head nearly fell off from all of the nodding when reading this because this is the exact sort of approach that we preach: setting some rules and letting the browser do the rest.

Heydon describes this mindset as automating intent. I really like that. I think that’s what Utopia does too.

As Heydon said at Patterns Day:

Be your browser’s mentor, not its micromanager.

The idea is that you give it rules, you give it axioms or principles to work on, and you let it do the calculation. You work with the in-built algorithms of the browser and of CSS itself.

This is all possible thanks to improvements to CSS like calc, flexbox and grid. Jen calls this approach intrinsic web design. Last year, I liveblogged her excellent talk at An Event Apart called Designing Intrinsic Layouts.

Utopia feels like it has the same mindset as algorithmic layout design and intrinsic web design. Trys and James are building on the great work already out there, which brings me to the final property of Utopia that appeals to me…

It’s iterative

There isn’t actually much that’s new in Utopia. It’s a combination of existing techniques. I like that. As I said recently:

I’m a great believer in the HTML design principle, Evolution Not Revolution:

It is better to evolve an existing design rather than throwing it away.

First of all, Utopia uses the idea of modular scales in typography. Tim Brown has been championing this idea for years.

Then there’s the idea of typography being fluid and responsive—just like Jason Pamental has been speaking and writing about.

On the code side, Utopia wouldn’t be possible without the work of Mike Riethmuller and his breakthroughs on responsive and fluid typography, which led to Tim’s work on CSS locks.

Utopia takes these building blocks and combines them. So if you’re wondering if it would be a good tool for one of your projects, you can take an equally iterative approach by asking some questions…

Are you using fluid type?

Do your font-sizes increase in proportion to the width of the viewport? I don’t mean in sudden jumps with @media breakpoints—I mean some kind of relationship between font size and the vw (viewport width) unit. If so, you’re probably using some kind of mechanism to cap the minimum and maximum font sizes—CSS locks.
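
Here’s a simplified sketch of that kind of lock, in the spirit of the technique rather than anyone’s exact production code. The sizes and breakpoints are made-up values:

```css
/* 16px at a 320px viewport, growing linearly to 24px at 960px,
   then locked at both ends. */
html {
  font-size: 16px;
}

@media (min-width: 320px) {
  html {
    /* interpolate across the 640px range between the breakpoints */
    font-size: calc(16px + 8 * ((100vw - 320px) / 640));
  }
}

@media (min-width: 960px) {
  html {
    font-size: 24px;
  }
}
```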

I’m using that technique on Resilient Web Design. But I’m not changing the relative difference between different sized elements—body copy, headings, etc.—as the screen size changes.

Are you using modular scales?

Does your type system have some kind of ratio that describes the increase in type sizes? You probably have more than one ratio (unlike Resilient Web Design). The ratio for small screens should probably be smaller than the ratio for big screens. But rather than jump from one ratio to another at an arbitrary breakpoint, Utopia allows the ratio to be fluid.

So it’s not just that font sizes are increasing as the screen gets larger; the comparative difference is also subtly changing. That means there’s never a sudden jump in font size at any time.

Are you using custom properties?

A technical detail this, but the magic of Utopia relies on two powerful CSS features: calc() and custom properties. These two workhorses are used by Utopia to generate some CSS that you can stick at the start of your stylesheet. If you ever need to make changes, all the parameters are defined at the top of the code block. Tweak those numbers and watch everything cascade.

You’ll see that there’s one—and only one—media query in there. This is quite clever. Usually with CSS locks, you’d need to have a media query for every different font size in order to cap its growth at the maximum screen size. With Utopia, the maximum screen size—100vw—is abstracted into a variable (a custom property). The media query then changes its value to be the upper end of your CSS lock. So it doesn’t matter how many different font sizes you’re setting: because they all use that custom property, one single media query takes care of capping the growth of every font size declaration.
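
As a sketch of the idea (my paraphrase, not Utopia’s actual generated code, and the numbers are placeholders):

```css
:root {
  /* the fluid value tracks the viewport on small screens… */
  --fluid-screen: 100vw;
}

@media (min-width: 1140px) {
  :root {
    /* …and gets pinned at the maximum screen size. Because every
       font-size declaration below interpolates against this one
       custom property, this single media query caps all of them. */
    --fluid-screen: 1140px;
  }
}

h1 {
  font-size: calc(28px + (48 - 28) * ((var(--fluid-screen) - 320px) / (1140 - 320)));
}

p {
  font-size: calc(16px + (20 - 16) * ((var(--fluid-screen) - 320px) / (1140 - 320)));
}
```

However many font sizes you set, the cap lives in that one media query.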

If you’re already using CSS locks, modular scales, and custom properties, Utopia is almost certainly going to be a good fit for you.

If you’re not yet using those techniques, but you’d like to, I highly recommend using Utopia on your next project.

Friday, February 7th, 2020

IncrementURL

Last month I wrote some musings on default browser behaviours. When it comes to all the tasks that browsers do for us, the most fundamental is taking a URL, fetching its contents and giving us the results. As part of that process, browsers also show us the URL of the page currently loaded in a tab or window.

But even at this fundamental level, there are some differences from browser to browser.

Safari only shows you the domain name—and any subdomain names—by default. It looks nice and tidy, but it obfuscates what page you’re on (until you click on the domain name). This is bad.

Chrome shows you the full URL, nice and straightforward. This is neutral.

Firefox, like Chrome, shows you the full URL, but with a subtle difference. The important part of the URL—usually the domain name—is subtly highlighted in a darker shade of grey. This is good.

The reason I say it’s usually the domain name is because what Firefox actually highlights is the eTLD+1.

The what now?

Well, if you’re looking at a page on adactio.com, that’s the important bit. But what if you’re looking at a page on adactio.github.io? The domain name is important, but so is the subdomain.

It turns out there’s a list out there of which sites and top level domains allow registrations like this. This is the list that Firefox is using for its shading behaviour in displaying URLs.

Safari, by the way, does not use this list. These URLs are displayed identically in Safari, the phisherman’s friend:

  • example.com
  • example.github.io
  • github.example.com

Whereas Firefox displays them with the eTLD+1 (the part that matters) shaded darker:

  • example.com
  • example.github.io
  • github.example.com

I learned all this from Jake on a recent edition of HTTP 203. Nicolas Hoizey has written a nice little summary.

Jake acknowledges that what Apple is doing is suboptimal, what Firefox is doing is good, and then puts forward an idea for what Chrome could do. (But please note that this is Jake’s personal opinion; not an official proposal from the Chrome team.)

There’s some prior art here. It used to be that, if your SSL certificate included extended validation, the name would be shown in green next to the padlock symbol. So while my website—which uses regular SSL from Let’s Encrypt—would just have a padlock, Medium—which uses EV SSL—would have a padlock and the text “A Medium Corporation”.

Extended validation wasn’t quite the bulletproof verification it was cracked up to be. So browsers don’t use that interface pattern any more.

Jake suggests repurposing this pattern for all URLs. Pull out the important bit—eTLD+1—and show it next to the padlock.

Screenshots of @JaffaTheCake’s idea for separating out the eTLD+1 part of a URL in a browser’s address bar.

I like this. The full URL is still displayed. This proposal is more of an incremental change. An enhancement that is applied progressively, if you will.

I also like that it builds on existing interface patterns—Firefox’s URL treatment and the deprecated treatment of EV certs. In fact, I think the first step for Chrome should be to match Firefox’s current behaviour, and then go further with something like Jake’s proposal.

This kind of gradual change was exactly what Chrome did with displaying https and http domains.

Chrome treatment for HTTPS pages.

Jake mentions this in the video:

We’ve already seen that you have to take small steps here, like we did with the https change.

There’s a fascinating episode of the Freakonomics podcast called In Praise of Incrementalism. I’ve huffduffed it.

I’m a great believer in the HTML design principle, Evolution Not Revolution:

It is better to evolve an existing design rather than throwing it away.

I’d love to see Chrome take the first steps to Jake’s proposal by following Firefox’s lead.

Then again, I’d love it if Chrome followed Firefox’s lead in implementing subgrid.

Thursday, February 6th, 2020

Hydration

As you may have noticed, I’m a fan of progressive enhancement.

It’s not cool. It’s often at odds with “modern” web development, so I end up looking like an old man yelling at a cloud to get off my lawn. Or something.

At its heart though, progressive enhancement seems fairly uncontroversial and inoffensive to me. It’s an approach. A mindset. Here’s how I describe it in Resilient Web Design:

  1. Identify core functionality.
  2. Make that functionality available using the simplest possible technology.
  3. Enhance!

Progressive enhancement makes use of the principle of least power:

Choose the least powerful language suitable for a given purpose.

That’s step two of the three-step process. But the third step is vital.

I think a lot of the hostility towards progressive enhancement comes from a misunderstanding of that three-step process, perhaps thinking that it stops at step two. I’m sure that some have interpreted progressive enhancement as preventing developers from using the latest and greatest technology. Nothing could be further from the truth!

Taking a layered approach to building on the web gives you permission to try cutting‐edge JavaScript APIs, regardless of how many or how few browsers currently implement them.

The most common misunderstanding of progressive enhancement is that it’s inherently about JavaScript. That’s not true. You can apply progressive enhancement at every step of front-end development: HTML, CSS, and JavaScript.

But because of JavaScript’s strict error-handling model (at least compared to HTML and CSS), it’s in the JavaScript layer that the lack of a progressive enhancement mindset is most often felt.

That’s why I was saddened by the rise of frameworks and mindsets that assume the availability of JavaScript. Single page apps generally follow this assumption. Everything is delivered via JavaScript: content, markup, styles, and behaviour.

This leads to a terrible situation for performance. The user is left staring at a blank screen, waiting for something—anything!—to appear. Browsers are optimised to stream HTML as soon as they can. Delivering your content via JavaScript rather than HTML means you’re not taking advantage of that optimisation. Your users suffer.

But I was very heartened when I saw the pendulum start to swing back the other way a bit…

Let’s say you’re using a JavaScript framework like React. But the reason you’re using it isn’t because you’re doing anything particularly complex in the browser involving state management. You might be using React because you really like the way it encourages modularity and componentisation.

A few years ago, making a single page app was pretty much the only way you could use React. For you as a developer to experience the benefits of modularity and componentisation, users had to pay the price in the payload (and fragility) of client-side JavaScript.

That’s no longer the case. Now that we can run JavaScript on the server, it’s possible to build in a modular, componentised way and still use progressive enhancement.

When I first heard about Gatsby and Next.js, I thought that was the selling point. Run React on the server; send pre-generated HTML down the wire to the user; then enhance with client-side JavaScript.

But that’s not exactly how it works. The pre-generated HTML isn’t functional. It still needs a bucketload of JavaScript before it can do anything. The actual process is: Run React on the server; send pre-generated HTML down the wire to the user; then send everything again but this time in JavaScript, bundled with the entire React library.

This leads to a situation for users that’s almost worse than before. Instead of staring at a blank screen, now they get HTML lickety-split—excellent! But if they try to interact with what’s on screen, they’ll find that nothing is working yet. Even worse, once the JavaScript is delivered, and is being parsed, they probably can’t even scroll—their device is too busy interpreting all that JavaScript. Your users suffer.

All your content is sent twice. First HTML is sent from the server. These days this is called “server-side rendering”, even though for decades the technical term was “serving a web page” (I’m pretty sure the rendering part happens in a browser). Then a JavaScript library—plus all your bespoke JavaScript—is loaded. Then all your content is loaded again as JSON.

So you’ve got a facade of an interface that you can’t actually interact with until a deluge of JavaScript has been loaded, parsed and executed. The term used for this stage of the process is “hydration”, which makes it sound more like a relaxing treatment from Gwyneth Paltrow than the horrible user experience it is.

The idea is that subsequent navigations—which will happen with Ajax—should be snappy. But the price has already been paid by then. The initial loading experience is jagged and frustrating.

Don’t get me wrong: server-side rendering is great …if what you’re sending from the server is functional. It’s the combination of hollow HTML sent from the server, followed by a huge browser-freezing dump of JavaScript that is an anti-pattern.

This use of server-side rendering followed by hydration feels like progressive enhancement, because it separates out the delivery of markup and scripts. But it’s missing the mindset.

The layered approach of progressive enhancement echoes the separation of concerns in the front-end stack: HTML, CSS, and JavaScript—each layer expressing more power. But while these concepts are related, they’re not interchangeable. Separating out the layers of your tech stack isn’t necessarily progressive enhancement. If you have some HTML that relies on JavaScript to be useful, then there’s no benefit in separating that HTML into a separate payload. The HTML that you initially send down the wire needs to be functional (at least at a basic level) before the JavaScript arrives.

I was a little disappointed to see Kyle Simpson—who I admire greatly—conflate separation of concerns with progressive enhancement in his talk from JSCamp 2019:

This content is here. I can see it, and it’s even styled. But I can’t click on the damn button because nothing has loaded in the JavaScript layer yet.

Anybody experienced that where you’ve been on a web page and it’s not really fully functional yet? I can see something but I can’t actually make any usage of it yet.

These are all things that cropped out of our thought process that said: “Let’s build the web in layers. Let’s deliver it progressively in layers. Because that’s morally right. We call this progressive enhancement. And let’s not worry too much about all these potential user experience flaws that may happen.”

That’s a spot-on description of server-side rendering and hydration, but it’s a gross mischaracterisation of progressive enhancement.

That button that requires JavaScript to work? That should’ve been generated with JavaScript. (For example, if you’re building a complex web app, consider sending a read-only view down the wire in HTML—then add any interactive interface elements with JavaScript in the browser.)

If people are equating progressive enhancement with thoughtless server-side rendering and hydration, then I can see why they’d be hostile towards it.

Users would be better served with unprogressive non-enhancement:

You take some structured content, which follows the vertical flow of the document in a way that everyone understands.

Which people traverse easily by either dragging their scroll bar with their mouse, or operating the keyboard using the up and down keys, or using the spacebar.

Or if they’re using a touch device, simply flicking backwards and forwards in that easy way that we’ve all become used to. What you do is you take that, and you fucking well leave it alone.

Alas, that’s not what tools like Gatsby offer. The latest post on their blog is called Why Gatsby is better with JavaScript:

But what about sites or pages where there is no client-side interactivity? Even for those pages, Gatsby offers performance benefits by including JavaScript.

I beg to differ.

(By the way, that same blog post also initially tried to equate the performance hit of client-side JavaScript with the performance hit of images. Andy explains why that’s disingenuous.)

Hope is on the horizon for React in the form of partial hydration. I sincerely hope that it will become the default way of balancing server-side rendering with just-in-time client-side interaction.

The situation we have now is the worst of both worlds: server-side rendering followed by a tsunami of hydration. It has a whiff of progressive enhancement to it (because there’s a cosmetic separation of concerns) but it has none of the user benefits.

Wednesday, February 5th, 2020

Design systems roundup

When I started writing a post about architects, gardeners, and design systems, it was going to be a quick follow-up to my post about web standards, dictionaries, and design systems. I had spotted an interesting metaphor in one of Frank’s posts, and I thought it was worth jotting it down.

But after making that connection, I kept writing. I wanted to point out the fetishism we have for creation over curation; building over maintenance.

Then the post took a bit of a dark turn. I wrote about how the most commonly cited reasons for creating a design system—efficiency and consistency—are the same processes that have led to automation and dehumanisation in the past.

That’s where I left things. Others have picked up the baton.

Dave wrote a post called The Web is Industrialized and I helped industrialize it. What I said resonated with him:

This kills me, but it’s true. We’ve industrialized design and are relegated to squeezing efficiencies out of it through our design systems. All CSS changes must now have a business value and user story ticket attached to it. We operate more like Taylor and his stopwatch and Gantt and his charts, maximizing effort and impact rather than focusing on the human aspects of product development.

But he also points out the many benefits of systematising:

At the same time, I have seen first hand how design systems can yield improvements in accessibility, performance, and shared knowledge across a willing team. I’ve seen them illuminate problems in design and code. I’ve seen them speed up design and development allowing teams to build, share, and validate prototypes or A/B tests before undergoing costly guesswork in production. There’s value in these tools, these processes.

Emphasis mine. I think that’s a key phrase: “a willing team.”

Ethan tackles this in his post The design systems we swim in:

A design system that optimizes for consistency relies on compliance: specifically, the people using the system have to comply with the system’s rules, in order to deliver on that promised consistency. And this is why that, as a way of doing something, a design system can be pretty dehumanizing.

But a design system need not be a constraining straitjacket—a means of enforcing consistency by keeping creators from colouring outside the lines. Used well, a design system can be a tool to give creators more freedom:

Does the system you work with allow you to control the process of your work, to make situational decisions? Or is it simply a set of rules you have to follow?

This is key. A design system is the product of an organisation’s culture. That’s something that Brad digs into in his post, Design Systems, Agile, and Industrialization:

I definitely share Jeremy’s concern, but also think it’s important to stress that this isn’t an intrinsic issue with design systems, but rather the organizational culture that exists or gets built up around the design system. There’s a big difference between having smart, reusable patterns at your disposal and creating a dictatorial culture designed to enforce conformity and swat down anyone coloring outside the lines.

Brad makes a very apt comparison with Agile:

Not Agile the idea, but the actual Agile reality so many have to suffer through.

Agile can be a liberating empowering process, when done well. But all too often it’s a quagmire of requirements, burn rates, and story points. We need to make sure that design systems don’t suffer the same fate.

Jeremy’s thoughts on industrialization definitely struck a nerve. Sure, design systems have the ability to dehumanize and that’s something to actively watch out for. But I’d also say to pay close attention to the processes and organizational culture we take part in and contribute to.

Matthew Ström weighed in with a beautifully-written piece called Breaking looms. He provides historical context to the question of automation by relaying the story of the Luddite uprising. Automation may indeed be inevitable, according to his post, but he also provides advice on how to approach design systems today:

We can create ethical systems based in detailed user research. We can insist on environmental impact statements, diversity and inclusion initiatives, and human rights reports. We can write design principles, document dark patterns, and educate our colleagues about accessibility.

Finally, the ouroboros was complete when Frank wrote down his thoughts in a post called Who cares?. For him, the issue of maintenance and care is crucial:

Care applies to the built environment, and especially to digital technology, as social media becomes the weather and the tools we create determine the expectations of work to be done and the economic value of the people who use those tools. A well-made design system created for the right reasons is reparative. One created for the wrong reasons becomes a weapon for displacement. Tools are always beholden to values. This is well-trodden territory.

Well-trodden territory indeed. Back in 2015, Travis Gertz wrote about Design Machines:

Designing better systems and treating our content with respect are two wonderful ideals to strive for, but they can’t happen without institutional change. If we want to design with more expression and variation, we need to change how we work together, build design teams, and forge our tools.

Also on the topic of automation, in 2018 Cameron wrote about Design systems and technological disruption:

Design systems are certainly a new way of thinking about product development, and introduce a different set of tools to the design process, but design systems are not going to lessen the need for designers. They will instead increase the number of products that can be created, and hence increase the demand for designers.

And in 2019, Kaelig wrote:

In order to be fulfilled at work, Marx wrote that workers need “to see themselves in the objects they have created”.

When “improving productivity”, design systems tooling must be mindful of not turning their users’ craft into commodities, alienating them, like cogs in a machine.

All of this is reminding me of Kranzberg’s first law:

Technology is neither good nor bad; nor is it neutral.

I worry that sometimes the messaging around design systems paints them as an inherently positive thing. But design systems won’t fix your problems:

Just stay away from folks who try to convince you that having a design system alone will solve something.

It won’t.

It’s just the beginning.

At the same time, a design system need not be the gateway drug to some kind of post-singularity future where our jobs have been automated away.

As always, it depends.

Remember what Frank said:

A well-made design system created for the right reasons is reparative. One created for the wrong reasons becomes a weapon for displacement.

The reasons for creating a design system matter. Those reasons will probably reflect the values of the company creating the system. At the level of reasons and values, we’ve gone beyond the bounds of the hyperobject of design systems. We’re dealing in the area of design ops—the whys of systemising design.

This is why I’m so wary of selling the benefits of design systems in terms of consistency and efficiency. Those are obviously tempting money-saving benefits, but followed to their conclusion, they lead down the dark path of enforced compliance and eventually, automation.

But if the reason you create a design system is to empower people to be more creative, then say that loud and proud! I know that creativity, autonomy and empowerment is a tougher package to sell than consistency and efficiency, but I think it’s a battle worth fighting.

Design systems are neither good nor bad (nor are they neutral).

Addendum: I’d just like to say how invigorating it’s been to read the responses from Dave, Ethan, Brad, Matthew, and Frank …all of them writing on their own websites. Rumours of the demise of blogging may have been greatly exaggerated.

Monday, February 3rd, 2020

Three books

Lurking: How a Person Became a User by Joanne McNeil will be published on February 25th:

In Lurking, Joanne McNeil digs deep and identifies the primary (if sometimes contradictory) concerns of people online: searching, safety, privacy, identity, community, anonymity, and visibility. She charts what it is that brought people online and what keeps us here even as the social equations of digital life—what we’re made to trade, knowingly or otherwise, for the benefits of the internet—have shifted radically beneath us. It is a story we are accustomed to hearing as tales of entrepreneurs and visionaries and dynamic and powerful corporations, but there is a more profound, intimate story that hasn’t yet been told.

Enemy of All Mankind: A True Story of Piracy, Power, and History’s First Global Manhunt by Steven Johnson will be published on May 12th:

Henry Every was the seventeenth century’s most notorious pirate. The press published wildly popular—and wildly inaccurate—reports of his nefarious adventures. The British government offered enormous bounties for his capture, alive or (preferably) dead. But Steven Johnson argues that Every’s most lasting legacy was his inadvertent triggering of a major shift in the global economy. Enemy of All Mankind focuses on one key event—the attack on an Indian treasure ship by Every and his crew—and its surprising repercussions across time and space. It’s the gripping tale of one of the most lucrative crimes in history, the first international manhunt, and the trial of the seventeenth century.

How To Future: Leading and Sense-Making in an Age of Hyperchange by Scott Smith with Madeline Ashby will be published on July 3rd:

Successfully designing for a future requires a picture of that future—a useful map of the horizons ahead that can be used for wayfinding, identifying emerging opportunities or risks. Accurately developing this map means investing in better awareness of signals about the future, understanding trends in context, developing rich insights about what those signals indicate—relative to companies, people, citizens or stakeholders. It also means cultivating ways to share these future insights through tangible yet provocative scenarios or stories, turn these into prototypes, or connect them to strategies.

Friday, January 31st, 2020

Union

The nation I live in has decided to impose sanctions on itself. The government has yet to figure out the exact details. It won’t be good.

Today marks the day that the ironically-named United Kingdom of Great Britain and Northern Ireland officially leaves the European Union. Nothing will change on a day to day basis (until the end of this year, when the shit really hits the fan).

Looking back on 2019, I had the pleasure and privilege of visiting places that will remain in the European Union. Hamburg, Düsseldorf, Utrecht, Miltown Malbay, Kinsale, Madrid, Amsterdam, Paris, Frankfurt, Antwerp, Berlin, Vienna, Cobh.

Maybe I should do a farewell tour in 2020.

Grüße aus Hamburg!

Auf Wiedersehen, Düsseldorf!

Going for a stroll in Utrecht at dusk.

The road to Miltown.

Checked in at Kinsale Harbour. — with Jessica

Checked in at La Casa del Bacalao. Tapas! — with Jessica

Hello Amsterdam!

Indoor aviation.

Guten Tag, Frankfurt.

Catch you later, Antwerp.

The Ballardian exterior of Tempelhof.

Losing my religion.

Boats in Cobh.