Saturday, April 4th, 2020

A bit of Blarney

I don’t talk that much on here about my life’s work. Contrary to appearances, my life’s work is not banging on about semantic markup, progressive enhancement, and service workers.

No, my life’s work is connected to Irish traditional music. Not as a musician, I hasten to clarify—while I derive enormous pleasure from playing tunes on my mandolin, that’s more of a release than a vocation.

My real legacy, it turns out, is being the creator and caretaker of The Session, an online community and archive dedicated to Irish traditional music. I might occasionally mention it here, but only when it’s related to performance, accessibility, or some other front-end aspect. I’ve never really talked about the history, meaning, and purpose of The Session.

Well, if you’re at all interested in that side of my life, you can now listen to me blather on about it for over an hour, thanks to the Blarney Pilgrims podcast.

I’ve been huffduffing episodes of this podcast for quite a while now. It’s really quite excellent. If you’re at all interested in Irish traditional music, the interviews with the likes of Kevin Burke, John Carty, Liz Carroll and Catherine McEvoy are hard to beat.

So imagine my surprise when they contacted me to ask me to chat and play some tunes! It really was an honour.

I was also a bit of a guinea pig. Normally they’d record these kinds of intimate interviews face to face, but what with The Situation and all, my chat was the first remotely recorded episode.

I’ve been on my fair share of podcasts—most recently the Design Systems Podcast—but this one was quite different. Instead of talking about my work on the web, this focussed on what I was doing before the web came along. So if you don’t want to hear me talking about my childhood, give this a miss.

But if you’re interested in hearing me reminisce and discuss the origin and evolution of The Session, have a listen. The chat is interspersed with some badly-played tunes from me on the mandolin, but don’t let that put you off.

Wednesday, April 1st, 2020

A reading of The Enormous Space by J.G. Ballard

Staying at home triggered a memory for me. I remembered reading a short story many years ago. It was by J.G. Ballard, and it described a man who makes the decision not to leave the house.

Being a J.G. Ballard story, it doesn’t end there. Over the course of the story, the house grows and grows in size, forcing the protagonist into ever-smaller refuges within his own home. It really stuck with me.

I tried tracking it down with some Duck Duck Going. Searching for “j.g. ballard weird short story” doesn’t exactly narrow things down, but eventually I spotted the book that I had read the story in. It was called War Fever. I think I read it back when I was living in Germany, so that would’ve been in the ’90s. I certainly don’t have a copy of the book any more.

But I was able to look up a table of contents and find a title for the story that was stuck in my head. It’s called The Enormous Space.

Alas, I couldn’t find any downloadable versions—War Fever doesn’t seem to be available for the Kindle.

Then I remembered the recent announcement from the Internet Archive that it was opening up the National Emergency Library. The usual limits on “checking out” books online are being waived while physical libraries remain closed.

I found The Complete Stories of J.G. Ballard and borrowed it just long enough to re-read The Enormous Space.

If anything, it’s creepier and weirder than I remembered. But it’s laced with more black comedy than I remembered.

I thought you might like to hear this story, so I made a recording of myself reading The Enormous Space.

Monday, March 30th, 2020

Living Through The Future

You can listen to an audio version of Living Through The Future.

Usually when we talk about “living in the future”, it’s something to do with technology: smartphones, satellites, jet packs… But I’ve never felt more like I’m living in the future than during The Situation.

On the one hand, there’s nothing particularly futuristic about living through a pandemic. They’ve occurred throughout history and this one could’ve happened at any time. We just happen to have drawn the short straw in 2020. Really, this should feel like living in the past: an outbreak of a disease that disrupts everyone’s daily life? Nothing new about that.

But there’s something dizzyingly disconcerting about the dominance of technology. This is the internet’s time to shine. Think you’re going crazy now? Imagine what it would’ve been like before we had our network-connected devices to keep us company. We can use our screens to get instant updates about technologies of world-shaping importance …like beds and face masks. At the same time as we’re starting to worry about getting hold of fresh vegetables, we can still make sure that whatever meals we end up making, we can share them instantaneously with the entire planet. I think that, despite William Gibson’s famous invocation, I always figured that the future would feel pretty futuristic all ‘round—not lumpy with old school matters rubbing shoulders with technology so advanced that it’s indistinguishable from magic.

When I talk about feeling like I’m living in the future, I guess what I mean is that I feel like I’m living at a time that will become History with a capital H. I start to wonder what we’ll settle on calling this time period. The Covid Point? The Corona Pause? 2020-P?

At some point we settled on “9/11” for the attacks of September 11th, 2001 (being a fan of ISO-8601, I would’ve preferred 2001-09-11, but I’ll concede that it’s a bit of a mouthful). That was another event that, even at the time, clearly felt like part of History with a capital H. People immediately gravitated to using historical comparisons. In the USA, the comparison was Pearl Harbour. Outside of the USA, the comparison was the Cuban missile crisis.

Another comparison between 2001-09-11 and what we’re experiencing now is how our points of reference come from fiction. Multiple eyewitnesses in New York described the September 11th attacks as being “like something out of a movie.” For years afterwards, the climactic showdowns in superhero movies that demolished skyscrapers no longer felt like pure escapism.

For The Situation, there’s no shortage of prior art to draw upon for comparison. If anything, our points of reference should be tales of isolation like Robinson Crusoe. The mundane everyday tedium of The Situation can’t really stand up to comparison with the epic scale of science-fictional scenarios, but that’s our natural inclination. You can go straight to plague novels like Stephen King’s The Stand or Emily St. John Mandel’s Station Eleven. Or you can get really grim and cite Cormac McCarthy’s The Road. But you can go the other direction too and compare The Situation with the cozy catastrophes of John Wyndham like Day Of The Triffids (or just be lazy and compare it to any of the multitude of zombie apocalypses—an entirely separate kind of viral dystopia).

In years to come there will be novels set during The Situation. Technically they will be literary fiction—or even historical fiction—but they’ll feel like science fiction.

I remember the Chernobyl disaster having the same feeling. It was really happening, it was on the news, but it felt like scene-setting for a near-future dystopian apocalypse. Years later, I was struck when reading Wolves Eat Dogs by Martin Cruz-Smith. In 2006, I wrote:

Halfway through reading the book, I figured out what it was: Wolves Eat Dogs is a Cyberpunk novel. It happens to be set in present-day reality but the plot reads like a science-fiction story. For the most part, the book is set in the post-apocalyptic landscape of Prypiat, near Chernobyl. This post-apocalyptic scenario just happens to be real.

The protagonist, Arkady Renko, is sent to this frightening hellish place following a somewhat far-fetched murder in Moscow. Killing someone with a minute dose of a highly radioactive material just didn’t seem like a very realistic assassination to me.

Then I saw the news about Alexander Litvinenko, the former Russian spy who died this week, quite probably murdered with a dose of polonium-210.

I’ve got the same tingling feeling about The Situation. Fact and fiction are blurring together. Past, present, and future aren’t so easy to differentiate.

I really felt it last week standing in the back garden, looking up at the International Space Station passing overhead on a beautifully clear and crisp evening. I try to go out and see the ISS whenever its flight path intersects with southern England. Usually I’d look up and try to imagine what life must be like for the astronauts and cosmonauts on board, confined to that habitat with nowhere to go. Now I look up and feel a certain kinship. We’re all experiencing a little dose of what that kind of isolation must feel like. Though, as the always-excellent Marina Koren points out:

The more experts I spoke with for this story, the clearer it became that, actually, we have it worse than the astronauts. Spending months cooped up on the ISS is a childhood dream come true. Self-isolating for an indefinite period of time because of a fast-spreading disease is a nightmare.

Whenever I look up at the ISS passing overhead I feel a great sense of perspective. “Look what we can do!”, I think to myself. “There are people living in space!”

Last week that feeling was still there but it was tempered with humility. Yes, we can put people in space, but here we are with our entire way of life put on pause by something so small and simple that it’s technically not even a form of life. It’s like we’re the martians in H.G. Wells’s War Of The Worlds; all-conquering and formidable, but brought low by a dose of dramatic irony, a Virus Ex Machina.

Thursday, March 26th, 2020

Apple’s attack on service workers

Apple aren’t the best at developer relations. But, bad as their communications can be, I’m willing to cut them some slack. After all, they’re not used to talking with the developer community.

John Wilander wrote a blog post that starts with some excellent news: Full Third-Party Cookie Blocking and More. Safari is catching up to Firefox and disabling third-party cookies by default. Wonderful! I’ve had third-party cookies disabled for a few years now, and while something occasionally breaks, it’s honestly a pretty great experience all around. Denying companies the ability to track users across sites is A Good Thing.

In the same blog post, John said that client-side cookies will be capped to a seven-day lifespan, as previously announced. Just to be clear, this only applies to client-side cookies. If you’re setting a cookie on the server, using PHP or some other server-side language, it won’t be affected. So persistent logins are still doable.
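To illustrate the distinction, here’s a rough sketch (the cookie name, value, and one-year lifespan are all made up for the example). A cookie written from JavaScript like this is the kind that gets capped at seven days; the equivalent cookie set from the server in a Set-Cookie response header is left alone:

// Client-side cookie: written by script, so subject to Safari’s seven-day cap.
document.cookie = 'prefs=abc123; max-age=' + (60 * 60 * 24 * 365) + '; path=/; secure; samesite=lax';

// The server-side equivalent (not capped) would be an HTTP response header, e.g.:
// Set-Cookie: prefs=abc123; Max-Age=31536000; Path=/; Secure; SameSite=Lax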

Then, in an audacious example of burying the lede, towards the end of the blog post, John announces that a whole bunch of other client-side storage technologies will also be capped to seven days. Most of the technologies are APIs that, like cookies, can be used to store data: Indexed DB, Local Storage, and Session Storage (though there’s no mention of the Cache API). At the bottom of the list is this:

Service Worker registrations

Okay, let’s clear up a few things here (because they have been so poorly communicated in the blog post)…

The seven day timer refers to seven days of Safari usage, not seven calendar days (although, given how often most people use their phones, the two are probably interchangeable). So if someone returns to your site within a seven day period of using Safari, the timer resets to zero, and your service worker gets a stay of execution. Lucky you.

This only applies to Safari. So if your site has been added to the home screen and your web app manifest has a value for the “display” property like “standalone” or “fullscreen”, the seven day timer doesn’t apply.

That piece of information was missing from the initial blog post. Since the blog post was updated to include this clarification, some people have taken this to mean that progressive web apps aren’t affected by the upcoming change. Not true. Only progressive web apps that have been added to the home screen (and that have an appropriate “display” value) will be spared. That’s a vanishingly small percentage of progressive web apps, especially on iOS. To add a site to the home screen on iOS, you need to dig and scroll through the share menu to find the right option. And you need to do this unprompted. There is no ambient badging in Safari to indicate that a site is installable. Chrome’s install banner isn’t perfect, but it’s better than nothing.

Just a reminder: a progressive web app is a website that

  • runs on HTTPS,
  • has a service worker,
  • and a web manifest.

Adding to the home screen is something you can do with a progressive web app (or any other website). It is not what defines progressive web apps.
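As an aside, the service worker part of that recipe only takes a few lines. Here’s a minimal sketch, assuming a script called sw.js sitting at the root of the site:

// Only register the service worker if the browser supports the API.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
  .then(function (registration) {
    console.log('Service worker registered with scope:', registration.scope);
  })
  .catch(function (error) {
    console.log('Service worker registration failed:', error);
  });
}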

In any case, this move to delete service workers after seven days of using Safari is very odd, and I’m struggling to find the connection to the rest of the blog post, which is about technologies that can store data.

As I understand it, with the crackdown on setting third-party cookies, trackers are moving to first-party technologies. So whereas in the past, a tracking company could tell its customers “Add this script element to your pages”, now they have to say “Add this script element and this script file to your pages.” That JavaScript file can then store a unique identifier on the client. This could be done with a cookie, with Local Storage, or with Indexed DB, for example. But I’m struggling to understand how a service worker script could be used in this way. I’d really like to see some examples of this actually happening.
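For comparison, the Local Storage version of that technique is trivial—something like this sketch (the key name is invented for the example):

// A first-party script storing a persistent identifier on the client.
var visitorID = localStorage.getItem('visitor-id');
if (!visitorID) {
  // Not a robust UUID—just enough to illustrate the idea.
  visitorID = Math.random().toString(36).slice(2);
  localStorage.setItem('visitor-id', visitorID);
}
// That identifier can then be attached to any tracking requests.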

The best explanation I can come up with for this move by Apple is that it feels like the neatest solution. That’s neat as in tidy, not as in nifty. It is definitely not a nifty solution.

If some technologies set by a specific domain are being purged after seven days, then the tidy thing to do is purge all technologies from that domain. Service workers are getting included in that dragnet.

Now, to be fair, browsers and operating systems are free to clean up storage space as they see fit. Caches, Local Storage, Indexed DB—all of those are subject to eventually getting cleaned up.

So I was curious. Wanting to give Apple the benefit of the doubt, I set about trying to find out how long service worker registrations currently last before getting deleted. Maybe this announcement of a seven day time limit would turn out to be not such a big change from current behaviour. Maybe currently service workers last for 90 days, or 60, or just 30.

Nope:

There was no time limit previously.

This is not a minor change. This is a crippling attack on service workers, a technology specifically designed to improve the user experience for return visits, whether it’s through improved performance or offline access.

I wouldn’t be so stunned had this announcement come with an accompanying feature that would allow Safari users to know when a website is a progressive web app that can be added to the home screen. But Safari continues to ignore the existence of progressive web apps. And now it will actively discourage people from using service workers.

If you’d like to give feedback on this ludicrous development, you can file a bug (down in the cellar in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying “Beware of the Leopard”).

No doubt there will still be plenty of Apple apologists telling us why it’s good that Safari has wished service workers into the cornfield. But make no mistake. This is a terrible move by Apple.

I will say this though: given The Situation we’re all living in right now, some good ol’ fashioned Hot Drama by a browser vendor behaving badly feels almost comforting.

Monday, March 23rd, 2020

Outlet

We’re all hunkering down in our homes. That seems to be true of our online homes too.

People are sharing their day-to-day realities on their websites and I’m here for it. Like, I’m literally here for it. I can’t go anywhere.

On an episode of the Design Observer podcast, Jessica Helfand puts this into context:

During times of crisis, people want to make things. There’s a surge in the keeping of journals when there’s a war… it’s a response to the feeling of vulnerability, like corporeal vulnerability. My life is under attack. I am imprisoned in my house. I have to make something to say I was here, to say I mattered, to say this day happened… It’s like visual graphic reassurance.

It’s not just about crisis though. Scott Kelly talks about the value of keeping a journal during prolonged periods of repetition. And he should know—he spent a year in space:

NASA has been studying the effects of isolation on humans for decades, and one surprising finding they have made is the value of keeping a journal. Throughout my yearlong mission, I took the time to write about my experiences almost every day. If you find yourself just chronicling the days’ events (which, under the circumstances, might get repetitive) instead try describing what you are experiencing through your five senses or write about memories. Even if you don’t wind up writing a book based on your journal like I did, writing about your days will help put your experiences in perspective and let you look back later on what this unique time in history has meant.

That said, just stringing a coherent sentence together can seem like too much during The Situation. That’s okay. Your online home can also provide relief and distraction through tidying up. As Ethan puts it:

let a website be a worry stone

It can be comforting to get into the zone doing housekeeping on your website. How about a bit of a performance audit? Or maybe look into more fluid typography? Or perhaps now is the time to tinker about with that dark mode you’ve been planning?

Whatever you end up doing, my point is that your website is quite literally an outlet. While you’re stuck inside, your website is not just a place you can go to, it’s a place you can control, a place you can maintain, a place you can tidy up, a place you can expand. Most of all, it’s a place you can lose yourself in, even if it’s just for a little while.

Friday, March 20th, 2020

Local

How are you doing? Are you holding up okay?

It’s okay if you’re not. This is a tough time.

It’s very easy to become despondent about the state of the world. If you tend to lean towards pessimism, The Situation certainly seems to be validating your worldview right now.

I’m finding that The Situation is also a kind of Rorschach test. If you’ve always felt that humanity wasn’t deserving of your faith—that “we are the virus”—then there’s plenty happening right now to bolster that opinion. But if you’ve always thought that human beings are fundamentally good and decent, there’s just as much happening to reinforce that viewpoint.

I’ve noticed concentric circles of feelings tied to geography—positive in the centre, and very negative at the edges. What I mean is, if you look at what’s happening in your building and your street, it’s quite amazing how people are pulling together:

Our street (and the guy who runs the nearby corner store) is self-organizing so that everyone’s looking out for each other, checking up on elderly and self-isolating folks, sharing contact details, picking up shopping if necessary, and generally just being good humans.

This goodwill extends just about to the level of city mayorships. But once you look further than that, things turn increasingly sour. At the country level, incompetence and mismanagement seem to be the order of the day. And once you expand out to the whole world, who can blame you for feeling overwhelmed with despair?

But the world is made up of countries, and countries are made up of communities, and these communities are made up of people who are pulling together and helping one another.

Best of all, you can absolutely be part of this wonderful effort. In normal times, civic activism would require you to take action, get out there, and march in the streets. Now you can be a local hero by staying at home.

That’s it. Stay inside, resist the urge to congregate, and chat to your friends and relatives online instead. If you do that, you are being a most excellent human being—the kind that restores your faith in humanity.

I know it feels grim and overwhelming but, again, look at what’s triggering those feelings—is it the national news? International? I know it’s important to stay informed about the big picture—this is a global pandemic, after all—but don’t lose sight of what’s close to hand. Look closer to home and you’ll see the helpers—heck, you are one of the helpers.

On Ev’s blog, Fiona Cameron Lister quotes the president of the Italian Society of Psychiatrists:

Fear of an epidemic is as old as mankind itself. In this case its effect is amplified by incomplete, even false information which has caused public confidence in our institutions to collapse.

She points out that the media are in the business of amplifying the outliers of negative behaviour—panic buying, greed, and worst-case scenarios. But she goes on to say:

It doesn’t take much to start a panic and we are teetering on the brink.

Not to be the “well, actually” guy but …well, actually…

That view of humanity as being poised on the brink of mass panic is the common consensus viewpoint; it even influences public policy. But the data doesn’t support this conclusion. (If you want details, I highly recommend reading Critical Mass: How One Thing Leads to Another by Philip Ball.) Thinking of ordinary people as being one emergency away from panicking is itself giving into fear.

I guess what I’m saying is, if you’re feeling misanthropic about your fellow humans right now, try rebalancing your intake. Yes, it’s good to keep yourself informed about national and global events, but make sure to give plenty of attention to the local level too. You may just find your heart warming and your spirits lifting.

After all, you’re a good person, right? And you probably also think of yourself as a fairly ordinary person, right? So if you’re doing the right thing—making small sacrifices and being concerned for your neighbours—then logic dictates that most other people are too.

I have faith in you:

When this is over, I hope we will be proud of how well we loved one another.

Thursday, March 19th, 2020

Nice

Yesterday was Wednesday. Wednesday evening is when I play in an Irish trad session at The Jolly Brewer. It’s a highlight of my week.

Needless to say, there was no session yesterday. I’ll still keep playing tunes while we’re all socially distancing, but it’s not quite the same. I concur with this comment:

COVID-19 has really made me realize that we need to be grateful for the people and activities we take for granted. Things like going out for food, seeing friends, going to the gym, etc., are fun, but are not essential for (physical) survival.

It reminds me of Brian Eno’s definition of art: art is anything we don’t have to do. It’s the same with social activities. We don’t have to go to concerts—we can listen to music at home. We don’t have to go to the cinema—we can watch films at home. We don’t have to go to conferences—we can read books and blog posts at home. We don’t have to go out to restaurants—all our nutritional needs can be met at home.

But it’s not the same though, is it?

I think about the book Station Eleven a lot. The obvious reason why I’d be thinking about it is that it describes a deadly global pandemic. But that’s not it. Even before The Situation, Station Eleven was on my mind for helping provide clarity on the big questions of life; y’know, the “what’s it all about?” questions like “what’s the meaning of life?”

Part of the reason I think about Station Eleven is its refreshingly humanist take on a post-apocalyptic society. As I discussed on this podcast episode a few years back:

It’s interesting to see a push-back against the idea that if society is removed we are going to revert to life being nasty, brutish and short. Things aren’t good after this pandemic wipes out civilisation, but people are trying to put things back together and get along and rebuild.

Related to that, Station Eleven describes a group of people in a post-pandemic world travelling around performing Shakespeare plays. At first I thought this was a ridiculous conceit. Then I realised that this was the whole point. We don’t have to watch Shakespeare to survive. But there’s a difference between surviving and living.

I’m quite certain that one positive outcome of The Situation will be a new-found appreciation for activities we don’t have to do. I’m looking forward to sitting in a pub with a friend or two, or going to see a band, or a play or a film, and just thinking “this is nice.”

Tuesday, March 17th, 2020

Home

Clearleft is a remote-working company right now. I mean, that’s hardly surprising—just about everyone I know is working from home.

We made this decision on Friday. It was clear that the spread of COVID-19 was going exponential (even with the very incomplete data available in the UK). Despite the wishy-washy advice from the government—which has since pivoted drastically—we made the decision to literally get ahead of the curve. We had one final get-together in the studio yesterday to plan logistics and pick up equipment. Then it was time to start this chapter.

I’ve purloined:

  • one Aeron chair,
  • one big monitor,
  • a wired keyboard,
  • a wireless mouse, and
  • noise-cancelling headphones.

Cassie kindly provided the use of her van to get that stuff home. The Aeron chair proved to be extremely tricky to get through my narrow front door. For a while there, it looked like I’d need to take the door off the hinges but with a whole lotta pushin’ and a-pullin’ Jessica and I managed to somehow get it in.

Now I’ve got a reasonable home studio set up, I can get back to working on that conference talk I’m prepar… Oh.

Yeah, I guess I’ve got a stay of execution on that. For the past few months I’ve had my head down preparing a new hour-long talk for An Event Apart. Yes, it takes me that long to put a talk together—it feels kinda like writing a book. I’d like to think it’s because I’m so meticulous, but the truth is that I’m just very slow at most things. Also, I’m a bad one for procrastinating.

For the past week or so, while I’ve been making pretty darn good progress on the talk, a voice in the back of my head has been whispering “Hey, maybe the conference won’t even happen!” In answer to which, the voice in the front of my head has been saying “Would you shut the fuck up? I’m trying to work here!”

I was due to debut the talk at An Event Apart Seattle in May. Sure enough, that event has quite rightly been cancelled. So have a lot of other excellent events. It’s a real shame, and my heart goes out to event organisers who pour so much of themselves into their events; their love, their care, and not least, their financial risk.

I speak at quite a few events every year. I really, really enjoy it. For one, public speaking is one of the few things I think I’m actually any good at. Also, I just love the chance to meet my peers and collectively nerd out together for a short while.

Then there’s the travel. This year I was planning on drastically reducing my plane travel. I had bagged myself speaking slots in European cities that I could reach by train: Cologne, Strasbourg, even Lisbon, along with domestic destinations like Nottingham, Manchester, and Plymouth. I was quite looking forward to some train adventures. But those will have to wait.

Right now I’m going to settle into this new home routine. It’s not entirely new to me. Back in the early 2000s, I worked from home as a freelancer. Back then, Jessica and I worked not just in the same room, but on opposite ends of the same table!

We’ve got more room now. Jessica has her own office space. I’m getting used to mine. But as Jessica pointed out:

I’ve worked from home for 20(!) years, I love it, I’m made for it, I can’t imagine not doing it—and I’ve gotten absolutely nothing done for the past week.

Newly WFH folks, the situation now is totally unconducive to concentration and productivity. Be gentle with yourselves.

Wednesday, March 11th, 2020

A curl in every port

A few years back, Zach Bloom wrote The History of the URL: Path, Fragment, Query, and Auth. He recently expanded on it and republished it on the Cloudflare blog as The History of the URL. It’s well worth the time to read the whole thing. It’s packed full of fascinating tidbits.

In the section on ports, Zach says:

The timeline of Gopher and HTTP can be evidenced by their default port numbers. Gopher is 70, HTTP 80. The HTTP port was assigned (likely by Jon Postel at the IANA) at the request of Tim Berners-Lee sometime between 1990 and 1992.

Ooh, I can give you an exact date! It was January 24th, 1992. I know this because of the hack week at CERN last year to recreate the first ever web browser.

Kimberly was spelunking down the original source code, when she came across this line in the HTUtils.h file:

#define TCP_PORT 80 /* Allocated to http by Jon Postel/ISI 24-Jan-92 */

We showed this to Jean-François Groff, who worked on the original web technologies like libwww, the forerunner to libcurl. He remembers that day. It felt like they had “made it”, receiving the official blessing of Jon Postel (in the same RFC, incidentally, that gave port 70 to Gopher).

Then he told us something interesting about the next line of code:

#define OLD_TCP_PORT 2784 /* Try the old one if no answer on 80 */

Port 2784? That seems like an odd choice. Most of us would choose something easy to remember.

Well, it turns out that 2784 is easy to remember if you’re Tim Berners-Lee.

Those were the last four digits of his parents’ phone number.

Tuesday, March 10th, 2020

Lighthouse bookmarklet

I use Firefox. You should too. It’s fast, secure, and more privacy-focused than the leading browser from the big G.

When it comes to web development, the CSS developer tooling in Firefox is second-to-none. But when it comes to JavaScript and network-related debugging (like service workers), Chrome’s tools are better than Firefox’s (for now). For example, Chrome has a tab in its developer tools that lets you run Lighthouse on the currently open tab.

Yesterday, I got the Calibre newsletter, which always has handy performance-related links from Karolina. She pointed to a Lighthouse extension for Firefox. “Excellent!”, I thought, and I immediately installed it. But I had some qualms about installing a plug-in from Google into a browser from Mozilla, particularly as the plug-in page says:

This is not a Recommended Extension. Make sure you trust it before installing

Well, I gave it a go. It turns out that all it actually does is redirect to the online version of Lighthouse. “Hang on”, I thought. “This could just be a bookmarklet!”

So I immediately uninstalled the browser extension and made this bookmarklet:

Lighthouse

Drag that up to your desktop browser’s bookmarks toolbar. Press it whenever you’re on a site that you want to test.
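If you’d rather roll your own, the whole thing boils down to a one-liner along these lines—assuming the online version of Lighthouse (PageSpeed Insights) accepts the page address as a url query parameter; the exact address my bookmarklet points at may differ:

javascript:window.open('https://developers.google.com/speed/pagespeed/insights/?url='+encodeURIComponent(location.href))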

Monday, March 9th, 2020

41 hours in Galway

It was my birthday recently. I’m a firm believer in the idea that birthday celebrations should last for more than 24 hours. A week is the absolute minimum.

For the day itself, I did indeed indulge in a most luxurious evening out with Jessica at The Little Fish Market in Hove (on the street where we used to live!). The chef, Duncan Ray, is an absolute genius and his love for all things fish-related shines through in his magnificent dishes.

But to keep the celebrations going, we also went on a weekend away to Galway, where I used to live decades ago. It was a quick trip but we packed in a lot. I joked at one point that it felt like one of those travel articles headlined with “36 hours in someplace.” I ran the numbers and it turned out we were in Galway for 41 hours, but I still thought it would be fun to recount events in the imperative style of one of those articles…

A surprisingly sunny day in Galway.

Saturday, February 29th

The 3:30pm train from Dublin will get you into Galway just before 6pm. The train station is right on the doorstep of Loam, the Michelin-starred restaurant where you’ve made your reservation. Enjoy a seven course menu of local and seasonal produce. Despite the quality of the dishes, you may find the overall experience is a little cool, and the service a touch over-rehearsed.

You’ll be released sometime between 8:30pm and 9pm. Stroll through Eyre Square and down Shop Street to the Jury’s Inn, your hotel. It’s nothing luxurious but it’s functional and the location is perfect. It’s close to everything without being in the middle of the noisy weekend action. The only noise you should hear is the rushing of the incredibly fast Corrib river outside your window.

Around 9:30pm, pop ‘round to Dominick Street to enjoy a cocktail in the America Village Apothecary. It’s only open two nights a week, and it’s a showcase of botanicals gathered in Connemara. Have them make you a tasty concoction and then spend time playing guess-that-smell with their specimen jars.

By 10:30pm you should be on your way round the corner to The Crane Bar on Sea Road. Go in the side entrance and head straight upstairs where the music session will be just getting started. Marvel at how chilled out it is for a Saturday night, order a pint, and sit and listen to some lovely jigs’n’reels. Don’t forget to occasionally pester one of the musicians by asking “What was that last tune called? Lovely set!”

Checked in at The Crane Bar. Great tunes! 🎻🎶 — with Jessica

Sunday, March 1st

Skip the hotel breakfast. Instead, get your day started with a coffee from Coffeewerk + Press. Get that coffee to go and walk over to Ard Bia at Nimmos, right at the Spanish Arch. Get there before it opens at 10am. There will already be a line. Once you’re in, order one of the grand brunch options and a nice big pot of tea. The black pudding hash will set you up nicely.

Checked in at Ard Bia at Nimmos. Black pudding hash and a pot of tea — with Jessica

While the weather is far clearer and sunnier than you were expecting, take the opportunity to walk off that hearty brunch with a stroll along the sea front. That’ll blow out the cobwebs.

Galway bay.

When the cold gets too much, head back towards town and duck into Charlie Byrne’s, the independent bookshop. Spend some time in there browsing the shelves and don’t leave without buying something to remember it by.

By 1pm or so, it’s time for some lunch. This is the perfect opportunity to try the sushi at Wa Cafe near the harbour. They have an extensive range of irresistible nigiri, so just go ahead and get one of everything. The standouts are the local oyster, mackerel, and salmon.

Checked in at wa cafe. Sushi — with Jessica

From there, head to Tigh Cóilí for the 2pm session. Have a Guinness and enjoy the tunes.

Checked in at Tigh Cóilí. Afternoon session — with Jessica

Spend the rest of the afternoon strolling around town. You can walk through the market at St. Nicholas Church, and check out the little Claddagh ring museum at Thomas Dillon’s—the place where you got your wedding rings at the close of the twentieth century.

Return the ring from whence it came!

If you need a pick-me-up, get another coffee from Coffeewerk + Press, but this time grab a spot at the window upstairs so you can watch the world go by outside.

By 6pm, you’ll have a hankering for some more seafood. Head over to Hooked on Henry Street. Order a plate of oysters, and a cup of seafood chowder. If they’ve got ceviche, try that too.

Checked in at Hooked. with Jessica

Walk back along the canal and stop in to The Salt House to sample a flight of beers from Galway Bay Brewery. There’ll probably be some live music.

Checked in at The Salt House. 🍺 — with Jessica

With your appetite suitably whetted, head on over to Cava Bodega for some classic tapas. Be sure to have the scallops with black pudding.

Checked in at Cava Bodega. Scallops with black pudding — with Jessica

The evening session at Tigh Cóilí starts at 8:30pm on a Sunday so you can probably still catch it. You’ll hear some top-class playing from the likes of Mick Conneely and friends.

Checked in at Tigh Cóilí. 🎶🎻 — with Jessica

And when that’s done, there’s still time to catch the session over at The Crane.

Checked in at The Crane Bar. 🎶🎻 — with Jessica

Monday, March 2nd

After a nice lie-in, check out of the hotel and head to McCambridge’s on Shop Street for some breakfast upstairs. A nice bowl of porridge will set you up nicely for the journey back to Dublin.

If you catch the 11am train, you’ll arrive in Dublin by 1:30pm—just enough time to stop off in The Winding Stair for some excellent lunch before heading on to the airport.

Checked in at The Winding Stair. Lunch in Dublin — with Jessica

Getting there

Aer Lingus flies daily from Gatwick to Dublin. Dublin’s Heuston Station has multiple trains per day going to Galway.

Tuesday, March 3rd, 2020

Abolish Silicon Valley by Wendy Liu

I got an email a little while back from Michael at Repeater Books asking me if I wanted an advance copy of Abolish Silicon Valley: How to Liberate Technology From Capitalism by Wendy Liu. Never one to look a gift horse in the mouth, I said “Sure!”

I’m happy to say that the book is most excellent …or at least mostly excellent.

Contrary to what the book title—or its blurb—might tell you, this is a memoir first and foremost. It’s a terrific memoir. It’s utterly absorbing.

Just as the most personal songs can have the most universal appeal, this story feels deeply personal while being entirely accessible. You don’t have to be a computer nerd to sympathise with the struggles of a twenty-something in a start-up trying to make sense of the world. This well-crafted narrative will resonate with any human. It calls to mind Ellen Ullman’s excellent memoir, Close to the Machine—not a comparison I make lightly.

But as you might have gathered from the book’s title, Abolish Silicon Valley isn’t being marketed as a memoir:

Abolish Silicon Valley is both a heartfelt personal story about the wasteful inequality of Silicon Valley, and a rallying call to engage in the radical politics needed to upend the status quo.

It’s true that the book finishes with a political manifesto but that’s only in the final chapter or two. The majority of the book is the personal story, and just as well. Those last few chapters really don’t work in this setting. They feel tonally out of place.

Don’t get me wrong, the contents of those final chapters are right up my alley—they’re preaching to the converted here. But I think they would be better placed in their own publication. The heavily-researched academic style jars with the preceding personal narrative.

Abolish Silicon Valley is 80% memoir and 20% manifesto. I worry that the marketing isn’t making that clear. It would be a shame if this great book didn’t find its audience.

The book will be released on April 14th. It’s available to pre-order now. I highly recommend doing just that. I think you’ll really enjoy it. But if you get mired down in the final few chapters, know that you can safely skip them.

Insecure

Universal access is at the heart of the World Wide Web. It’s also something I value when I’m building anything on the web. Whatever I’m building, I want you to be able to visit using whatever browser or device you choose.

Just to be clear, that doesn’t mean that you’re going to have the same experience in an old browser as you are in the latest version of Firefox or Chrome. Far from it. Not only is that not feasible, I don’t believe it’s desirable either. But if you’re using an old browser, while you might not get to enjoy the newest CSS or JavaScript, you should still be able to access a website.

Applying the principle of progressive enhancement makes this eminently doable. As long as I build in a layered way, everyone gets access to the barebones HTML, even if they can’t experience newer features. Crucially, as long as I’m doing some feature detection, those newer features don’t harm older browsers.

But there’s one area where maintaining backward compatibility might well have an adverse effect on modern browsers: security.

I don’t just mean whether or not you’re serving sites over HTTPS. Even if you’re using TLS—Transport Layer Security—not all security is created equal.

Take a look at Mozilla’s very handy SSL Configuration Generator. You get to choose from three options:

  1. Modern. Services with clients that support TLS 1.3 and don’t need backward compatibility.
  2. Intermediate. General-purpose servers with a variety of clients, recommended for almost all systems.
  3. Old. Compatible with a number of very old clients, and should be used only as a last resort.

Because I value universal access, I should really go for the “old” setting. That ensures my site is accessible all the way back to Android 2.3 and Safari 1. But if I do that, I will be supporting TLS 1.0. That’s not good. My site is potentially vulnerable.

Alright then, I’ll go for “intermediate”—that’s the recommended level anyway. Now I’m no longer providing TLS 1.0 support. But that means some older browsers can no longer access my site.

This is exactly the situation I found myself in with The Session. I had a score of A+ from SSL Labs. I was feeling downright smug. Then I got emails from actual users. One had picked up an old Samsung tablet second hand. Another was using an older version of Safari. Neither could access the site.

Sure enough, if you cut off TLS 1.0, you cut off Safari below version six.

Alright, then. Can’t they just upgrade? Well …no. Apple has tied Safari to OS X. If you can’t upgrade your operating system, you can’t upgrade your browser. So if you’re using OS X Mountain Lion, you’re stuck with an insecure version of Safari.

Fortunately, you can use a different browser. It’s possible to install, say, Firefox 37 which supports TLS 1.2.

On desktop, that is. If you’re using an older iPhone or iPad and you can’t upgrade to a recent version of iOS, you’re screwed.

This isn’t an edge case. This is exactly the kind of usage that iPads excel at: you got the device a few years back just to do some web browsing and not much else. It still seems to work fine, and you have no incentive to buy a brand new iPad. And nor should you have to.

In that situation, you’re stuck using an insecure browser.

As a site owner, I can either make security my top priority, which means you’ll no longer be able to access my site. Or I can provide you access, which makes my site less secure for everyone. (That’s what I’ve done on The Session and now my score is capped at B.)

What I can’t do is tell you to install a different browser, because you literally can’t. Sure, technically you can install something called Firefox from the App Store, or you can install something called Chrome. But neither have anything to do with their desktop counterparts. They’re differently skinned versions of Safari.

Apple refuses to allow browsers with any other rendering engine to be installed. Their reasoning?

Security.

Telling the story of performance

At Clearleft, we’ve worked with quite a few clients on site redesigns. It’s always a fascinating process, particularly in the discovery phase. There’s that excitement of figuring out what’s currently working, what’s not working, and what’s missing completely.

The bulk of this early research phase is spent diving into the current offering. But it’s also the perfect time to do some competitor analysis—especially if we want some answers to the “what’s missing?” question.

It’s not all about missing features though. Execution is equally important. Our clients want to know how their users’ experience shapes up compared to the competition. And when it comes to user experience, performance is a huge factor. As Andy says, performance is a UX problem.

There’s no shortage of great tools out there for measuring (and monitoring) performance metrics, but they’re mostly aimed at developers. Quite rightly. Developers are the ones who can solve most performance issues. But that does make the tools somewhat impenetrable if you don’t speak the language of “time to first byte” and “first contentful paint”.

When we’re trying to show our clients the performance of their site—or their competitors—we need to tell a story.

Web Page Test is a terrific tool for measuring performance. It can also be used as a story-telling tool.

You can go to webpagetest.org/easy if you don’t need to tweak settings much beyond the typical site visit (slow 3G on mobile). Pop in your client’s URL and, when the test is done, you get a valuable but impenetrable waterfall chart. It’s not exactly the kind of thing I’d want to present to a client.

Fortunately there’s an attention-grabbing output from each test: video. Download the video of your client’s site loading. Then repeat the test with the URL of a competitor. Download that video too. Repeat for as many competitor URLs as you think appropriate.

Now take those videos and play them side by side. Presentation software like Keynote is perfect for showing multiple videos like this.

This is so much more effective than showing a table of numbers! Clients get to really feel the performance difference between their site and their competitors.

Running all those tests can take time though. But there are some other tools out there that can give a quick dose of performance information.

SpeedCurve recently unveiled Page Speed Benchmarks. You can compare the performance of sites within a particular sector like travel, retail, or finance. By default, you’ll get a filmstrip view of all the sites loading side by side. Click through on each one and you can get the video too. It might take a little while to gather all those videos, but it’s quicker than using Web Page Test directly. And it might be that the filmstrip view is impactful enough for telling your performance story.

If, during your discovery phase, you find that performance is being badly affected by third-party scripts, you’ll need some way to communicate that. Request Map Generator is fantastic for telling that story in a striking visual way. Pop the URL in there and then take a screenshot of the resulting visualisation.

The beginning of a redesign project is also the time to take stock of current performance metrics so that you can compare the numbers after your redesign launches. Crux.run is really great for tracking performance over time. You won’t get any videos but you will get some very appealing charts and graphs.

Web Page Test, Page Speed Benchmarks, and Request Map Generator are great for telling the story of what’s happening with performance right now. Crux.run balances that with the story of performance over time.

Measuring performance is important. Communicating the story of performance is equally important.

Tuesday, February 18th, 2020

Utopia

Trys and James recently unveiled their Utopia project. They’ve been tinkering away at it behind the scenes for quite a while now.

You can check out the website and read the blog to get the details of how it accomplishes its goal:

Elegantly scale type and space without breakpoints.

I may well be biased, but I really like this project. I’ve been asking myself why I find it so appealing. Here are a few of the attributes of Utopia that strike a chord with me…

It’s collaborative

Collaboration is at the heart of Clearleft’s work. I know everyone says that, but we’ve definitely seen a direct correlation: projects with high levels of collaboration are invariably more successful than projects where people are siloed.

The genesis for Utopia came about after Trys and James worked together on a few different projects. It’s all too easy to let design and development splinter off into their own caves, but on these projects, Trys and James were working (literally) side by side. This meant that they could easily articulate frustrations to one another, and more important, they could easily share their excitement.

The end result of their collaboration is some very clever code. There’s an irony here. This code could be used to discourage collaboration! After all, why would designers and developers sit down together if they can just pass these numbers back and forth?

But I don’t think that Utopia will appeal to designers and developers who work in that way. Born in the spirit of collaboration, I suspect that it will mostly benefit people who value collaboration.

It’s intrinsic

If you’re a control freak, you may not like Utopia. The idea is that you specify the boundaries of what you’re trying to accomplish—minimum/maximum font sizes, minimum/maximum screen sizes, and some modular scales. Then you let the code—and the browser—do all the work.

On the one hand, this feels like surrendering control. But on the other hand, because the underlying system is so robust, it’s a way of guaranteeing quality, even in situations you haven’t accounted for.

If someone asks you, “What size will the body copy be when the viewport is 850 pixels wide?”, your answer would have to be “I don’t know …but I do know that it will be appropriate.”

This feels like a very declarative way of designing. It reminds me of the ethos behind Andy and Heydon’s site, Every Layout. They call it algorithmic layout design:

Employing algorithmic layout design means doing away with @media breakpoints, “magic numbers”, and other hacks, to create context-independent layout components. Your future design systems will be more consistent, terser in code, and more malleable in the hands of your users and their devices.

See how breakpoints are mentioned as being a very top-down approach to layout? Remember the tagline for Utopia, which aims for fluid responsive design?

Elegantly scale type and space without breakpoints.

Unsurprisingly, Andy really likes Utopia:

As the co-author of Every Layout, my head nearly fell off from all of the nodding when reading this because this is the exact sort of approach that we preach: setting some rules and letting the browser do the rest.

Heydon describes this mindset as automating intent. I really like that. I think that’s what Utopia does too.

As Heydon said at Patterns Day:

Be your browser’s mentor, not its micromanager.

The idea is that you give it rules, you give it axioms or principles to work on, and you let it do the calculation. You work with the in-built algorithms of the browser and of CSS itself.

This is all possible thanks to improvements to CSS like calc, flexbox and grid. Jen calls this approach intrinsic web design. Last year, I liveblogged her excellent talk at An Event Apart called Designing Intrinsic Layouts.

Utopia feels like it has the same mindset as algorithmic layout design and intrinsic web design. Trys and James are building on the great work already out there, which brings me to the final property of Utopia that appeals to me…

It’s iterative

There isn’t actually much that’s new in Utopia. It’s a combination of existing techniques. I like that. As I said recently:

I’m a great believer in the HTML design principle, Evolution Not Revolution:

It is better to evolve an existing design rather than throwing it away.

First of all, Utopia uses the idea of modular scales in typography. Tim Brown has been championing this idea for years.

Then there’s the idea of typography being fluid and responsive—just like Jason Pamental has been speaking and writing about.

On the code side, Utopia wouldn’t be possible without the work of Mike Riethmuller and his breakthroughs on responsive and fluid typography, which led to Tim’s work on CSS locks.

Utopia takes these building blocks and combines them. So if you’re wondering if it would be a good tool for one of your projects, you can take an equally iterative approach by asking some questions…

Are you using fluid type?

Do your font-sizes increase in proportion to the width of the viewport? I don’t mean in sudden jumps with @media breakpoints—I mean some kind of relationship between font size and the vw (viewport width) unit. If so, you’re probably using some kind of mechanism to cap the minimum and maximum font sizes—CSS locks.
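If that’s new to you, a bare-bones CSS lock looks something like this (the pixel values are invented for the example): fixed at 16px below a 320px-wide viewport, fixed at 24px above 1200px, and interpolating in between:

:root {
  font-size: 16px;
}
@media (min-width: 320px) {
  :root {
    /* scales from 16px at 320px wide up to 24px at 1200px wide */
    font-size: calc(16px + 8 * ((100vw - 320px) / 880));
  }
}
@media (min-width: 1200px) {
  :root {
    font-size: 24px;
  }
}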

I’m using that technique on Resilient Web Design. But I’m not changing the relative difference between different sized elements—body copy, headings, etc.—as the screen size changes.

Are you using modular scales?

Does your type system have some kind of ratio that describes the increase in type sizes? You probably have more than one ratio (unlike Resilient Web Design). The ratio for small screens should probably be smaller than the ratio for big screens. But rather than jump from one ratio to another at an arbitrary breakpoint, Utopia allows the ratio to be fluid.

So it’s not just that font sizes are increasing as the screen gets larger; the comparative difference is also subtly changing. That means there’s never a sudden jump in font size at any time.

Are you using custom properties?

A technical detail this, but the magic of Utopia relies on two powerful CSS features: calc() and custom properties. These two workhorses are used by Utopia to generate some CSS that you can stick at the start of your stylesheet. If you ever need to make changes, all the parameters are defined at the top of the code block. Tweak those numbers and watch everything cascade.

You’ll see that there’s one—and only one—media query in there. This is quite clever. Usually with CSS locks, you’d need to have a media query for every different font size in order to cap its growth at the maximum screen size. With Utopia, the maximum screen size—100vw—is abstracted into a variable (a custom property). The media query then changes its value to be the upper end of your CSS lock. So it doesn’t matter how many different font sizes you’re setting: because they all use that custom property, one single media query takes care of capping the growth of every font size declaration.
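Stripped right down, that mechanism looks something like this sketch—the numbers and property names are mine, and the CSS that Utopia actually generates is more sophisticated:

:root {
  /* stands in for the viewport width… */
  --screen: 100vw;
}
@media (min-width: 1200px) {
  :root {
    /* …and gets frozen at the maximum screen size */
    --screen: 1200px;
  }
}
/* every size declaration shares the same capped value */
body {
  font-size: calc(16px + 8 * ((var(--screen) - 320px) / 880));
}
h1 {
  font-size: calc(28px + 20 * ((var(--screen) - 320px) / 880));
}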

If you’re already using CSS locks, modular scales, and custom properties, Utopia is almost certainly going to be a good fit for you.

If you’re not yet using those techniques, but you’d like to, I highly recommend using Utopia on your next project.

Friday, February 7th, 2020

IncrementURL

Last month I wrote some musings on default browser behaviours. When it comes to all the tasks that browsers do for us, the most fundamental is taking a URL, fetching its contents and giving us the results. As part of that process, browsers also show us the URL of the page currently loaded in a tab or window.

But even at this fundamental level, there are some differences from browser to browser.

Safari only shows you the domain name—and any subdomain names—by default. It looks nice and tidy, but it obfuscates what page you’re on (until you click on the domain name). This is bad.

Chrome shows you the full URL, nice and straightforward. This is neutral.

Firefox, like Chrome, shows you the full URL, but with a subtle difference. The important part of the URL—usually the domain name—is subtly highlighted in a darker shade of grey. This is good.

The reason I say that what it highlights is usually the domain name is because what it actually highlights is eTLD+1.

The what now?

Well, if you’re looking at a page on adactio.com, that’s the important bit. But what if you’re looking at a page on adactio.github.io? The domain name is important, but so is the subdomain.

It turns out there’s a list out there of which sites and top level domains allow registrations like this. This is the list that Firefox is using for its shading behaviour in displaying URLs.

Safari, by the way, does not use this list. These URLs are displayed identically in Safari, the phisherman’s friend:

  • example.com
  • example.github.io
  • github.example.com

Whereas Firefox highlights the eTLD+1 part of each one:

  • example.com (all of it highlighted)
  • example.github.io (all of it highlighted)
  • github.example.com (only “example.com” highlighted)

I learned all this from Jake on a recent edition of HTTP 203. Nicolas Hoizey has written a nice little summary.

Jake acknowledges that what Apple is doing is suboptimal, what Firefox is doing is good, and then puts forward an idea for what Chrome could do. (But please note that this is Jake’s personal opinion; not an official proposal from the Chrome team.)

There’s some prior art here. It used to be that, if your SSL certificate included extended validation, the name would be shown in green next to the padlock symbol. So while my website—which uses regular SSL from Let’s Encrypt—would just have a padlock, Medium—which uses EV SSL—would have a padlock and the text “A Medium Corporation”.

Extended validation wasn’t quite the bulletproof verification it was cracked up to be. So browsers don’t use that interface pattern any more.

Jake suggests repurposing this pattern for all URLs. Pull out the important bit—eTLD+1—and show it next to the padlock.

Screenshots of @JaffaTheCake’s idea for separating out the eTLD+1 part of a URL in a browser’s address bar.

I like this. The full URL is still displayed. This proposal is more of an incremental change. An enhancement that is applied progressively, if you will.

I also like that it builds on existing interface patterns—Firefox’s URL treatment and the deprecated treatment of EV certs. In fact, I think the first step for Chrome should be to match Firefox’s current behaviour, and then go further with something like Jake’s proposal.

This kind of gradual change was exactly what Chrome did with displaying https and http domains.

Chrome treatment for HTTPS pages.

Jake mentions this in the video:

We’ve already seen that you have to take small steps here, like we did with the https change.

There’s a fascinating episode of the Freakonomics podcast called In Praise of Incrementalism. I’ve huffduffed it.

I’m a great believer in the HTML design principle, Evolution Not Revolution:

It is better to evolve an existing design rather than throwing it away.

I’d love to see Chrome take the first steps to Jake’s proposal by following Firefox’s lead.

Then again, I’d love it if Chrome followed Firefox’s lead in implementing subgrid.

Thursday, February 6th, 2020

Hydration

As you may have noticed, I’m a fan of progressive enhancement.

It’s not cool. It’s often at odds with “modern” web development, so I end up looking like an old man yelling at a cloud to get off my lawn. Or something.

At its heart though, progressive enhancement seems fairly uncontroversial and inoffensive to me. It’s an approach. A mindset. Here’s how I describe it in Resilient Web Design:

  1. Identify core functionality.
  2. Make that functionality available using the simplest possible technology.
  3. Enhance!

Progressive enhancement makes use of the principle of least power:

Choose the least powerful language suitable for a given purpose.

That’s step two of the three-step process. But the third step is vital.

I think a lot of the hostility towards progressive enhancement comes from a misunderstanding of that three-step process, perhaps thinking that it stops at step two. I’m sure that some have interpreted progressive enhancement as preventing developers from using the latest and greatest technology. Nothing could be further from the truth!

Taking a layered approach to building on the web gives you permission to try cutting-edge JavaScript APIs, regardless of how many or how few browsers currently implement them.
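That permission usually takes the shape of simple feature detection. A minimal sketch (the data-src attribute here is just a made-up lazy-loading convention):

  // Use a newer API when it's there; without it, the page still works.
  if ('IntersectionObserver' in window) {
    const observer = new IntersectionObserver((entries) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          // swap in the real image source as it approaches the viewport
          entry.target.src = entry.target.dataset.src;
          observer.unobserve(entry.target);
        }
      });
    });
    document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
  }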

The most common misunderstanding of progressive enhancement is that it’s inherently about JavaScript. That’s not true. You can apply progressive enhancement at every step of front-end development: HTML, CSS, and JavaScript.

But because of JavaScript’s strict error-handling model (at least compared to HTML and CSS), it’s in the JavaScript layer that the lack of a progressive enhancement mindset is most often felt.

That’s why I was saddened by the rise of frameworks and mindsets that assume the availability of JavaScript. Single page apps generally follow this assumption. Everything is delivered via JavaScript: content, markup, styles, and behaviour.

This leads to a terrible situation for performance. The user is left staring at a blank screen, waiting for something—anything!—to appear. Browsers are optimised to stream HTML as soon as they can. Delivering your content via JavaScript rather than HTML means you’re not taking advantage of that optimisation. Your users suffer.

But I was very heartened when I saw the pendulum start to swing back the other way a bit…

Let’s say you’re using a JavaScript framework like React. But the reason you’re using it isn’t because you’re doing anything particularly complex in the browser involving state management. You might be using React because you really like the way it encourages modularity and componentisation.

A few years ago, making a single page app was pretty much the only way you could use React. For you as a developer to experience the benefits of modularity and componentisation, users had to pay the price in the payload (and fragility) of client-side JavaScript.

That’s no longer the case. Now that we can run JavaScript on the server, it’s possible to build in a modular, componentised way and still use progressive enhancement.

When I first heard about Gatsby and Next.js, I thought that was the selling point. Run React on the server; send pre-generated HTML down the wire to the user; then enhance with client-side JavaScript.

But that’s not exactly how it works. The pre-generated HTML isn’t functional. It still needs a bucketload of JavaScript before it can do anything. The actual process is: Run React on the server; send pre-generated HTML down the wire to the user; then send everything again but this time in JavaScript, bundled with the entire React library.

This leads to a situation for users that’s almost worse than before. Instead of staring at a blank screen, now they get HTML lickety-split—excellent! But if they try to interact with what’s on screen, they’ll find that nothing is working yet. Even worse, once the JavaScript is delivered, and is being parsed, they probably can’t even scroll—their device is too busy interpreting all that JavaScript. Your users suffer.

All your content is sent twice. First HTML is sent from the server. These days this is called “server-side rendering”, even though for decades the technical term was “serving a web page” (I’m pretty sure the rendering part happens in a browser). Then a JavaScript library—plus all your bespoke JavaScript—is loaded. Then all your content is loaded again as JSON.

So you’ve got a facade of an interface that you can’t actually interact with until a deluge of JavaScript has been loaded, parsed and executed. The term used for this stage of the process is “hydration”, which makes it sound more like a relaxing treatment from Gwyneth Paltrow than the horrible user experience it is.
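For what it’s worth, here’s a minimal sketch of that hand-off, assuming the React 16-era APIs that these frameworks were built on (the file names and component are made up):

  // server.js (Node): render the component tree to a string of HTML
  import React from 'react';
  import ReactDOMServer from 'react-dom/server';
  import App from './App';
  const html = ReactDOMServer.renderToString(React.createElement(App));
  // …wrap `html` in a page template and send it as the response

  // client.js (browser): the entire app is then shipped again as JavaScript,
  // which re-renders the same tree and only then attaches event handlers
  import React from 'react';
  import ReactDOM from 'react-dom';
  import App from './App';
  ReactDOM.hydrate(React.createElement(App), document.getElementById('root'));

Until that second step finishes, the markup from the first step is just a picture of an interface.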

The idea is that subsequent navigations—which will happen with Ajax—should be snappy. But the price has already been paid by then. The initial loading experience is jagged and frustrating.

Don’t get me wrong: server-side rendering is great …if what you’re sending from the server is functional. It’s the combination of hollow HTML sent from the server, followed by a huge browser-freezing dump of JavaScript that is an anti-pattern.

This use of server-side rendering followed by hydration feels like progressive enhancement, because it separates out the delivery of markup and scripts. But it’s missing the mindset.

The layered approach of progressive enhancement echoes the separation of concerns in the front-end stack: HTML, CSS, and JavaScript—each layer expressing more power. But while these concepts are related, they’re not interchangeable. Separating out the layers of your tech stack isn’t necessarily progressive enhancement. If you have some HTML that relies on JavaScript to be useful, then there’s no benefit in separating that HTML into a separate payload. The HTML that you initially send down the wire needs to be functional (at least at a basic level) before the JavaScript arrives.

I was a little disappointed to see Kyle Simpson—who I admire greatly—conflate separation of concerns with progressive enhancement in his talk from JSCamp 2019:

This content is here. I can see it, and it’s even styled. But I can’t click on the damn button because nothing has loaded in the JavaScript layer yet.

Anybody experienced that where you’ve been on a web page and it’s not really fully functional yet? I can see something but I can’t actually make any usage of it yet.

These are all things that cropped out of our thought process that said: “Let’s build the web in layers. Let’s deliver it progressively in layers. Because that’s morally right. We call this progressive enhancement. And let’s not worry too much about all these potential user experience flaws that may happen.”

That’s a spot-on description of server-side rendering and hydration, but it’s a gross mischaracterisation of progressive enhancement.

That button that requires JavaScript to work? That should’ve been generated with JavaScript. (For example, if you’re building a complex web app, consider sending a read-only view down the wire in HTML—then add any interactive interface elements with JavaScript in the browser.)
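As a sketch of that approach (the markup and IDs here are hypothetical): the server sends a view that’s already useful on its own, and the interactive control only exists once the script that can handle it has loaded:

  <!-- sent from the server: a read-only view, no script required -->
  <ul id="comments">
    <li>First comment</li>
    <li>Second comment</li>
  </ul>

  <script type="module">
    // enhance in the browser: the button is created here,
    // so there's never a dead control sitting on the page
    const list = document.getElementById('comments');
    const button = document.createElement('button');
    button.textContent = 'Add a comment';
    button.addEventListener('click', () => {
      // hypothetical: reveal an inline form and post it with fetch()
    });
    list.after(button);
  </script>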

If people are equating progressive enhancement with thoughtless server-side rendering and hydration, then I can see why they’d be hostile towards it.

Users would be better served with unprogressive non-enhancement:

You take some structured content, which follows the vertical flow of the document in a way that everyone understands.

Which people traverse easily by either dragging their scroll bar with their mouse, or operating the keyboard using the up and down keys, or using the spacebar.

Or if they’re using a touch device, simply flicking backwards and forwards in that easy way that we’ve all become used to. What you do is you take that, and you fucking well leave it alone.

Alas, that’s not what tools like Gatsby offer. The latest post on their blog is called Why Gatsby is better with JavaScript:

But what about sites or pages where there is no client-side interactivity? Even for those pages, Gatsby offers performance benefits by including JavaScript.

I beg to differ.

(By the way, that same blog post also initially tried to equate the performance hit of client-side JavaScript with the performance hit of images. Andy explains why that’s disingenuous.)

Hope is on the horizon for React in the form of partial hydration. I sincerely hope that it will become the default way of balancing server-side rendering with just-in-time client-side interaction.

The situation we have now is the worst of both worlds: server-side rendering followed by a tsunami of hydration. It has a whiff of progressive enhancement to it (because there’s a cosmetic separation of concerns) but it has none of the user benefits.

Wednesday, February 5th, 2020

Design systems roundup

When I started writing a post about architects, gardeners, and design systems, it was going to be a quick follow-up to my post about web standards, dictionaries, and design systems. I had spotted an interesting metaphor in one of Frank’s posts, and I thought it was worth jotting down.

But after making that connection, I kept writing. I wanted to point out the fetishism we have for creation over curation; building over maintenance.

Then the post took a bit of a dark turn. I wrote about how the most commonly cited reasons for creating a design system—efficiency and consistency—are the same processes that have led to automation and dehumanisation in the past.

That’s where I left things. Others have picked up the baton.

Dave wrote a post called The Web is Industrialized and I helped industrialize it. What I said resonated with him:

This kills me, but it’s true. We’ve industrialized design and are relegated to squeezing efficiencies out of it through our design systems. All CSS changes must now have a business value and user story ticket attached to it. We operate more like Taylor and his stopwatch and Gantt and his charts, maximizing effort and impact rather than focusing on the human aspects of product development.

But he also points out the many benefits of systematising:

At the same time, I have seen first hand how design systems can yield improvements in accessibility, performance, and shared knowledge across a willing team. I’ve seen them illuminate problems in design and code. I’ve seen them speed up design and development allowing teams to build, share, and validate prototypes or A/B tests before undergoing costly guesswork in production. There’s value in these tools, these processes.

Emphasis mine. I think that’s a key phrase: “a willing team.”

Ethan tackles this in his post The design systems we swim in:

A design system that optimizes for consistency relies on compliance: specifically, the people using the system have to comply with the system’s rules, in order to deliver on that promised consistency. And this is why that, as a way of doing something, a design system can be pretty dehumanizing.

But a design system need not be a constraining straitjacket—a means of enforcing consistency by keeping creators from colouring outside the lines. Used well, a design system can be a tool to give creators more freedom:

Does the system you work with allow you to control the process of your work, to make situational decisions? Or is it simply a set of rules you have to follow?

This is key. A design system is the product of an organisation’s culture. That’s something that Brad digs into in his post, Design Systems, Agile, and Industrialization:

I definitely share Jeremy’s concern, but also think it’s important to stress that this isn’t an intrinsic issue with design systems, but rather the organizational culture that exists or gets built up around the design system. There’s a big difference between having smart, reusable patterns at your disposal and creating a dictatorial culture designed to enforce conformity and swat down anyone coloring outside the lines.

Brad makes a very apt comparison with Agile:

Not Agile the idea, but the actual Agile reality so many have to suffer through.

Agile can be a liberating empowering process, when done well. But all too often it’s a quagmire of requirements, burn rates, and story points. We need to make sure that design systems don’t suffer the same fate.

Jeremy’s thoughts on industrialization definitely struck a nerve. Sure, design systems have the ability to dehumanize and that’s something to actively watch out for. But I’d also say to pay close attention to the processes and organizational culture we take part in and contribute to.

Matthew Ström weighed in with a beautifully-written piece called Breaking looms. He provides historical context to the question of automation by relaying the story of the Luddite uprising. Automation may indeed be inevitable, according to his post, but he also provides advice on how to approach design systems today:

We can create ethical systems based in detailed user research. We can insist on environmental impact statements, diversity and inclusion initiatives, and human rights reports. We can write design principles, document dark patterns, and educate our colleagues about accessibility.

Finally, the ouroboros was complete when Frank wrote down his thoughts in a post called Who cares?. For him, the issue of maintenance and care is crucial:

Care applies to the built environment, and especially to digital technology, as social media becomes the weather and the tools we create determine the expectations of work to be done and the economic value of the people who use those tools. A well-made design system created for the right reasons is reparative. One created for the wrong reasons becomes a weapon for displacement. Tools are always beholden to values. This is well-trodden territory.

Well-trodden territory indeed. Back in 2015, Travis Gertz wrote about Design Machines:

Designing better systems and treating our content with respect are two wonderful ideals to strive for, but they can’t happen without institutional change. If we want to design with more expression and variation, we need to change how we work together, build design teams, and forge our tools.

Also on the topic of automation, in 2018 Cameron wrote about Design systems and technological disruption:

Design systems are certainly a new way of thinking about product development, and introduce a different set of tools to the design process, but design systems are not going to lessen the need for designers. They will instead increase the number of products that can be created, and hence increase the demand for designers.

And in 2019, Kaelig wrote:

In order to be fulfilled at work, Marx wrote that workers need “to see themselves in the objects they have created”.

When “improving productivity”, design systems tooling must be mindful of not turning their users’ craft into commodities, alienating them, like cogs in a machine.

All of this is reminding me of Kranzberg’s first law:

Technology is neither good nor bad; nor is it neutral.

I worry that sometimes the messaging around design systems paints them as an inherently positive thing. But design systems won’t fix your problems:

Just stay away from folks who try to convince you that having a design system alone will solve something.

It won’t.

It’s just the beginning.

At the same time, a design system need not be the gateway drug to some kind of post-singularity future where our jobs have been automated away.

As always, it depends.

Remember what Frank said:

A well-made design system created for the right reasons is reparative. One created for the wrong reasons becomes a weapon for displacement.

The reasons for creating a design system matter. Those reasons will probably reflect the values of the company creating the system. At the level of reasons and values, we’ve gone beyond the bounds of the hyperobject of design systems. We’re dealing in the area of design ops—the whys of systemising design.

This is why I’m so wary of selling the benefits of design systems in terms of consistency and efficiency. Those are obviously tempting money-saving benefits, but followed to their conclusion, they lead down the dark path of enforced compliance and eventually, automation.

But if the reason you create a design system is to empower people to be more creative, then say that loud and proud! I know that creativity, autonomy and empowerment is a tougher package to sell than consistency and efficiency, but I think it’s a battle worth fighting.

Design systems are neither good nor bad (nor are they neutral).

Addendum: I’d just like to say how invigorating it’s been to read the responses from Dave, Ethan, Brad, Matthew, and Frank …all of them writing on their own websites. Rumours of the demise of blogging may have been greatly exaggerated.

Monday, February 3rd, 2020

Three books

Lurking: How a Person Became a User by Joanne McNeil will be published on February 25th.

In Lurking, Joanne McNeil digs deep and identifies the primary (if sometimes contradictory) concerns of people online: searching, safety, privacy, identity, community, anonymity, and visibility. She charts what it is that brought people online and what keeps us here even as the social equations of digital life—what we’re made to trade, knowingly or otherwise, for the benefits of the internet—have shifted radically beneath us. It is a story we are accustomed to hearing as tales of entrepreneurs and visionaries and dynamic and powerful corporations, but there is a more profound, intimate story that hasn’t yet been told.

Enemy of All Mankind: A True Story of Piracy, Power, and History’s First Global Manhunt by Steven Johnson will be published on May 12th:

Henry Every was the seventeenth century’s most notorious pirate. The press published wildly popular—and wildly inaccurate—reports of his nefarious adventures. The British government offered enormous bounties for his capture, alive or (preferably) dead. But Steven Johnson argues that Every’s most lasting legacy was his inadvertent triggering of a major shift in the global economy. Enemy of All Mankind focuses on one key event—the attack on an Indian treasure ship by Every and his crew—and its surprising repercussions across time and space. It’s the gripping tale of one of the most lucrative crimes in history, the first international manhunt, and the trial of the seventeenth century.

How To Future: Leading and Sense-Making in an Age of Hyperchange by Scott Smith with Madeline Ashby will be published on July 3rd:

Successfully designing for a future requires a picture of that future—a useful map of the horizons ahead that can be used for wayfinding, identifying emerging opportunities or risks. Accurately developing this map means investing in better awareness of signals about the future, understanding trends in context, developing rich insights about what those signals indicate—relative to companies, people, citizens or stakeholders. It also means cultivating ways to share these future insights through tangible yet provocative scenarios or stories, turn these into prototypes, or connect them to strategies.

Friday, January 31st, 2020

Union

The nation I live in has decided to impose sanctions on itself. The government has yet to figure out the exact details. It won’t be good.

Today marks the day that the ironically-named United Kingdom of Great Britain and Northern Ireland officially leaves the European Union. Nothing will change on a day to day basis (until the end of this year, when the shit really hits the fan).

Looking back on 2019, I had the pleasure and privilege of visiting places that will remain in the European Union. Hamburg, Düsseldorf, Utrecht, Miltown Malbay, Kinsale, Madrid, Amsterdam, Paris, Frankfurt, Antwerp, Berlin, Vienna, Cobh.

Maybe I should do a farewell tour in 2020.

Grüße aus Hamburg!

Auf Wiedersehen, Düsseldorf!

Going for a stroll in Utrecht at dusk.

The road to Miltown.

Checked in at Kinsale Harbour. with Jessica

Checked in at La Casa del Bacalao. Tapas! — with Jessica

Hello Amsterdam!

Indoor aviation.

Guten Tag, Frankfurt.

Catch you later, Antwerp.

The Ballardian exterior of Tempelhof.

Losing my religion.

Boats in Cobh.