Journal

Wednesday, April 21st, 2021

Get the FLoC out

I’ve always liked the way that web browsers are called “user agents” in the world of web standards. It’s such a succinct summation of what browsers are for, or more accurately who browsers are for. Users.

The term makes sense when you consider that the internet is for end users. That’s not to be taken for granted. This assertion is now enshrined in the Internet Engineering Task Force’s RFC 8890—like Magna Carta for the network age. It’s also a great example of prioritisation in a design principle:

When there is a conflict between the interests of end users of the Internet and other parties, IETF decisions should favor end users.

So when a web browser—ostensibly an agent for the user—prioritises user-hostile third parties, we get upset.

Google Chrome—ostensibly an agent for the user—is running an origin trial for Federated Learning of Cohorts (FLoC). This is not a technology that serves the end user. It is a technology that serves third parties who want to target end users. The most common use case is behavioural advertising, but targeting could be applied for more nefarious purposes.

The Electronic Frontier Foundation wrote an explainer last month: Google Is Testing Its Controversial New Ad Targeting Tech in Millions of Browsers. Here’s What We Know.

Let’s back up a minute and look at why this is happening. End users are routinely targeted today (for behavioural advertising and other use cases) through third-party cookies. Some user agents like Apple’s Safari and Mozilla’s Firefox are stamping down on this, disabling third-party cookies by default.

Seeing which way the wind is blowing, Google’s Chrome browser will also disable third-party cookies at some time in the future (they’re waiting to shut that barn door until the fire is good’n’raging). But Google isn’t just in the browser business. Google is also in the ad tech business. So they still want advertisers to be able to target end users.

Yes, this is quite the cognitive dissonance: one part of the business is building a user agent while a different part of the company is working on ways of tracking end users. It’s almost as if one company shouldn’t simultaneously be the market leader in three separate industries: search, advertising, and web browsing. (Seriously though, I honestly think Google’s search engine would get better if it were split off from the parent company, and I think that Google’s web browser would also get better if it were a separate enterprise.)

Anyway, one possible way of tracking users without technically tracking individual users is to assign them to buckets, or cohorts of interest based on their browsing habits. Does that make you feel safer? Me neither.

That’s what Google is testing with the origin trial of FLoC.

If you, as an end user, don’t wish to be experimented on like this, there are a few things you can do:

  • Don’t use Chrome. No other web browser is participating in this experiment. I recommend Firefox.
  • If you want to continue to use Chrome, install the Duck Duck Go Chrome extension.
  • Alternatively, if you manually disable third-party cookies, your Chrome browser won’t be included in the experiment.
  • Or you could move to Europe. The origin trial won’t be enabled for users in the European Union, which is coincidentally where GDPR applies.

That last decision is interesting. On the one hand, the origin trial is supposed to be on a small scale, hence the lack of European countries. On the other hand, the origin trial is “opt out” instead of “opt in” so that they can gather a big enough data set. Weird.

The plan is that if and when FLoC launches, websites would have to opt in to it. And when I say “plan”, I mean “best guess.”

I, for one, am filled with confidence that Google would never pull a bait-and-switch with their technologies.

In the meantime, if you’re a website owner, you have to opt your website out of the origin trial. You can do this by sending a server header. A meta element won’t do the trick, I’m afraid.

I’ve done it for my sites, which are served using Apache. I’ve got this in my .conf file:

<IfModule mod_headers.c>
Header always set Permissions-Policy "interest-cohort=()"
</IfModule>
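
If your site is served by Nginx rather than Apache, the equivalent should be a single line in the relevant server or location block. Something like this (a sketch; adjust it to fit your own configuration):

add_header Permissions-Policy "interest-cohort=()" always;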

If you don’t have access to your server, tough luck. But if your site runs on WordPress, there’s a proposal to opt out of FLoC by default.

Interestingly, none of the Chrome devs that I follow are saying anything about FLoC. They’re usually quite chatty about proposals for potential standards, but I suspect that this one might be embarrassing for them. It was a similar situation with AMP. In that case, Google abused its monopoly position in search to blackmail publishers into using Google’s format. Now Google’s monopoly in advertising is compromising the integrity of its browser. In both cases, it makes it hard for Chrome devs to claim that they have the web’s best interests at heart.

But one of the advantages of having a huge share of the browser market is that Chrome can just plough ahead and unilaterally implement whatever it wants, even if there’s no consensus from other browser makers. So that’s what Google is doing with FLoC. But their justification for doing this doesn’t really work unless other browsers play along.

Here’s Google’s logic:

  1. Third-party cookies are on their way out so advertisers will no longer be able to use that technology to target users.
  2. If we don’t provide an alternative, advertisers and other third parties will use fingerprinting, which we all agree is very bad.
  3. So let’s implement Federated Learning of Cohorts so that advertisers won’t use fingerprinting.

The problem is with step three. The theory is that if FLoC gives third parties what they need, then they won’t reach for fingerprinting. Even if there were any validity to that hypothesis, the only chance it has of working is if every browser joins in with FLoC. Otherwise ad tech companies are leaving money on the table. Can you seriously imagine third parties deciding that they just won’t target iPhone or iPad users any more? Remember that Safari is the only real browser on iOS so unless FLoC is implemented by Apple, third parties can’t reach those people …unless those third parties use fingerprinting instead.

Google have set up a situation where it looks like FLoC is going head-to-head with fingerprinting. But if FLoC becomes a reality, it won’t be instead of fingerprinting, it will be in addition to fingerprinting.

Google is quite right to point out that fingerprinting is A Very Bad Thing. But their concerns about fingerprinting sound very hollow when you see that Chrome is pushing ahead and implementing a raft of browser APIs that other browser makers quite rightly point out enable more fingerprinting: Web Bluetooth, Battery Status, Proximity Sensor, and so on.

When it comes to those APIs, the message from Google is that fingerprinting is a solvable problem.

But when it comes to third-party tracking, the message from Google is that fingerprinting is inevitable and so we must provide an alternative.

Which one is it?

Google’s flimsy logic for why FLoC is supposedly good for end users just doesn’t hold up. If they were honest and said that it’s to maintain the status quo of the ad tech industry, it would make much more sense.

The flaw in Google’s reasoning is the fundamental idea that tracking is necessary for advertising. That’s simply not true. Sacrificing user privacy is fundamental to behavioural advertising …but behavioural advertising is not the only kind of advertising. It isn’t even a very good kind of advertising.

Marko Saric sums it up:

FLoC seems to be Google’s way of saving a dying business. They are trying to keep targeted ads going by making them more “privacy-friendly” and “anonymous”. But behavioral profiling and targeted advertisement is not compatible with a privacy-respecting web.

What’s striking is that the very monopolies that make Google and Facebook the leaders in behavioural advertising would also make them the leaders in contextual advertising. Almost everyone uses Google’s search engine. Almost everyone uses Facebook’s social network. An advertising model based on what you’re currently looking at would keep Google and Facebook in their dominant positions.

Google made their first many billions exclusively on contextual advertising. Google now prefers to push the message that behavioral advertising based on personal data collection is superior but there is simply no trustworthy evidence to that.

I sincerely hope that Chrome will align with Safari, Firefox, Vivaldi, Brave, Edge and every other web browser. Everyone already agrees that fingerprinting is the real enemy. Imagine the combined brainpower that could be brought to bear on that problem if all browsers made user privacy a priority.

Until that day, I’m not sure that Google Chrome can be considered a user agent.

Tuesday, April 20th, 2021

Numbers

Core web vitals from Google are the ingredients for an alphabet soup of exclusionary initialisms. But once you get past the unnecessary jargon, there’s a sensible approach underpinning the measurements.

From May—no, June—these measurements will be a ranking signal for Google search so performance will become more of an SEO issue. This is good news. This is what Google should’ve done years ago instead of pissing up the wall with their dreadful and damaging AMP project that blackmailed publishers into using a proprietary format in exchange for preferential search treatment. It was all done supposedly in the name of performance, but in reality all it did was antagonise users and publishers alike.

Core web vitals are an attempt to put numbers on user experience. This is always a tricky balancing act. You’ve got to watch out for the McNamara fallacy. Harry has already started noticing this:

A new and unusual phenomenon: clients reluctant (even refusing) to fix performance issues unless they directly improve Vitals.

Once you put a measurement on something, there’s a danger of focusing too much on the measurement. Chris is worried that we’re going to see tips’n’tricks for gaming core web vitals:

This feels like the start of a weird new era of web performance where the metrics of web performance have shifted to user-centric measurements, but people are implementing tricky strategies to game those numbers with methods that, if anything, slightly harm user experience.

The map is not the territory. The numbers are a proxy for user experience, but it’s notoriously difficult to measure intangible ideas like pain and frustration. As Laurie says:

This is 100% the downside of automatic tools that give you a “score”. It’s like gameification. It’s about hitting that perfect score instead of the holistic experience.

And Ethan has written about the power imbalance that exists when Google holds all the cards, whether it’s AMP or core web vitals:

Google used its dominant position in the marketplace to force widespread adoption of a largely proprietary technology for creating websites. By switching to Core Web Vitals, those power dynamics haven’t materially changed.

We would do well to remember:

When you measure, include the measurer.

But if we’re going to put numbers to user experience, the core web vitals are a pretty good spread of measurements: largest contentful paint, cumulative layout shift, and first input delay.

(If you prefer using initialisms, remember that CFP is Certified Financial Planner, CLS is Community Legal Services, and FID is Flame Ionization Detector. Together they form CWV, Catholic War Veterans.)
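
If you want to poke at these measurements yourself, the raw ingredients are exposed in the browser through the Performance Observer API. Here’s a rough sketch (real measurement tools do far more filtering and bookkeeping than this, so treat it as illustrative):

// Largest contentful paint: the render time of the biggest element so far.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  console.log('LCP candidate:', entries[entries.length - 1].startTime);
}).observe({type: 'largest-contentful-paint', buffered: true});

// Cumulative layout shift: add up shifts not triggered by user input.
let cumulativeShift = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) {
      cumulativeShift += entry.value;
    }
  }
  console.log('CLS so far:', cumulativeShift);
}).observe({type: 'layout-shift', buffered: true});

// First input delay: the gap between the first interaction and the browser responding to it.
new PerformanceObserver((list) => {
  const firstInput = list.getEntries()[0];
  console.log('FID:', firstInput.processingStart - firstInput.startTime);
}).observe({type: 'first-input', buffered: true});

The numbers you log this way won’t line up exactly with what Google’s tooling reports, which is all the more reason not to obsess over hitting a perfect score.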

The State of the Web — the links

An Event Apart Spring Summit is happening right now. I opened the show yesterday with a talk called The State Of The Web:

The World Wide Web has come a long way in its three decades of existence. There’s so much we can do now with HTML, CSS, and JavaScript: animation, layout, powerful APIs… we can even make websites that work offline! And yet the web isn’t exactly looking rosy right now. The problems we face aren’t technical in nature. We’re facing a crisis of expectations: we’ve convinced people that the web is slow, buggy, and inaccessible. But it doesn’t have to be this way. There is no fate but what we make. In this perspective-setting talk, we’ll go on a journey to the past, present, and future of web design and development. You’ll laugh, you’ll cry, and by the end, you’ll be ready to make the web better.

I wrote about preparing this talk and you can see the outline on Kinopio. I thought it turned out well, but I never actually know until people see it. So I’m very gratified and relieved that it went down very well indeed. Phew!

Eric and the gang at An Event Apart asked for a round-up of links related to this talk and I was more than happy to oblige. I’ve separated them into some of the same categories that the talk covers.

I know that these look like a completely disconnected grab-bag of concepts—you’d have to see the talk to get the connections. But even without context, these are some rabbit holes you can dive down…

Apollo 8

Hypertext

The World Wide Web

NASA

This (somewhat epic) slidedeck is done.

Thursday, April 8th, 2021

The state of UX

There is much introspection and navel-gazing in the world of user experience design. More than usual, I mean.

Jesse James Garrett recently said:

I don’t think I know anyone that’s been in UX more than a decade who’s happy with how it’s going.

In a recent issue of the dConstruct newsletter—which you really should subscribe to—I pointed to three bowls of porridge left out by three different ursine experience designers.

Mark Hurst wrote Why I’m losing faith in UX. Too hot!

Scott Berkun wrote How To Put Faith in Design. Too cold!

Peter Merholz wrote Waking up from the dream of UX. Just right!

As an aside, does it bother anyone else that the Goldilocks story violates the laws of thermodynamics?

Anyway, this hand-wringing around the role of UX today seemed like a suitably hot topic for one of our regular roundtable chats at Clearleft. We invited Peter along too and he was kind enough to give us his time.

It was a fun discussion. Peter pointed out that whenever he hears an older designer bemoaning the current state of design, he has to wonder what’s happened in their lives to make them feel that way (it’s like when people complain about the music of today and how it’s not as good as the music of whatever time period I was a teenager in). And let’s face it, the good ol’ days weren’t so good for everyone. It was overwhelmingly dominated by privileged white dudes. The more that changes, the better …and it needs to change far, far more.

There was a general agreement that the current gnashing of teeth isn’t unique to UX. It’s something that just about any discipline will inevitably go through. Peter’s epiphany was to compare it with the hand-wringing around Agile:

The frustration exhibited with the “dream of UX” is (I think) identical to the frustration the original Agile community sees with how it has been industrialized (koff-SAFe-koff).

Perhaps the industrialisation of what was once a cottage industry is the price of success. But that’s not necessarily bad, as long as you industrialise the right things. If UX has become the churning out of wireframes at scale, then something has gone very wrong. If UX has become the implementation of dark patterns at scale, then something has gone very wrong.

In some organisations, perhaps that’s exactly what’s happened. In which case, I can totally understand the disillusionment. But in other places, I see the opposite happening. I see UX designers bringing questions of ethics to the forefront. I see UX designers—dare I say it?—having their proverbial seat at the table.

Chris went so far as to claim that we are in fact in a golden age of user experience design. Controversial! But think about it, he said. Over the next few days, pay attention to interactions you have with technology, and consider the thought and skill that has gone into them.

I had Chris’s provocation in mind when I wrote about booking my vaccination appointment:

I just need to get in, accomplish my task, and get out again. This is where the World Wide Web shines.

Maybe Chris is right. Maybe the golden age of UX is here. It’s just not evenly distributed. Yet.

It’s an interesting time for the discipline of user experience design. I’ve always maintained that the best way to get a temperature check for your chosen field is to go to a really good conference. If you’re a UX designer and you want to understand the state of the UX nation, you should get a ticket for the online UX Fest in June. See you there!

Tuesday, April 6th, 2021

Of the web

I’m subscribed to a lot of blogs in my RSS reader. I follow some people because what they write about is very different to what I know about. But I also follow lots of people who have similar interests and ideas to me. So I’m not exactly in an echo chamber, but I do have the reverb turned up pretty high.

Sometimes these people post thoughts that are eerily similar to what I’ve been thinking about. Ethan has been known to do this. Get out of my head, Marcotte!

But even if Ethan wasn’t some sort of telepath, he’d still be in my RSS reader. We’re friends. Lots of the people in my RSS reader are my friends. When I read their words, I can hear their voices.

Then there are the people I’ve never met. Like Desirée García, Piper Haywood, or Jim Nielsen. Never met them, don’t know them, but damn, do I enjoy reading their blogs. Last year alone, I ended up linking to Jim’s posts ten different times.

Or Baldur Bjarnason. I can’t remember when I first came across his writing, but it really, really resonates with me. I probably owe him royalties for the amount of times I’ve cited his post Over-engineering is under-engineering.

His latest post is positively Marcottian in how it exposes what’s been fermenting in my own mind. But because he writes clearly, it really helps clarify my own thinking. It’s often been said that you should write to figure out what you think, and I can absolutely relate to that. But here’s a case where somebody else’s writing really helps to solidify my own thoughts.

Which type of novelty-seeking web developer are you?

It starts with some existentialist stock-taking. I can relate, what with the whole five decades thing. But then it turns the existential questioning to the World Wide Web itself, or rather, the people building the web.

In a way, it’s like taking the question of the great divide (front of the front end and back of the front end), and then turning it 45 degrees to reveal an entirely hidden dimension.

In examining the nature of the web, he hits on the litmus of how you view encapsulation:

I mention this first as it’s the aspect of the web that modern web developers hate the most without even giving it a label. Single-Page-Apps and GraphQL are both efforts to eradicate the encapsulation that’s baked into the foundation of every layer of the web.

Most modern devs are trying to get rid of it but it’s one of the web’s most strategic advantages.

I hadn’t thought of this before.

By default, if you don’t go against the grain of the web, each HTTP endpoint is encapsulated from each other.

Moreover, all of this can happen really fast if you aren’t going overboard with your CSS and JS.

He finishes with a look at another of the web’s most powerful features: distribution. In between are the things that make the web webby: hypertext and flexibility (The Dao of the Web).

It’s the idea that the web isn’t a single fixed thing but a fluid multitude whose shape is dictated by its surroundings.

This resonates with me because it highlights two different ways of viewing the web.

On the one hand, you can see the web purely as a distribution channel. In the past you might have been distributing a Flash movie. These days you might be distributing a single page app. Either way, the web is there as a low-friction way of getting your creation in front of other people.

The other way of building for the web is to go with the web’s grain, embracing flexibility and playing to the strengths of the medium through progressive enhancement. This is the distinction I was getting at when I talked about something being not just on the web, but of the web.

With that mindset, Baldur then takes us through some of the technologies that he’s excited about, like SvelteKit and Hotwire. I think it’s the same mindset that got me excited about service workers. As Baldur says:

They are helping the web become better at being its own thing.

That’s my tagline right there.

Saturday, April 3rd, 2021

Principles and the English language

I work with words. Sometimes they’re my words. Sometimes they’re words that my colleagues have written:

One of my roles at Clearleft is “content buddy.” If anyone is writing a talk, or a blog post, or a proposal and they want an extra pair of eyes on it, I’m there to help.

I also work with web technologies, usually front-of-the-front-end stuff. HTML, CSS, and JavaScript. The technologies that users experience directly in web browsers.

I think a lot about design principles for the web. The two principles I keep coming back to are the robustness principle and the principle of least power.

When it comes to words, the guide that I return to again and again is George Orwell, specifically his short essay, Politics and the English Language.

Towards the end, he offers some rules for writing.

  1. Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
  2. Never use a long word where a short one will do.
  3. If it is possible to cut a word out, always cut it out.
  4. Never use the passive where you can use the active.
  5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
  6. Break any of these rules sooner than say anything outright barbarous.

These look a lot like design principles. Not only that, but some of them look like specific design principles. Take the robustness principle:

Be conservative in what you send, be liberal in what you accept.

That first part applies to Orwell’s third rule:

If it is possible to cut a word out, always cut it out.

Be conservative in what words you send.

Then there’s the principle of least power:

Choose the least powerful language suitable for a given purpose.

Compare that to Orwell’s second rule:

Never use a long word where a short one will do.

That could be rephrased as:

Choose the shortest word suitable for a given purpose.

Or, going in the other direction, the principle of least power could be rephrased in Orwell’s terms as:

Never use a powerful language where a simple language will do.

Oh, I like that! I like that a lot.

Tuesday, March 30th, 2021

The principle of most availability

I’ve been thinking some more about the technical experience of booking a vaccination appointment and how much joy it brought me.

I’ve written before about how I’ve got a blind spot for the web so it’s no surprise that I was praising the use of a well marked-up form, styled clearly, and unencumbered by unnecessary JavaScript. But other technologies were in play too: Short Message Service (SMS) and email.

All of those technologies are platform-agnostic.

No matter what operating system I’m using, or what email software I’ve chosen, email works. It gets more complicated when you introduce HTML email. My response to that is the same as the old joke; you know the one: “Doctor, it hurts when I do this.” (“Well, don’t do that.”)

No matter what operating system my phone is using, SMS works. It gets more complicated when you introduce read receipts, memoji, or other additions. See my response to HTML email.

Then there’s the web. No matter what operating system I’m using on a device that could be a phone or a tablet or a laptop or desktop tower, and no matter what browser I’ve chosen to use, the World Wide Web works.

I originally said:

It feels like the principle of least power in action.

But another way of rephrasing “least power” is “most availability.” Technologies that are old, simple, and boring tend to be more widely available.

I remember when software used to come packaged in boxes and displayed on shelves. The packaging always had a list on the side. It looked like the nutritional information on a food product, but this was a list of “system requirements”: operating system, graphics card, sound card, CPU. I never liked the idea of system requirements. It felt so …exclusionary. And for me, the promise of technology was liberation and freedom to act on my own terms.

Hence my soft spot for the boring and basic technologies like email, SMS, and yes, web pages. The difference with web pages is that you can choose to layer added extras on top. As long as the fundamental functionality is using universally-supported technology, you’re free to enhance with all the latest CSS and JavaScript. If any of it fails, that’s okay: it falls back to a nice solid base.
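
In practice, that layering can be as simple as checking for a feature before reaching for it. Here’s the kind of pattern I mean (a generic sketch; the file name is purely illustrative):

// The core experience is plain HTML served over HTTP; it needs none of this.
if ('serviceWorker' in navigator) {
  // The enhancement: offline support via a service worker.
  navigator.serviceWorker.register('/serviceworker.js');
}
// Browsers that lack support simply skip the block.
// Nothing breaks; the page still works from its nice solid base.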

Alas, many developers don’t build with this mindset. I mean, I understand why: it means thinking about users with the most boring, least powerful technology. It’s simpler and more exciting to assume that everyone’s got a shared baseline of newer technology. But by doing that, you’re missing out on one of the web’s superpowers: that something served up at the same URL with the same underlying code can simultaneously serve people with older technology and also provide a whizz-bang experience to people with the latest and greatest technology.

Anyway, I’ve been thinking about the kind of communication technologies that are as universal as email, SMS, and the web.

QR codes are kind of heading in that direction, although I still have qualms because of their proprietary history. But there’s something nice and lo-fi about them. They’re like print stylesheets in reverse (and I love print stylesheets). A funky little bridge between the physical and the digital. I just wish they weren’t so opaque: you never know if scanning that QR code will actually take you to the promised resource, or if you’re about to rickroll yourself.

Telephone numbers kind of fall into the same category as SMS, but with the added option of voice. I’ve always found the prospect of doing something with, say, Twilio’s API more interesting than building something inside a walled garden like Facebook Messenger or Alexa.

I know very little about chat apps or voice apps, but I don’t think there’s a cross-platform format that works with different products, right? I imagine it’s like the situation with native apps which require a different codebase for each app store and operating system. And so there’s a constant stream of technologies that try to fulfil the dream of writing once and running everywhere: React Native, Flutter.

They’re trying to solve a very clear and obvious problem: writing the same app more than once is really wasteful. But that’s the nature of the game when it comes to runtime-specific apps. The only alternative is to either deliberately limit your audience …or apply the principle of least power/most availability.

The wastefulness of having to write the same app for multiple platforms isn’t the only thing that puts me off making native apps. The exclusivity works in two directions. There’s the exclusive nature of the runtime that requires a bespoke codebase. There’s also the exclusive nature of the app store. It feels like a return to shelves of packaged software with strict system requirements. You can’t just walk in and put your software on the shelf. That’s the shopkeeper’s job.

There is no shopkeeper for the World Wide Web.

Monday, March 29th, 2021

Season two of the Clearleft podcast

Season two of the Clearleft podcast is in the can. Taking a step back and looking at the six episodes, I think it turned out very well indeed.

Episode One

Design Leadership. Lots of smart people in this one. And I like that the source material is a real mix: conference talks, a roundtable discussion, and an interview.

Episode Two

Employee Experience Design. More of a deep dive than a broad overview. It’s pretty much a two-hander from Chris and Katie.

Episode Three

Accessibility. This one got a lot of attention, and rightly so in my opinion. It’s got three excellent contributors: Laura, Léonie, and Cassie. My job was to get out of the way and string their knowledge bombs together.

Episode Four

Prototyping. I had three good stories to work with from Benjamin, Lorenzo, and Trys. Then at the last minute I was able to add an interview with Adekunle which ties the whole thing up nicely.

Episode Five

Diversity and Inclusion. I think this might be the highlight of the season. Again, it’s got a mix of source material from conference talks and interviews. The quality of the contributions is exceptionally good. Once again, I found my job was to mostly get out of the way and line things up so they flowed well.

Episode Six

Remote Work. I wish I could see that it was my perfect planning that led to this episode being released exactly one year on from the start of lockdown. But really it was just a very fortunate coincidence. It did give this episode some extra resonance though. And I like that the final episode of the season has the widest range of contributors. It’s like the whole cast came back for the season finale.

I also wrote a bit about what I did behind the scenes for each episode of this season:

  1. Design Leadership
  2. Employee Experience Design
  3. Accessibility
  4. Prototyping
  5. Diversity and Inclusion
  6. Remote Work

My sincerest thanks to everyone who contributed to this season of the Clearleft podcast, especially everyone outside Clearleft who kindly agreed to be interviewed: Temi, Laura, Léonie, Adekunle, Rifa, and Elaine.

The Clearleft podcast will take a little break now and so will I. But I’m already thinking about topics for the next season. I think I’m starting to get a feel for what’s working, so you can expect another six-episode season down the line.

To make sure you don’t miss the next season, I recommend subscribing to the RSS feed, or you can subscribe on Apple, Spotify, Google, and anywhere else that dispenses podcasts.

Saturday, March 27th, 2021

Five decades

Phil turned 50 around the same time as I did. He took the opportunity to write some half-century notes. I thoroughly enjoyed reading them and it got me thinking about my own five decades of life.

0–10

A lot happened in the first few years. I was born in England but my family moved back to Ireland when I was three. Then my father died not long after that. I was young enough that I don’t really have any specific memories of that time. I have hazy impressionistic images of London in my mind but at this point I don’t know if they’re real or imagined.

10–20

Most of this time was spent being a youngster in Cobh, county Cork. All fairly uneventful. Being a teenage boy, I was probably a dickhead more than I realised at the time. It was also the 80s so there was a lot of shittiness happening in the background: The Troubles; Chernobyl; Reagan and Thatcher; the constant low-level expectation of nuclear annihilation. And most of the music was terrible—don’t let anyone tell you otherwise.

20–30

This was the period with the most new experiences. I started my twenties by dropping out of Art College in Cork and moving to Galway to be a full-time slacker. I hitch-hiked and busked around Europe. I lived in Canada for six months. Eventually I ended up in Freiburg in southern Germany where I met Jessica. The latter half of this decade was spent there, settling down a bit. I graduated from playing music on the street to selling bread in a bakery to eventually making websites. Before I turned 30, Jessica and I got married.

30–40

We move to Brighton! I continue to make websites and play music with Salter Cane. Half way through my thirties I co-found Clearleft with Andy and Rich. I also start writing books and speaking at conferences. I find that not only is this something I enjoy, but it’s something I’m actually good at. And it gives me the opportunity to travel and see more of the world.

40–50

It’s more of the same for the next ten years. More Clearleft, more writing, more speaking and travelling. Jessica and I got a mortgage on a flat at the start of the decade and exactly ten years later we’ve managed to pay it off, which feels good (I don’t like having any debt hanging over me).

That last decade certainly feels less eventful than, say, that middle decade but then, isn’t that the way with most lives? As Phil says:

If my thirties went by more quickly than my twenties, my forties just zipped by.

You’ve got the formative years in your 20s when you’re trying to figure yourself out so you’re constantly dabbling in a bit of everything (jobs, music, drugs, travel) and then things get straighter. So when it comes to memories, your brain can employ a more rigorous compression algorithm. Instead of storing each year separately, your memories are more like a single year times five or ten. And so it feels like time passes much quicker in later life than it did in those more formative experimental years.

But experimentation can be stressful too—“what if I never figure it out‽” Having more routine can be satisfying if you’re reasonably confident you’ve chosen a good path. I feel like I have (but then, so do most people).

Now it’s time for the next decade. In the short term, the outlook is for more of the same—that’s the outlook for everyone while the world is on pause for The Situation. But once that’s over, who knows? I intend to get back to travelling and seeing the world. That’s probably more to do with being stuck in one place for over a year than having mid-century itchy feet.

I don’t anticipate any sudden changes in lifestyle or career. If anything, I plan to double down on doing things I like and saying “no” to any activities I now know I don’t like. So my future will almost certainly involve more websites, more speaking, maybe more writing, and definitely more Irish traditional music.

I feel like having reached the milestone of 50, I should have at least a few well-earned pieces of advice to pass on. The kind of advice I wish I had received when I was younger. But I’ve racked my brains and this is all I’ve got:

Never eat an olive straight off the tree. You know this already but maybe part of your mind thinks “how bad can it be really?” Trust me. It’s disgusting.

Tuesday, March 23rd, 2021

Service worker weirdness in Chrome

I think I’ve found some more strange service worker behaviour in Chrome.

It all started when I was checking out the very nice new redesign of WebPageTest. I figured while I was there, I’d run some of my sites through it. I passed in a URL from The Session. When the test finished, I noticed that the “screenshot” tab said that something was being logged to the console. That’s odd! And the file doing the logging was the service worker script.

I fired up Chrome (which isn’t my usual browser), and started navigating around The Session with dev tools open to see what appeared in the console. Sure enough, there was a failed fetch attempt being logged. The only time my service worker script logs anything is in the catch clause of fetching pages from the network. So Chrome was trying to fetch a web page, failing, and logging this error:

The service worker navigation preload request failed with a network error.

But all my pages were loading just fine. So where was the error coming from?

After a lot of spelunking and debugging, I think I’ve figured out what’s happening…

First of all, I’m making use of navigation preloads in my service worker. That’s all fine.
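
If you haven’t used them, navigation preloads let the browser kick off the request for a page in parallel with starting up the service worker, and the service worker then decides what to do with the result. The pattern looks something like this (a simplified sketch rather than my actual service worker script; the offline page URL is a placeholder):

// Switch on navigation preloads when the service worker activates.
addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    if (self.registration.navigationPreload) {
      await self.registration.navigationPreload.enable();
    }
  })());
});

// For page navigations, use the preloaded response if there is one,
// fall back to a regular fetch, and log any failure in the catch clause.
addEventListener('fetch', (event) => {
  if (event.request.mode !== 'navigate') {
    return;
  }
  event.respondWith((async () => {
    try {
      const preloaded = await event.preloadResponse;
      return preloaded || await fetch(event.request);
    } catch (error) {
      console.log(error); // the kind of logging that showed up in WebPageTest
      return caches.match('/offline'); // placeholder for a custom offline page
    }
  })());
});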

Secondly, the website is a progressive web app. It has a manifest file that specifies some metadata, including start_url. If someone adds the site to their home screen, this is the URL that will open.

Thirdly, Google recently announced that they’re tightening up the criteria for displaying install prompts for progressive web apps. If there’s no network connection, the site still needs to return a 200 OK response: either a cached copy of the URL or a custom offline page.
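
Meeting that criterion usually means caching an offline page when the service worker installs. Something along these lines (again, just a sketch; the cache name and URL are placeholders):

addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('static-cache').then((cache) => cache.add('/offline'))
  );
});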

So here’s what I think is happening. When I navigate to a page on the site in Chrome, the service worker handles the navigation just fine. It also parses the manifest file I’ve linked to and checks to see if that start URL would load if there were no network connection. And that’s when the error gets logged.

I only noticed this behaviour because I had specified a query string on my start URL in the manifest file. Instead of a start_url value of /, I’ve set a start_url value of /?homescreen. And when the error shows up in the console, the URL being fetched is /?homescreen.
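
To make that concrete, the relevant bit of the manifest looks something like this (a trimmed-down sketch; everything other than start_url is just illustrative):

{
  "name": "The Session",
  "start_url": "/?homescreen",
  "display": "standalone"
}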

Crucially, I’m not seeing a warning in the console saying “Site cannot be installed: Page does not work offline.” So I think this is all fine. If I were actually offline, there would indeed be an error logged to the console and that start_url request would respond with my custom offline page. It’s just a bit confusing that the error is being logged when I’m online.

I thought I’d share this just in case anyone else is logging errors to the console in the catch clause of fetches and is seeing an error even when everything appears to be working fine. I think there’s nothing to worry about.

Update: Jake confirmed my diagnosis and agreed that the error is a bit confusing. The good news is that it’s changing. In Chrome Canary the error message has already been updated to:

DOMException: The service worker navigation preload request failed due to a network error. This may have been an actual network error, or caused by the browser simulating offline to see if the page works offline: see https://w3c.github.io/manifest/#installability-signals

Much better!