Interaction 19

Right before heading to Geneva to spend the week hacking at CERN, I was in Seattle with a sizable Clearleft contingent to attend Interaction 19, the annual conference put on by the Interaction Design Association.

Ben has rounded up the highlights from my fellow Clearlefties. There are some good talks listed there: John Maeda, Nelly Ben Hayoun, and Jon Bell were thoroughly enjoyable. Some other talks were just okay, and there was one talk, by IXDA president Alok Nandi, that was almost impressive in how rambling and incoherent it was. It was like being in a scene from Silicon Valley. I remember clapping at the end; not out of appreciation, but out of relief.

If truth be told, Interaction 19 had about a day’s worth of really great content …spread out over three days. To be fair, that’s par for the course. When we went to Interaction 17 in New York, the hit/miss ratio was about the same:

There were some really good talks at the event, but alas, the multi-track format made it difficult to see all of them. Continuous partial FOMO was the order of the day.

And as I said at the time:

To be honest, the conference was only part of the motivation for the trip. Spending a week in New York with a gaggle of Clearlefties was its own reward.

So I’m willing to cut Interaction 19 a lot of slack. Even if quite a few of the talks were just so-so, getting to hang with Clearlefties in Seattle during snowmageddon was a lot of fun (and you’ll be pleased to hear that we didn’t even resort to cannibalism to survive).

But while the content of the conference was fair to middling, the organisation of it was a shambles:

Imagine the Fyre Festival but in downtown Seattle in winter. Welcome to @ixdconf. #ixd19

They sold more tickets than there were seats. I ended up watching the first morning’s keynotes being streamed to a screen in a conference room in a different building.

Now, I’ve been at events with keynotes that have overflow rooms—South by Southwest does this. But that’s at a different scale. This is a conference with a known number of attendees, each one of them spending over a thousand dollars to attend. I’m pretty sure that a first-come, first-served policy isn’t the best way of treating those attendees.

Anyway, here’s what I submitted for that round-up of the best talks, but which, for reasons of prudence, was omitted from the final post:

I really enjoyed the keynote by Liz Jackson on inclusive design. I would’ve enjoyed it even more if I could’ve seen it in person. Instead I watched it live-streamed to a meeting room two buildings over because the conference sold more tickets than they had seats for. This was after queueing in the cold for registration. So I feel like I learned a lot from Interaction 19 …about how not to organise a conference.

Still, as Ben notes:

We all enjoyed ourselves thoroughly, despite best efforts by the West Coast snow to disrupt the entire city.

I’m going to be back in Seattle in just under two weeks for An Event Apart. Now that’s a conference! It runs like a well-oiled machine, and every talk in its single track has been curated for excellence …with one exception.

Timelines of the web

Recreating the original WorldWideWeb browser was an exercise in digital archeology. With a working NeXT machine in the room, Kimberly was able to examine the source code for the first ever browser and discover a treasure trove within. Like this gem in HTUtils.h:

#define TCP_PORT 80 /* Allocated to http by Jon Postel/ISI 24-Jan-92 */

Sure enough, by June of 1992 port 80 was documented as being officially assigned to the World Wide Web (Gopher got port 70). Jean-François Groff—who worked on the World Wide Web project with Tim Berners-Lee—told us that this was a moment they were very pleased about. It felt like this project of theirs was going places.

Jean-François also told us that the WorldWideWeb browser/editor was kind of like an advanced prototype. The idea was to get something up and running as quickly as possible. Well, the NeXT operating system had a very robust Text Object, so the path of least resistance for Tim Berners-Lee was to take the existing word-processing software and build a hypertext component on top of it. Likewise, instead of creating a brand new format, he used the existing SGML format and added one new piece: linking with A tags.

So the WorldWideWeb application was kind of like a word processor and document viewer mashed up with hypertext. Ted Nelson complains to this day that the original sin of the web was that it borrowed this page-based metaphor. But Nelson’s Project Xanadu, originally proposed in 1974, wouldn’t become a working reality until 2014—a gap of forty years. Whereas Tim Berners-Lee proposed his system in March 1989 and had working code within a year. There’s something to be said for being pragmatic and working with what you’ve got.

The web was also a mashup of ideas. Hypertext existed long before the web—Ted Nelson coined the term in 1963. There were conferences and academic discussions devoted to hypertext and hypermedia. But almost all the existing hypertext systems—including Tim Berners-Lee’s own ENQUIRE system from the early 80s—were confined to a local machine. Meanwhile networked computers were changing everything. First there was the ARPANET, then the internet. Tim Berners-Lee’s ambitious plan was to mash up hypertext with networks.

Going into our recreation of WorldWideWeb at CERN, I knew I wanted to convey this historical context somehow.

The World Wide Web officially celebrates its 30th birthday in March of this year. It’s kind of an arbitrary date: it’s the anniversary of the publication of Information Management: A Proposal. Perhaps a more accurate date would be the day the first website—and first web server—went online. But still. Let’s roll with this date of March 12, 1989. I thought it would be interesting not only to look at what’s happened between 1989 and 2019, but also to look at what happened between 1959 and 1989.

So now I’ve got two time cones that converge in the middle: 1959 – 1989 and 1989 – 2019. For the first time period, I made categories of influences: formats, hypertext, networks, and computing. For the second time period, I catalogued notable results: browsers, servers, and the evolution of HTML.

I did a little bit of sketching and quickly realised that these converging timelines could be represented somewhat like particle collisions. Once I had that idea in my head, I knew how I would be spending my time during the hack week.

Rather than jumping straight into the collider visualisation, I took some time to make a solid foundation to build on. I wanted to be sure that the timeline itself would be understandable even if it were, say, viewed in the first ever web browser.

I marked up each timeline as an ordered list of h-events:

<li class="h-event y1968">
  <a href="https://en.wikipedia.org/wiki/NLS_%28computer_system%29" class="u-url">
    <time class="dt-start" datetime="1968-12-09">1968</time>
    <abbr class="p-name" title="oN-Line System">NLS</abbr>
  </a>
</li>

With the markup in place, I could concentrate on making it look halfway decent. For small screens, the layout is very basic—just a series of lists. When the screen gets wide enough, I lay those lists out horizontally, one on top of the other. In this view, you can more easily see when events coincide. For example, ENQUIRE, Usenet, and Smalltalk all happen in 1980. But the real beauty comes when the screen is wide enough to display everything at once. You can see how there was an explosion of activity in the early 90s. In 1994 alone, we get the release of Netscape Navigator, the creation of HTTPS, and the launch of Amazon.com.

The whole thing is powered by CSS transforms and positioning. Each year on a timeline has its own class that gets moved to the correct chronological point using calc(). I wanted to use translateX() but I couldn’t get the maths to work for that, so I had to use plain ol’ left and right:

.y1968 {
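  /* (1968 - 1959) years along the 30-year span, minus the 5em width of the event */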
  left: calc((1968 - 1959) * (100%/30) - 5em);
}

For events before 1989, it’s the distance of the event from 1959. For events after 1989, it’s the distance of the event from 2019:

.y2014 {
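  /* (2019 - 2014) years back from the right edge, minus the 5em width of the event */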
  right: calc((2019 - 2014) * (100%/30) - 5em);
}

(Each h-event has a width of 5em so that’s where the extra bit at the end comes from.)

I had to do some tweaking for legibility: bunches of events happening around the same time period needed to be separated out so that they didn’t overlap too much.

As a finishing touch, I added a few little transitions when the page loaded so that the timeline fans out from its centre point.
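
One way to get that kind of effect is a little keyframe animation. This is only a rough sketch (the .timeline class name is assumed, and the real page may well trigger its transitions on load differently):

@keyframes fan-out {
  from { transform: scaleX(0); }
  to   { transform: scaleX(1); }
}

.timeline {
  /* each timeline scales out horizontally from its centre as the page loads */
  animation: fan-out 0.5s ease-out;
}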

Et voilà!

Progressive enhancement. Marking up (and styling) an interactive timeline that looks good in a modern browser and still works in the first ever web browser.

I fiddled with the content a bit after peppering Robert Cailliau with questions over lunch. And I got some very valuable feedback from Jean-François. Some examples he provided:

1971: Unix man pages, one of the first instances of writing documents with a markup language that is interpreted live by a parser before being presented to the user.

1980: Usenet News, because it was THE everyday discussion medium by the time we created the web technology, and the Web first embraced news as a built-in information resource, then various platforms built on the web rendered it obsolete.

1982: Literary Machines, Ted Nelson’s book which was on our desk at all times

I really, really enjoyed building this “collider” timeline. It was a chance for me to smash together my excitement for web history with my enjoyment of using the raw materials of the web: HTML and CSS in this case.

The timeline pales in comparison to the achievement of the rest of the team in recreating the WorldWideWeb application, but I was just glad to be able to contribute a little something to the project.

Hello WorldWideWeb.

Ch-ch-ch-changes

It’s browser updatin’ time! Firefox 65 just dropped. So did Chrome 72. Safari 12.1 is shipping with iOS 12.2.

It’s interesting to compare the release notes for each browser and see the different priorities reflected in them (this is another reason why browser diversity is A Good Thing).

A lot of the Firefox changes are updates to dev tools; they just keep getting better and better. In fact, I’m not sure “dev tools” is the right word for them. With their focus on layout, typography, and accessibility, “design tools” might be a better term.

Oh, and Firefox is shipping support for some CSS properties that really help with print style sheets, so I’m disproportionately pleased about that.
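
By way of illustration, and without assuming anything about which specific properties landed in this release, this is the sort of print-specific CSS I mean:

@media print {
  figure,
  table {
    /* keep figures and tables from being split across two pages */
    break-inside: avoid;
  }
  h2 {
    /* don't strand a heading at the bottom of a page */
    break-after: avoid;
  }
}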

In Safari’s changes, I’m pleased to see that the datalist element is finally getting implemented. I’ve been a fan of that element for many years now. (Am I a dork for having favourite HTML elements? Or am I a dork for even having to ask that question?)

And, of course, it wouldn’t be a Safari release without a new made up meta tag. From the people who brought you such hits as viewport and apple-mobile-web-app-capable, comes …supported-color-schemes (Apple likes to make up meta tags almost as much as Google likes to make up rel values).

There’ll be a whole bunch of improvements in how progressive web apps will behave once they’ve been added to the home screen. We’ll finally get some state persistence if you navigate away from the window!

Updated the behavior of websites saved to the home screen on iOS to pause in the background instead of relaunching each time.

Maximiliano Firtman has a detailed list of the good, the bad, and the “not sure yet if good” for progressive web apps on iOS 12.2 beta. Thomas Steiner has also written up the progress of progressive web apps in iOS 12.2 beta. Both are published on Ev’s blog.

At first glance, the release notes for Chrome 72 are somewhat paltry. The big news doesn’t even seem to be listed there. Maximiliano Firtman again:

Chrome 72 for Android shipped the long-awaited Trusted Web Activity feature, which means we can now distribute PWAs in the Google Play Store!

Very interesting indeed! I’m not sure if I’m ready to face the Kafkaesque process of trying to add something to the Google Play Store just yet, but it’s great to know that I can. Combined with the improvements coming in iOS 12.2, these are exciting times for progressive web apps!

New Adventures 2019

My trip to Nottingham for the New Adventures conference went very well indeed.

First of all, I had an all-day workshop to run. I was nervous. Because I no longer prepare slides for workshops—and instead rely on exercises and discussions—I always feel like I’m winging it. I’m not winging it, but without the security blanket of a slide deck, I don’t have anything to fall back on.

As it turned out, I needn’t have worried. The workshop went great. Well, I thought it went great but you’d really have to ask the attendees to know for sure. One of the workshop participants, Westley Knight, wrote about his experience:

The workshop itself was fluid enough to cater to the topics that the attendees were interested in; from over-arching philosophy to technical detail around service workers and new APIs. It has helped me to understand that learning in this kind of environment doesn’t have to be rigorously structured, and can be shaped as the day progresses.

(By the way, if you’d like me to run this workshop at your company, get in touch.)

With the workshop done, it was time for me to freak out fully about my conference talk. I was set to open the show. No pressure.

Actually, I felt pretty damn good about what I had been preparing for the past few months (it takes me aaages to put a talk together), but I always get nervous about presenting new material—until I’ve actually given the talk in front of a real audience, I don’t actually know if it’s any good or not.

Clare was speaking right after me, but she was having some technical issues. It’s funny; as soon as she had a problem, I immediately switched modes from conference speaker to conference organiser. Instead of being nervous, I flipped into being calm and reassuring, getting Clare’s presentation—and fonts—onto my laptop, and making sure her talk would go as smoothly as possible (it did!).

My talk went down well. The audience was great. Everyone paid attention, laughed along with the jokes, and really listened to what I was trying to say. For a speaker, you can’t ask for better than that. And people said very nice things about the talk afterwards. Sam Goddard wrote about how it resonated with him.

Wearing my eye-watering loud paisley shirt on stage at New Adventures.

You can peruse the slides from my presentation, but they make very little sense out of context. Video of the talk is forthcoming, though.

The advantage to being on first was that I got my talk over with at the start of the day. Then I could relax and enjoy all the other talks. And enjoy them I did! I think all of the speakers were feeling the same pressure I was, and everybody brought their A-game. There were some recurring themes throughout the day: responsibility; hope; diversity; inclusion.

So New Adventures was already an excellent event by the time we got to Ethan, who was giving the closing talk. His talk elevated the day into something truly sublime.

Look, I could gush over how good Ethan’s talk was, or try to summarise it, but there’s really no point. I’ll just say that I felt the same sense of being present at something genuinely important that I felt when I was in the room for his original responsive web design talk at An Event Apart back in 2010. When the video is released, you really must watch it. In the meantime, you can read through the articles and books that Ethan cited in his presentation.

New Adventures 2019 was worth attending just for that one talk. I was very grateful I had the opportunity to attend, and I still can’t quite believe that I also had the opportunity to speak.

Building links

In just over a week, I’ll be giving the opening talk at the New Adventures conference in Nottingham. I’ll be giving a workshop the day before too. There are still tickets available for both.

I have to admit, I’m kind of nervous about this talk. It’s been quite a while since the last New Adventures, but it’s always had quite the cachet. I think I went to most of them. It’s quite strange—and quite an honour—to shift gears from attendee to speaker.

The talk I’ll be giving is called Building. That might be a noun. That might be a verb. You decide:

Every new medium looks to what has come before for guidance. Web design has taken cues from centuries of typography and graphic design. Web development has borrowed metaphors and ideas from the world of architecture. Let’s take a tour of some of the most influential ideas from architecture that have crossed over into the web, from pattern languages to responsive design. Together we’ll uncover how to build resilient, performant, accessible and beautiful structures that work with the grain of the materials of the web.

This talk builds upon the talk I gave at last year’s An Event Apart called The Way Of The Web. It also reflects many of the ideas in Resilient Web Design. When I gave a run-through of the talk at Clearleft last week, Andy called it a “greatest hits.” For a while there, I was feeling guilty about retreading some ground I’ve covered in previous talks and writings. Then I realised it was pretty arrogant of me to think that anyone in the audience would be familiar with any of it.

Besides, I’ve got a whole new avenue of exploration in this talk. It’s about language and metaphor—how we talk about what we do on the web. I’ve just finished giving another run-through at the Clearleft studio and I’m feeling pretty good about it. That’s good, because I find that giving a talk in a small room to a handful of colleagues is way more stressful than giving a talk to hundreds of people at a conference.

Just as I put together links related to last year’s talk, I figured I’d provide some hyperlinks for anyone interested in the topics raised in this new talk…

Books

Articles

Audio

Writing for hiring

Cassie joined Clearleft as a junior front-end developer last year. It’s really wonderful having her around. It’s a win-win situation: she’s enthusiastic and eager to learn; I’m keen to help her skill up in any way I can. And it’s working out great for the company—she has already demonstrated that she can produce quality HTML and CSS.

I’m very happy about Cassie’s success, not just on a personal level, but also from a business perspective. Hiring people into junior roles—when you’ve got the time and ability to train them—is an excellent policy. Hiring Charlotte back in 2014 was Clearleft’s first foray into hiring for a junior front-end dev position and it was a huge success. Cassie is demonstrating that it wasn’t just a fluke.

Alas, we can’t only hire junior developers. We’ve got a lot of work in the pipeline right now and we’re going to need a full-time seasoned developer who can hit the ground running. That’s why Clearleft is recruiting for a senior front-end developer.

As lead developer, Danielle will make the hiring decision, but because she’s so busy on project work right now—hence the need to hire more people—I’m trying to help her out any way I can. I offered to write the job description.

Seeing as I couldn’t just write “A clone of Danielle, please”, I had to think about what makes for a great front-end developer who uses their experience wisely. But I didn’t want to create a list of requirements, and I certainly didn’t want to create a list of specific technologies.

My first instinct was to look at other job ads and take my cue from them. But, let’s face it, most job ads are badly written, and prone to turning into laundry lists. So I decided to just write like I normally would. You know, like a human.

Here’s what I wrote. I hope it’s okay. I don’t really have much to compare it to, other than what I don’t want it to be.

Have a read of it and see what you think. And if you’re an experienced front-end developer who’d like to work by the seaside, you should apply for the role.

Code print

You know what I like? Print stylesheets!

I mean, I’m not a huge fan of trying to get the damn things to work consistently—thanks, browsers—but I love the fact that they exist (although I’ve come across a worrying number of web developers who weren’t aware of their existence). Print stylesheets are one more example of the assumption-puncturing nature of the web: don’t assume that everyone will be reading your content on a screen. News articles, blog posts, recipes, lyrics …there are many situations where a well-considered print stylesheet can make all the difference to the overall experience.

You know what I don’t like? QR codes!

It’s not because they’re ugly, or because they’ve been over-used by the advertising industry in completely inappropriate ways. No, I don’t like QR codes because they aren’t an open standard. Still, I must grudgingly admit that they’re a convenient way of providing a shortcut to a URL (albeit a completely opaque one—you never know if it’s actually going to take you to the URL it promises or to a Rick Astley video). And now that the parsing of QR codes is built into iOS without the need for any additional application, the barrier to usage is lower than ever.

So, much as I might grit my teeth, QR codes and print stylesheets make for good bedfellows.

I picked up a handy tip from a Smashing Magazine article about print stylesheets a few years back. You can use the combination of @media print and generated content to provide a QR code for the URL of the page being printed out. Google’s Chart API provides a really handy shortcut for generating QR codes:

https://chart.googleapis.com/chart?cht=qr&chs=150x150&chl=http://example.com
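
Here’s a rough sketch of how that combination can fit together. The selector and the hard-coded chart URL are purely illustrative; on a real site you’d generate that chart URL for each page:

@media print {
  main::after {
    /* append a QR code for this page's URL when the page is printed */
    content: url("https://chart.googleapis.com/chart?cht=qr&chs=150x150&chl=http://example.com");
    display: block;
    margin-top: 1em;
  }
}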

Except that there’s no telling how long that will continue to work. Google being Google, they’ve deprecated the simple image chart API in favour of the over-engineered JavaScript alternative. So just as I recently had to migrate all my maps over to Leaflet when Google changed their Maps API from under the feet of developers, the clock is ticking on when I’ll have to find an alternative to the Image Charts API.

For now, I’ve got the QR code generation happening on The Session for individual discussions, events, recordings, sessions, and tunes. For the tunes, there’s also a separate URL for each setting of a tune, specifically for printing out. I’ve added a QR code there too.

Experimenting with print stylesheets and QR codes.

I’ve been thinking about another potential use for QR codes. I’m preparing a new talk for An Event Apart Seattle. The talk is going to be quite practical—for a change—and I’m going to be encouraging people to visit some URLs. It might be fun to include the biggest possible QR code on a slide.

I’d better generate the images before Google shuts down that API.

Books I read in 2018

I read twenty books in 2018, which is exactly the same number as I read in 2017. Reflecting on that last year, I said “It’s not as many as I hoped.” It does seem like a meagre amount, but in my defence, some of the books I read this year were fairly hefty tomes.

I decided to continue my experiment from last year of alternating fiction and non-fiction books. That didn’t quite work out, but it makes for a good guiding principle.

In ascending reading order, these are the books I read in 2018:

A Fire Upon The Deep by Vernor Vinge

★★★☆☆

I started this towards the end of 2017 and finished it at the start of 2018. A good sci-fi romp, but stretched out a little bit long.

Time Travel: A History by James Gleick

★★★★☆

I really enjoyed this, but then, that’s hardly a surprise. The subject matter is tailor made for me. I don’t think this quite matches the brilliance of Gleick’s The Information, but I got a real kick out of it. A book dedicated to unearthing the archeology of a science-fiction concept is a truly fascinating idea. And it’s not just about time travel, per se—this is a meditation on the nature of time itself.

Traction by Gino Wickman

Andy was quite taken with this management book and purchased multiple copies for the Clearleft leadership team. I’ll refrain from rating it because it was more like a homework assignment than a book I would choose to read. It crystallises some good organisational advice into practical steps, but it probably could’ve been quite a bit shorter.

Provenance by Ann Leckie

★★★☆☆

It feels very unfair but inevitable to compare this to Ann Leckie’s amazing debut Imperial Radch series. It’s not in quite the same league, but it’s also not trying to be. This standalone book has a lighter tone. It’s a rollicking good sci-fi procedural. It may not be as mind-blowingly inventive as Ancillary Justice, but it’s still a thoroughly enjoyable read.

Visions, Ventures, Escape Velocities: A Collection of Space Futures edited by Ed Finn and Joey Eschrich, with guest editor Juliet Ulman

★★★☆☆

This book is free to download so it’s rather excellent value for money. It alternates sci-fi short stories with essays. Personally, I would skip the essays—they’re all a bit too academic for my taste. But some of these stories are truly excellent. There’s a really nice flow to the collection: it begins in low Earth orbit, then expands out to Mars, the asteroid belt, and beyond. Death on Mars by Madeline Ashby was a real standout for me.

The Best of Richard Matheson by Richard Matheson, edited by Victor LaValle

★★★★☆

For some reason, I was sent a copy of this book by an editor at Penguin Classics. I have no idea why, but thank you, Sam! This turned out to be a lot of fun. I had forgotten just how many classics of horror and sci-fi are the work of Richard Matheson. He probably wrote your favourite Twilight Zone episode. There’s a real schlocky enjoyment to be had from snacking on these short stories, occasionally interspersed with genuinely disturbing moments and glimpses of beauty.

Close To The Machine: Technophilia And Its Discontents by Ellen Ullman

★★★☆☆

Lots of ’90s feels in this memoir. A lot of this still resonates today. It’s kind of fascinating to read it now with the knowledge of how this whole internet thing would end up going.

Gnomon by Nick Harkaway

★★★★☆

This gripped me from the start, and despite its many twisty strands, it managed to keep me with it all the way through. Maybe it’s a bit longer than it needs to be, and maybe some of the diversions don’t entirely work, but it makes up for that with its audaciousness. I still prefer The Gone-Away World, but any Nick Harkaway book is a must-read.

Hidden Figures by Margot Lee Shetterly

★★★★☆

Terrific stuff. If you’ve seen the movie, you’ve got about one tenth of the story. The book charts a longer arc and provides much deeper social and political context.

Dawn by Octavia Butler

★★★☆☆

This is filled with interesting ideas, but the story never quite gelled for me. I’m not sure if I should continue with the rest of the Lilith’s Brood series. But there’s something compelling and unsettling in here.

Sapiens: A Brief History Of Humankind by Yuval Noah Harari

★★☆☆☆

Frustratingly inconsistent. Here’s my full review.

The Fifth Season by N.K. Jemisin

★★★★☆

The Obelisk Gate by N.K. Jemisin

★★★☆☆

The Stone Sky by N.K. Jemisin

★★★☆☆

I devoured these books back-to-back. The Fifth Season was terrific—packed to the brim with inventiveness. But neither The Obelisk Gate nor The Stone Sky quite did it for me. Maybe my expectations were set too high by that first installment. But The Broken Earth is still a fascinating and enjoyable series.

Programmed Inequality by Marie Hicks

I was really looking forward to this one, but I found its stiff academic style hard to get through. I still haven’t finished it. But I figure if I could read Sapiens through to the end, I can certainly manage this. The subject matter is certainly fascinating, and the research is really thorough, but I’m afraid the book is showing its thesis roots.

The Power by Naomi Alderman

★★★☆☆

This plays out its conceit well, and it’s a fun read, but it’s not quite a classic. It feels more like a Neil Gaiman or Lauren Beukes page-turner than, say, a Margaret Atwood exploration. Definitely worth a read, though.

New York 2140 by Kim Stanley Robinson

★★★★☆

The world-building (or maybe it’s world rebuilding) is terrific. But once again, as is often the case with Kim Stanley Robinson, I find the plot to be lacking. This is not in the same league as Aurora. It’s more like 2312-on-sea. It’s frustrating. I’m torn between giving it three stars or four. I’m going to be generous because even though it’s not the best Kim Stanley Robinson book, it contains some of his best writing. There are passages that are breathtakingly good.

A Thread Across The Ocean by John Steele Gordon

★★★★☆

After (temporarily) losing my library copy of New York 2140, I picked this up in a bookstore in Charlottesville so I’d have something to read during my stay there. I was very glad I did. I really, really enjoyed this. It’s all about the transatlantic telegraphic cable, so if that’s your thing—as it is mine—you’re going to enjoy this. It makes a great companion piece to Tom Standage’s The Victorian Internet. Come for the engineering, stay for the nautical tales of derring-do.

Borne by Jeff VanderMeer

★★★★☆

Not as disturbing as the Southern Reach Trilogy, but equally unsettling in its own way. Shades of Oryx and Crake, but in a more fantastically surreal setting.

The Airs Of Earth by Brian Aldiss

★★★☆☆

A good collection of short stories from the master of sci-fi. I’ve got a backlog of old pulpy paperback Aldiss collections like this that make for good snackfood for the mind.

Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian and Tom Griffiths

A Christmas present from my brother-in-law. I just cracked this open, so you’ll have to come back next year to find out how it fared.

Alright. Now it’s time to pick the winners.

I think the best fiction book I read this year was Nick Harkaway’s Gnomon.

For non-fiction, it’s a tough call. I really enjoyed Hidden Figures and A Thread Across The Ocean, but I think I’m going to have to give the top spot to James Gleick’s Time Travel: A History.

But there were no five star books this year. Maybe that will change in 2019. And maybe I’ll read more books next year, too. We’ll see.

In 2017, seven of the twenty books I read were by women. In 2018, it was nine out of twenty (not counting anthologies). That’s better, but I want to keep that trajectory going in 2019.

Cindy Li

2005 feels like a pivotal year in my memory. That’s the year that Rich, Andy and I formed Clearleft. It was also the year that the three of us went to South by Southwest for the first time. That was amazing. Not because of the event itself, but because of the people. I met, hung out with, and formed firm friendships with people whose blogs I had been reading for years—it really was like my RSS reader had come to life. It’s also where I met Cindy for the first time.

Me and Cindy

We ended up hanging out a lot there, and afterwards. She came to England. We met up in Florida (her family is in Jacksonville, not far from St. Augustine, where Jessica’s family is from—in fact, we may well have been in the same St. Augustine pastry shop at the same time before we even met). And of course we’d see each other at conferences …like that one time in San Diego, when she joined me in my first ever karaoke experience (little did I know that she was in on the rick-rolling). Wherever geeks gathered, Cindy was there. Cindy could outgeek all of us, whether it was nerding out about good food or Star Wars. That was until she met her match at the Web Directions North conference in Vancouver in early 2007.

The winter collection

Matt came all the way from England for that conference. I distinctly remember sitting with him on the bus back from the post-conference snowboarding trip to Whistler. He was able to point out all the filming locations from The X Files, Battlestar Galactica, and every other sci-fi TV show. He met Cindy the next day and, of course, they clicked.

Cindy ended up moving to San Francisco, and I’d visit her den of nerdery whenever I was in town. Meanwhile, Matt was crossing the Atlantic at every available opportunity to spend time with Cindy. On one of those trips, they went down to the courthouse and tied the knot.

Given the short notice for the wedding, they decided that they’d have a bigger marriage celebration further down the line. At that year’s South by Southwest, Cindy and Matt took me aside and asked if I’d officiate at their wedding. “But I can’t officiate a wedding!”, I said. “I’ve got no qualifications!” “A-ha!”, they pointed out, “It’s technically not an official marriage ceremony—we’re already married.”

November 6th, 2010.

That’s how I came to give the most important public speaking engagement of my life. It was nerve-wracking and wonderful.

Matt and Cindy

I’ll never forget when Cindy and Matt came to Boston for An Event Apart a few years later. I was so happy to see Cindy that I didn’t even notice the most striking thing about her; after we hugged, she just stared at me and pointed at her belly until the lightbulb went off over my head. Cindy was pregnant!

They had a beautiful baby boy named Apollo. Isn’t that an awesome name?

Jeremy with Apollo.

They managed to match that awesomeness with the naming of their second boy, Orion. Apollo and Orion!

When Cindy was pregnant with Orion, she didn’t have the opportunity to surprise me with the news in person, like she had done with Apollo. She Facetimed me and Jessica to tell us the news. But she had other news to share with us that she didn’t want to be widely known. She had just been diagnosed with cancer.

I don’t really want to talk about that, but just consider what it must have been like to be going through treatment and being pregnant at the same time! Orion is a miracle, and Cindy was the miracle worker.

(The reason I don’t want to talk about Cindy’s cancer is, well, for one, she didn’t want it to be known so I’m still thinking of it as a private matter, but also Cindy could never be defined by how she died, but rather how she lived.)

By this time Cindy and Matt had moved to Pittsburgh. Jessica and I visited when we could. I was there for my birthday last year, and together we recreated delicacies from that pastry shop in Saint Augustine.

Apollo amazed

A lot of my memories of Cindy involve amazing food. Like that time we all went to The French Laundry. This year we made plans to go to Alinea in October. Cindy got reservations. Jessica and I booked our plane tickets. But it wasn’t to be. It became clear that Cindy wasn’t able to travel and that there wasn’t much time left. Instead of a trip to Chicago, we made a trip to Pittsburgh. We were hoping to see Cindy one last time. But she died just a few days before we showed up.

But remember what I said about Cindy being defined by her life, not her death? It’s so, so true. Literally everyone who knew her was a better person for it. Her energy. Her indomitable spirit. She really was truly inspiring. She still inspires me. I know it sounds like a cliché to say that only her body has gone, while her spirit lives on, but in this case, it’s really true. Her spirit is alive in every single person who knew her. And if that isn’t enough of a cliché, I’m going to come right out and say it: I’m a better person for having known Cindy.

What happened to Cindy was so horribly, horribly unfair (did I mention that she didn’t smoke, or drink, or even use bad language?). But there’s one thing that I’m so very grateful for: I’m so, so glad that she had Matt. I always knew that Cindy was amazing—she’s Wonder Woman—but I’ve come to realise that she really did find her match. Matt is Superman. I am in awe of his strength. I cannot imagine what he is going through right now. Like Cindy, he is an inspiration to me.

Cindy is gone, but that love between Cindy and Matt …that’s forever.

Cindy and Matt

Vienna

Back in December 1997, when Jessica and I were living in Freiburg, Dan came to visit. Together, we boarded a train east to Vienna. There we would ring in the new year to the sounds of the Salonorchester Alhambra, the band that Dan’s brother Andrew was playing in (and the band that would later be my first paying client when I made their website—I’ve still got the files lying around somewhere).

That was a fun New Year’s ball …although I remember my mortification when we went for goulash beforehand and I got a drop on the pristine tux that I had borrowed from Andrew.

My other memory of that trip was going to the Kunsthistorisches Museum to see the amazing Bruegel collection. It’s hard to imagine that ever being topped, but then this year, they put together a “once in a lifetime” collection, gathering even more Bruegel masterpieces together in Vienna.

Jessica got the crazy idea in her head that we could go there. In a day.

Looking at the flights, it turned out to be not such a crazy idea after all. Sure, it meant an early start, but it was doable. We booked our museum tickets, and then we booked plane tickets.

That’s how we ended up going to Vienna for the day this past Monday. It was maybe more time than I’d normally like to spend in airports in a 24 hour period, but it was fun. We landed, went into town for a wiener schnitzel, and then it was off to the museum for an afternoon of medieval masterpieces. Hunters in the Snow, the Tower of Babel, and a newly restored Triumph of Death sent from the Prado were just some of the highlights.

There’s a website to accompany the exhibition called Inside Bruegel. You can zoom in on each painting to see the incredible detail. You can even compare the infrared and x-ray views. Dive in and explore the world of Pieter Bruegel the Elder.

The Battle between Carnival and Lent

Browsers

Microsoft’s Edge browser is going to switch its rendering engine over to Chromium.

I am deflated and disappointed.

There’s just no sugar-coating this. I’m sure the decision makes sound business sense for Microsoft, but it’s not good for the health of the web.

Very soon, the vast majority of browsers will have an engine that’s either Blink or its cousin, WebKit. That may seem like good news for developers when it comes to testing, but trust me, it’s a sucky situation for innovation and agreement. Instead of a diverse browser ecosystem, we’re going to end up with incest and inbreeding.

There’s one shining exception though. Firefox. That browser was originally created to combat the seemingly unstoppable monopolistic power of Internet Explorer. Now that Microsoft are no longer in the rendering engine game, Firefox is once again the only thing standing in the way of a complete monopoly.

I’ve been using Firefox as my main browser for a while now, and I can heartily recommend it. You should try it (and maybe talk to your relatives about it at Christmas). At this point, which browser you use no longer feels like it’s just about personal choice—it feels like part of something bigger; it’s about the shape of the web we want.

Jeffrey wrote that browser diversity starts with us:

The health of Firefox is critical now that Chromium will be the web’s de facto rendering engine.

Even if you love Chrome, adore Gmail, and live in Google Docs or Analytics, no single company, let alone a user-tracking advertising giant, should control the internet.

Andy Bell also writes about browser diversity:

I’ll say it bluntly: we must support Firefox. We can’t, as a community, allow this browser engine monopoly. We must use Firefox as our main dev browsers; we must encourage our friends and families to use it, too.

Yes, it’s not perfect, nor are Mozilla, but we can help them to develop and grow by using Firefox and reporting issues that we find. If we just use and build for Chromium, which is looking likely (cough Internet Explorer monopoly cough), then Firefox will fall away and we will then have just one major engine left. I don’t ever want to see that.

Uncle Dave says:

If the idea of a Google-driven Web is of concern to you, then I’d encourage you to use Firefox. And don’t be a passive consumer; blog, tweet, and speak about its killer features. I’ll start: Firefox’s CSS Grid, Flexbox, and Variable Font tools are the best in the business.

Mozilla themselves came out all guns blazing when they said Goodbye, EdgeHTML:

Microsoft is officially giving up on an independent shared platform for the internet. By adopting Chromium, Microsoft hands over control of even more of online life to Google.

Tim describes the situation as risking a homogeneous web:

I don’t think Microsoft using Chromium is the end of the world, but it is another step down a slippery slope. It’s one more way of bolstering the influence Google currently has on the web.

We need Google to keep pushing the web forward. But it’s critical that we have other voices, with different viewpoints, to maintain some sense of balance. Monocultures don’t benefit anyone.

Andre Alves Garzia writes that while we Blink, we lose the web:

Losing engines is like losing languages. People may wish that everyone spoke the same language, they may claim it leads to easier understanding, but what people fail to consider is that this leads to losing all the culture and way of thought that that language produced. If you are a Web developer smiling and happy that Microsoft might be adopting Chrome, and this will make your work easier because it will be one less browser to test, don’t be! You’re trading convenience for diversity.

I like that analogy with language death. If you prefer biological analogies, it’s worth revisiting this fantastic post by Rachel back in August—before any of us knew about Microsoft’s decision—all about the ecological impact of browser diversity:

Let me be clear: an Internet that runs only on Chrome’s engine, Blink, and its offspring, is not the paradise we like to imagine it to be.

That post is a great history lesson, documenting how things can change, and how decisions can have far-reaching unintended consequences.

So these are the three browser engines we have: WebKit/Blink, Gecko, and EdgeHTML. We are unlikely to get any brand new bloodlines in the foreseeable future. This is it.

If we lose one of those browser engines, we lose its lineage, every permutation of that engine that would follow, and the unique takes on the Web it could allow for.

And it’s not likely to be replaced.

Programming CSS

There’s a worrying tendency for “real” programmers to look down their noses at CSS. It’s just a declarative language, they point out, not a fully-featured programming language. Heck, it isn’t even a scripting language.

That may be true, but that doesn’t mean that CSS isn’t powerful. It’s just powerful in different ways to traditional languages.

Take CSS selectors, for example. At the most basic level, they work like conditional statements. Here’s a standard if statement:

if (condition) {
// code here
}

The condition needs to evaluate to true in order for the code in the curly braces to be executed. Sound familiar?

condition {
// styles here
}

That’s a very simple mapping, but what if the conditional statement is more complicated?

if (condition1 && condition2) {
// code here
}

Well, that’s what the descendant selector does:

condition1 condition2 {
// styles here
}

In fact, we can get even more specific than that by using the child combinator, the sibling combinator, and the adjacent sibling combinator:

  • condition1 > condition2
  • condition1 ~ condition2
  • condition1 + condition2

AND is just one part of Boolean logic. There’s also OR:

if (condition1 || condition2) {
// code here
}

In CSS, we use commas:

condition1, condition2 {
// styles here
}

We’ve even got the :not() pseudo-class to complete the set of Boolean possibilities. Once you add quantity queries into the mix, made possible by :nth-child and its ilk, CSS starts to look Turing complete. I’ve seen people build state machines using the adjacent sibling combinator and the :checked pseudo-class.
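
Here’s one hedged example of a quantity query; the selector is illustrative rather than lifted from any particular project:

/* when a list contains six or more items, every item matches:
   :nth-last-child(n+6) only matches an element with at least five
   siblings after it, and the ~ li part then selects the rest */
li:nth-last-child(n+6),
li:nth-last-child(n+6) ~ li {
  font-size: 0.875em;
}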

Anyway, my point here is that CSS selectors are really powerful. And yet, quite often we deliberately choose not to use that power. The entire raison d’être for OOCSS, BEM, and Smacss is to deliberately limit the power of selectors, restricting them to class selectors only.
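
To make that concrete, here’s a small sketch (the class name is made up) of the same intent expressed both ways:

/* leaning on the power of selectors: any link inside the site navigation */
nav ul li a {
  color: white;
}

/* the BEM-ish alternative: one flat class, applied to each link in the markup */
.site-nav__link {
  color: white;
}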

On the face of it, this might seem like an odd choice. After all, we wouldn’t deliberately limit ourselves to a subset of a programming language, would we?

We would and we do. That’s what templating languages are for. Whether it’s PHP’s Smarty or Twig, or JavaScript’s Mustache, Nunjucks, or Handlebars, they all work by providing a deliberately small subset of features. Some pride themselves on being logic-less. If you find yourself trying to do something that the templating language doesn’t provide, that’s a good sign that you shouldn’t be trying to do it in the template at all; it should be in the controller.

So templating languages exist to enforce simplicity and ensure that the complexity happens somewhere else. It’s a similar story with BEM et al. If you find you can’t select something in the CSS, that’s a sign that you probably need to add another class name to the HTML. The complexity is confined to the markup in order to keep the CSS more straightforward, modular, and maintainable.

But let’s not forget that that’s a choice. It’s not that CSS is inherently incapable of executing complex conditions. Quite the opposite. It’s precisely because CSS selectors (and the cascade) are so powerful that we choose to put guard rails in place.

Prototypes and production

When we do front-end development at Clearleft, we’re usually delivering production code, often in the form of a component library. That means our priorities are performance, accessibility, robustness, and other markers of quality when it comes to web development.

But every so often, we use the materials of front-end development—HTML, CSS, and JavaScript—to produce something that isn’t intended for production. I’m talking about prototyping.

There are plenty of non-code prototyping tools out there, and our designers often reach for them to communicate subtleties like motion design. But when it comes to testing a prototype with real users, it’s hard to beat the flexibility of HTML, CSS, and JavaScript. Load it up in a browser and away you go.

We do a lot of design sprints, where time is of the essence. The prototype we produce on the penultimate day of the sprint definitely won’t be production quality, but it will be good enough to test.

What’s interesting is that—when it comes to prototyping—our usual front-end priorities can and should go out the window. The priority now is speed. If that means sacrificing semantics or performance, then so be it. If I’m building a prototype and I find myself thinking “now, what’s the right class name for this component?”, then I know I’m in the wrong mindset. That question might be valid for production code, but it’s a waste of time for prototypes.

So these two kinds of work require very different attitudes. For production work, quality is key. For prototyping, making something quickly is what matters.

Whereas I would think long and hard about the performance impacts of third-party libraries and frameworks on a public project, I won’t give it a second thought when it comes to a prototype. Throw all the JavaScript frameworks and CSS libraries you want at it (although I would argue that in-browser technologies like CSS Grid have made CSS libraries like Bootstrap less necessary, even for prototyping).

Alternating between production projects and prototyping projects can be quite fun, if a little disorienting. It’s almost like I have to flip a switch in my brain to change tracks.

When a prototype is successful, works great, and tests well, there’s a real temptation to use the prototype code as the basis for the final product. Don’t do this! I’ve made that mistake in the past and it always ends badly. I ended up spending far more time trying to wrangle prototype code to a production level than if I had just started from a clean slate.

Build prototypes to test ideas, designs, interactions, and interfaces …and then throw the code away. The value of a prototype is in answering questions and testing hypotheses. Don’t fall for the sunk cost fallacy when it’s time to switch over into production mode.

Of course it should go without saying that you should never, ever release prototype code into production.

And yet…

More and more live sites seem to be built with a prototyping mindset. Weighty JavaScript frameworks are used regardless of appropriateness. Accessibility, if it’s even considered at all, is relegated to an afterthought. Fragile architectures are employed that rely on first loading and then executing JavaScript in order to render basic content. Developer experience is prioritised over user experience.

Heydon recently highlighted an article that offered this tip for aspiring web developers:

As for HTML, there’s not much to learn right away and you can kind of learn as you go, but before making your first templates, know the difference between in-line elements like span and how they differ from block ones like div.

That’s perfectly reasonable advice …if you’re building a prototype. But if you’re building something for public consumption, you have a duty of care to the end users.

Food and music

Going from Iceland to Greece in a day gave me a mild bit of currency exchange culture shock. Iceland is crazy expensive, especially given the self-immolation of the pound right now. Greece is remarkably cheap. You can eat like a king for unreasonably reasonable prices.

For me, food is one of the great pleasures in life. Trying new kinds of food is one of my primary motivators for travelling. It’s fascinating to me to see the differences—and similarities—across cultures. In many ways, food is like a universal language, but a language that we all speak in different dialects.

Herring. A feast of lamb.

It’s a similar story with music. There’s a fundamental universality in music across cultures, but there’s also a vast gulf of differences.

On my first night in Reykjavik, I wound up at an Irish music session. I know, I know—I sound like such a cliché, going to a foreign country and immediately seeking out something familiar. But I had been invited along by a kind soul who got in touch through The Session after I posted my travel plans there. Luckily for me, there was a brand new session starting that very evening. I didn’t have an instrument, but someone very kindly lent me their banjo and I had a thoroughly enjoyable time playing along with the jigs and reels.

As an added bonus—and you really don’t get to hear this at most trad sessions—there was even a bit of Icelandic singing courtesy of Bára Grímsdóttir. I snatched a little sample of it.

A few nights later I was in a quiet, somewhat smoky tavern in Thessaloniki. There was no Irish music to be found, but the rembetika music played on gorgeous bouzoukis and baglamas was in full flow.

Conferencing

I just wrapped up my last speaking gig of the year. It came at the end of a streak of attending European conferences without speaking at any of them—quite a nice feeling!

I already mentioned that I was in Berlin for the (excellent) Indie Web Camp. That was immediately followed by a one-day Accessibility Club conference. It was really, really good.

I have to say, I was initially apprehensive when I saw the sheer number of speakers on the schedule. I was worried that my attention couldn’t handle it all. But the talks were a mixture of shorter 20 minute presentations, and a few longer 40 minute presentations. That worked really well—the day fairly zipped by. And just in case you think it would be hard to have an entire day devoted to accessibility, the breadth of talks was remarkably diverse. Hats off to a well-organised and well-executed event!

The next day was Beyond Tellerrand. This has my favourite conference format: two days; one track; curated; a mix of design and development (see also An Event Apart and Smashing Conference). Marc’s love and care shines through every pore of the event. I thoroughly enjoyed the talks, and the hanging out with lovely people.

Alas, I had to miss the final afternoon of Beyond Tellerrand to head home to Brighton. I needed to get back for FF Conf. It was excellent, as always. Remy and Julie really give it their all. Remy even stepped in to give a (great) talk himself this year, when a speaker couldn’t make it.

A week later, I went to Iceland for Material. I really enjoyed last year’s inaugural event, and if anything, this year’s topped it. I just love how eclectic and different the talks are, and yet it all weirdly hangs together in a thoughtfully curated way. (Oh, and Remy, when you start to put together the line-up for next year’s FF Conf, be sure to check out Charlotte Dann—her talk at Material was the perfect mix of code and creativity.)

As well as sharing an organiser with Accessibility Club, Material had a similar format—keynote talks from invited presenters, interspersed with shorter talks by locals. The mix was great. I won’t even try to describe the range of topics. I’m not sure I could explain how a conference podium morphed into a bar at the end of one of the talks. I think the best description of Material would be to say it’s like the inside of Brian’s head. In a good way.

I was supposed to be back in Brighton for one night after Material, but the stormy weather kept myself and Jessica in Reykjavik for an extra night. Thanks to Brian’s hospitality, we had a bed for the night.

There followed a long travel day as we made our way from Reykjavik to Gatwick, and then straight on to Thessaloniki, where we spent five days even though we only had the clothes we packed for the brief trip to Iceland. (Yes, we went shopping.)

I was there to speak at Voxxed Days. These events happen in various locations around the world, and just a few weeks ago, I spoke at the one in Bristol. It was …different.

After experiencing so many lovingly crafted events—Accessibility Club, Beyond Tellerrand, FF Conf, and Material—I’m afraid that Voxxed Days Thessaloniki was quite a comedown. It’s not that it was corporate per se—I believe it’s organised by developers for developers—but it felt like it was for people who worked in corporate environments. There were multiple tracks (I’m really not a fan of that), and some great speakers on the line-up like Stephanie and Simona, but the atmosphere felt kind of grim in a David Brentian sort of way. It probably wasn’t helped by the cheeky chappie of an MC who referred to one of the speakers as “darling.”

Anyway, I spoke first thing on the first day and I didn’t end up sticking around long. Normally I don’t speak and run, but I didn’t fancy the vibe of the exhibitor hall with its booth-babesque sales teams. Voxxed Days doesn’t pay its speakers so I didn’t feel any great obligation to hang around. The magnificent food and rembetika music of Thessaloniki was calling.

I just got back from Greece, and that wraps up my conference attending (and speaking) for 2018. I’ve already got a couple of events lined up for 2019. I’m delighted to be speaking at the return of Colly’s New Adventures conference. I’m less delighted about preparing a brand new talk I promised—I’m really feeling the pressure to deliver the goods at such an auspicious event with an intimidatingly superb line-up of speakers.

I’m also going to be preparing a different all-new talk for An Event Apart Seattle in March. For once, I’m going to try to make it somewhat practical and talk about service workers. If you know of any other events that might want a presentation like that in 2019, drop me a line.

Perhaps I will see you in Nottingham or in Seattle. If you’re planning on going to New Adventures, use the discount code ADACTIO10 to get 10% off the price of the conference or workshop ticket. If you’re planning on going to An Event Apart, use the discount code AEAKEITH for $100 off.

Optimise without a face

I’ve been playing around with the newly-released Squoosh, the spiritual successor to Jake’s SVGOMG. You can drag images into the browser window, and eyeball the changes that any optimisations might make.

On a project that Cassie is working on, it worked really well for optimising some JPEGs. But there were a few images that would require a bit more fine-grained control of the optimisations. Specifically, pictures with human faces in them.

I’ve written about this before. If there’s a human face in an image, I open that image in a graphics editing tool like Photoshop, select everything but the face, and add a bit of blur. Because humans are hard-wired to focus on faces, we’ll notice any jaggy artifacts on a face, but we’re far less likely to notice jagginess in background imagery: walls, materials, clothing, etc.

On the face of it (hah!), a browser-based tool like Squoosh wouldn’t be able to optimise for faces, but then Cassie pointed out something really interesting…

When we were both at FFConf on Friday, there was a great talk by Eleanor Haproff on machine learning with JavaScript. It turns out there are plenty of smart toolkits out there, and one of the things they can do is facial recognition. So I wonder if it’s possible to build an in-browser tool with this workflow:

  • Drag or upload an image into the browser window,
  • A facial recognition algorithm finds any faces in the image,
  • Those portions of the image remain crisp,
  • The rest of the image gets a slight blur,
  • Download the optimised image.

Maybe the selecting/blurring part would need canvas? I don’t know.
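
Just to think out loud, a rough sketch of that might look something like this. I’m assuming the experimental FaceDetector interface from the Shape Detection API, which is only available behind a flag in some browsers—a library like face-api.js could stand in for it:

// A rough sketch — FaceDetector is experimental and not widely supported.
async function blurEverythingButFaces(img) {
  const canvas = document.createElement('canvas');
  canvas.width = img.naturalWidth;
  canvas.height = img.naturalHeight;
  const context = canvas.getContext('2d');

  // Start with a slightly blurred version of the whole image.
  context.filter = 'blur(2px)';
  context.drawImage(img, 0, 0);

  // Paint the crisp original back over each detected face.
  context.filter = 'none';
  const faces = await new FaceDetector().detect(img);
  for (const face of faces) {
    const { x, y, width, height } = face.boundingBox;
    context.drawImage(img, x, y, width, height, x, y, width, height);
  }

  // Hand back an optimised JPEG, ready for downloading.
  return new Promise(resolve => canvas.toBlob(resolve, 'image/jpeg', 0.8));
}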

Anyway, I thought this was a brilliant bit of synthesis from Cassie, and now I’ve got two questions:

  1. Does this exist yet? And, if not,
  2. Does anyone want to try building it?

Push without notifications

On the first day of Indie Web Camp Berlin, I led a session on going offline with service workers. This covered all the usual use-cases: pre-caching; custom offline pages; saving pages for offline reading.

But on the second day, Sebastiaan spent a fair bit of time investigating a more complex use of service workers with the Push API.

The Push API is what makes push notifications possible on the web. There are a lot of moving parts—browser, server, service worker—and, frankly, it’s way over my head. But I’m familiar with the general gist of how it works. Here’s a typical flow:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker creates a notification linking to the URL, interrupting the user, and generally adding to the weight of information overload.

Here’s what Sebastiaan wanted to investigate: what if that last step weren’t so intrusive? Here’s the alternate flow he wanted to test:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker fetches the contents of the URL provided in the push message and caches the page. Silently.

It worked.
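
I haven’t seen Sebastiaan’s code, so this is just a minimal sketch of what the service worker side of that flow might look like—the JSON payload shape is my guess:

// In the service worker — a minimal sketch, not Sebastiaan's actual code.
self.addEventListener('push', event => {
  const data = event.data.json(); // assuming a payload like {"url": "/new-article/"}
  event.waitUntil(
    caches.open('pushed-pages').then(cache => cache.add(data.url))
    // Crucially, there's no showNotification() call here —
    // the new page gets cached silently in the background.
  );
});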

I think this could be a real game-changer. I don’t know about you, but I’m very, very wary of granting websites the ability to send me push notifications. In fact, I don’t think I’ve ever given a website permission to interrupt me with push notifications.

You’ve seen the annoying permission dialogues, right?

In Firefox, it looks like this:

Will you allow name-of-website to send notifications?

[Not Now] [Allow Notifications]

In Chrome, it’s:

name-of-website wants to

Show notifications

[Block] [Allow]

But in actual fact, these dialogues are asking for permission to do two things:

  1. Receive messages pushed from the server.
  2. Display notifications based on those messages.

There’s no way to ask for permission just to do the first part. That’s a shame. While I’m very unwilling to grant permission to be interrupted by intrusive notifications, I’d be more than willing to grant permission to allow a website to silently cache timely content in the background. It would be a more calm technology.

Think of the use cases:

  • I grant push permission to a magazine. When the magazine publishes a new article, it’s cached on my device.
  • I grant push permission to a podcast. Whenever a new episode is published, it’s cached on my device.
  • I grant push permission to a blog. When there’s a new blog post, it’s cached on my device.

Then when I’m on a plane, or in the subway, or in any other situation without a network connection, I could still visit these websites and get content that’s fresh to me. It’s kind of like background sync in reverse.

There’s plenty of opportunity for abuse—the cache could get filled with unwanted content. But websites can already do that, and they don’t need to be granted any permissions to do so; simply visiting a website allows it to add multiple files to a cache.

So it seems that the reason for the permissions dialogue is all about displaying notifications …not so much about receiving push messages from the server.

I wish there were a way to implement this background-caching pattern without requiring the user to grant permission to a dialogue that contains the word “notification.”

I wonder if the act of adding a site to the home screen could implicitly grant permission to allow use of the Push API without notifications?

In the meantime, the proposal for periodic synchronisation (using background sync) could achieve similar results, but in a less elegant way; periodically polling for new content instead of receiving a push message when new content is published. Also, it requires permission. But at least in this case, the permission dialogue should be more specific, and wouldn’t include the word “notification” anywhere.
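
For what it’s worth, the proposed API might be used something like this—but it’s still a proposal, so treat these names as illustrative rather than gospel:

// Sketch only: the periodic sync proposal is still in flux.
// From a page, register an interest in periodic updates:
navigator.serviceWorker.ready.then(registration => {
  return registration.periodicSync.register('get-latest-articles', {
    minInterval: 24 * 60 * 60 * 1000 // at most once a day
  });
});

// In the service worker, poll for fresh content when the sync fires:
self.addEventListener('periodicsync', event => {
  if (event.tag === 'get-latest-articles') {
    event.waitUntil(
      caches.open('latest-articles').then(cache => cache.add('/articles/')) // placeholder URL
    );
  }
});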

Webmentions at Indie Web Camp Berlin

I was in Berlin for most of last week, and every day was packed with activity:

By the time I got back to Brighton, my brain was full …just in time for FF Conf.

All of the events were very different, but equally enjoyable. It was also quite nice to just attend events without speaking at them.

Indie Web Camp Berlin was terrific. There was an excellent turnout, and once again, I found that the format was just right: a day of discussions (BarCamp style) followed by a day of doing (coding, designing, hacking). I got very inspired on the first day, so I was raring to go on the second.

What I like to do on the second day is try to complete two tasks; one that’s fairly straightforward, and one that’s a bit tougher. That way, when it comes time to demo at the end of the day, even if I haven’t managed to complete the tougher one, I’ll still be able to demo the simpler one.

In this case, the tougher one was also tricky to demo. It involved a lot of invisible behind-the-scenes plumbing. I was tweaking my webmention endpoint (stop sniggering—tweaking your endpoint is no laughing matter).

Up until now, I could handle straightforward webmentions, and I could handle updates (if I receive more than one webmention from the same link, I check it each time). But I needed to also handle deletions.

The spec is quite clear on this. A 404 isn’t enough to trigger a deletion—that might be a temporary state. But a status of 410 Gone indicates that a resource was once here but has since been deliberately removed. In that situation, any stored webmentions for that link should also be removed.
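
Boiled right down, the deletion-handling logic is something like this sketch (removeStoredWebmentions is a hypothetical stand-in, not my actual endpoint code):

// A simplified sketch of the deletion check.
async function checkForDeletion(source, target) {
  const response = await fetch(source);
  // A 404 might only be temporary, so do nothing in that case.
  // Only a 410 Gone is treated as a deliberate removal.
  if (response.status === 410) {
    removeStoredWebmentions(source, target); // hypothetical helper
  }
}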

Anyway, I think I got it working, but it’s tricky to test and even trickier to demo. “Not to worry”, I thought, “I’ve always got my simpler task.”

For that, I chose to add a little map to my homepage showing the last location I published something from. I’ve been geotagging all my content for years (journal entries, notes, links, articles), but not really doing anything with that data. This is a first step to doing something interesting with many years of location data.

I’ve got it working now, but the demo gods really weren’t with me at Indie Web Camp. Both of my demos failed. The webmention demo failed quite embarrassingly.

As well as handling deletions, I also wanted to handle updates where a URL that once linked to a post of mine no longer does. Just to be clear, the URL still exists—it’s not 404 or 410—but it has been updated to remove the original link back to one of my posts. I know this sounds like another very theoretical situation, but I’ve actually got an example of it on my very first webmention test post from five years ago. Believe it or not, there’s an escort agency in Nottingham that’s using webmention as a vector for spam. They post something that does link to my test post, send a webmention, and then remove the link to my test post. I almost admire their dedication.

Still, I wanted to foil this particular situation so I thought I had updated my code to handle it. Alas, when it came time to demo this, I was using someone else’s computer, and in my attempt to right-click and copy the URL of the spam link …I accidentally triggered it. In front of a room full of people. It was mildly NSFW, but more worryingly, a potential Code Of Conduct violation. I’m very sorry about that.

Apart from the humiliating demo, I thoroughly enjoyed Indie Web Camp, and I’m going to keep adjusting my webmention endpoint. There was a terrific discussion around the ethical implications of storing webmentions, led by Sebastian, based on his epic post from earlier this year.

We established early in the discussion that we weren’t going to try to solve legal questions—like GDPR “compliance”, which varies depending on which lawyer you talk to—but rather try to figure out what the right thing to do is.

Earlier that day, during the introductions, I quite happily showed webmentions in action on my site. I pointed out that my last blog post had received a response from another site, and because that response was marked up as an h-entry, I displayed it in full on my site. I thought this was all hunky-dory, but now this discussion around privacy made me question some inferences I was making:

  1. By receiving a webmention in the first place, I was inferring a willingness for the link to be made public. That’s not necessarily true, as someone pointed out: a CMS could be automatically sending webmentions, which the author might be unaware of.
  2. If the linking post is marked up in h-entry, I was inferring a willingness for the content to be republished. Again, not necessarily true.

That second inference of mine—that publishing in a particular format somehow grants permissions—actually has an interesting precedent: Google AMP. Simply by including the Google AMP script on a web page, you are implicitly giving Google permission to store a complete copy of that page and serve it from their servers instead of sending people to your site. No terms and conditions. No checkbox ticked. No “I agree” button pressed.

Just sayin’.

Anyway, when it comes to my own processing of webmentions, I’m going to take some of the suggestions from the discussion on board. There are certain signals I could be looking for in the linking post:

  • Does it include a link to a licence?
  • Is there a restrictive robots.txt file?
  • Are there meta declarations that say noindex?

Each one of these could help me infer whether or not I should publish a webmention. I quickly realised that what we’re talking about here is an algorithm.

Despite its current usage to mean “magic”, an algorithm is a recipe. It’s a series of steps that contribute to a decision point. The problem is that, in the case of silos like Facebook or Instagram, the algorithms are secret (which probably contributes to their aura of magical thinking). If I’m going to write an algorithm that handles other people’s information, I don’t want to make that mistake. Whatever steps I end up codifying in my webmention endpoint, I’ll be sure to document them publicly.
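
To make that concrete, a recipe based on those signals might look something like this. It’s entirely hypothetical—these aren’t the steps I’ve codified (yet):

// A hypothetical recipe, not my actual endpoint code.
async function webmentionDisplayPolicy(sourceUrl, sourceHtml) {
  const origin = new URL(sourceUrl).origin;

  // Signal 1: a restrictive robots.txt on the linking site.
  const robotsResponse = await fetch(origin + '/robots.txt');
  const robots = robotsResponse.ok ? await robotsResponse.text() : '';
  if (/^Disallow:\s*\/\s*$/m.test(robots)) return 'do-not-display';

  // Signal 2: a noindex meta declaration in the linking post.
  if (/<meta[^>]+noindex/i.test(sourceHtml)) return 'do-not-display';

  // Signal 3: an explicit licence link could signal permission to republish in full.
  if (/rel=["']?license["']?/i.test(sourceHtml)) return 'display-in-full';

  // Default: link to the response rather than republishing its content.
  return 'display-as-link';
}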

Service workers and videos in Safari

Alright, so I’ve already talked about some gotchas when debugging service worker issues. But what if you don’t even realise the problem has anything to do with your service worker?

This is not a hypothetical situation. I encountered this very thing myself. Gather ‘round the campfire, children…

One of the latest case studies on the Clearleft site is a nice write-up by Luke of designing a mobile app for Virgin Holidays. The case study includes a lovely video that demonstrates the log-in flow. I implemented that using a video element (with a poster image). Nice and straightforward. Super easy. All good.

But I hadn’t done my due diligence in browser testing (I guess I didn’t even think of it in this case). Hana informed me that the video wasn’t working at all in Safari. The poster image appeared just fine, but when you clicked on it, the video didn’t load.

I ducked, ducked, and went, uncovering what appeared to be the root of the problem. It seems that Safari is fussy about having servers support something called “byte-range requests”.

I had put the video in question on an Amazon S3 server. I came to the conclusion that S3 mustn’t support these kinds of headers correctly, or something.

Now I had a diagnosis. The next step was figuring out a solution. I thought I might have to move the video off of S3 and onto a server that I could configure a bit more.

Luckily, I never got ‘round to even starting that process. That’s good. Because it turns out that my diagnosis was completely wrong.

I came across a recent post by Phil Nash called Service workers: beware Safari’s range request. The title immediately grabbed my attention. Safari: yes! Video: yes! But service workers …wait a minute!

There’s a section in Phil’s post entitled “Diagnosing the problem”, in which he says:

I first thought it could have something to do with the CDN I’m using. There were some false positives regarding streaming video through a CDN that resulted in some extra research that was ultimately fruitless.

That described my situation exactly. Except Phil went further and nailed down the real cause of the problem:

Nginx was serving correct responses to Range requests. So was the CDN. The only other problem? The service worker. And this broke the video in Safari.

Doh! I hadn’t even thought about service workers!

Phil came up with a solution, and he has kindly shared his code.

I decided to go for a dumber solution:

// In the fetch event handler: returning early (without calling respondWith)
// lets the browser handle video requests itself.
if ( request.url.match(/\.(mp4)$/) ) {
  return;
}

That tells the service worker to just step out of the way when it comes to video requests. Now the video plays just fine in Safari. It’s a bit of a shame, because I’m kind of penalising all browsers for Safari’s bug, but the Clearleft site isn’t using much video at all, and in any case, it might be good not to fill up the cache with large video files.

But what’s more important than any particular solution is correctly identifying the problem. I’m quite sure I never would’ve been able to fix this issue if Phil hadn’t gone to the trouble of sharing his experience. I’m very, very grateful that he did.

That’s the bigger lesson here: if you solve a problem—even if you think it’s hardly worth mentioning—please, please share your solution. It could make all the difference for someone out there.

Service workers and browser extensions

I quite enjoy a good bug hunt. Just yesterday, myself and Cassie were doing some bugfixing together. As always, the first step was to try to reproduce the problem and then isolate it. Which reminds me…

There’ve been a few occasions when I’ve been trying to debug service worker issues. The problem is rarely in reproducing the issue—it’s isolating the cause that can be frustrating. I try changing a bit of code here, and a bit of code there, in an attempt to zero in on the problem, but with no luck. Before long, I’m tearing my hair out staring at code that appears to have nothing wrong with it.

And that’s when I remember: browser extensions.

I’m currently using Firefox as my browser, and I have extensions installed to stop tracking and surveillance (these technologies are usually referred to as “ad blockers”, but that’s a bit of a misnomer—the issue isn’t with the ads; it’s with the invasive tracking).

If you think about how a service worker does its magic, it’s as if it’s sitting in the browser, waiting to intercept any requests to a particular domain. It’s like the service worker is the first port of call for any requests the browser makes. But then you add a browser extension. The browser extension is also waiting to intercept certain network requests. Now the extension is the first port of call, and the service worker is relegated to being next in line.

This, apparently, can cause issues (presumably depending on how the browser extension has been coded). In some situations, network requests that should work just fine start to fail, executing the catch clauses of fetch statements in your service worker.
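
One way to catch this in the act is to log the failing request in that catch clause. Here’s a minimal sketch, assuming a straightforward network-then-cache fallback:

// A minimal sketch: logging the failing URL can reveal whether a blocked
// tracking script is the real culprit, rather than your own code.
self.addEventListener('fetch', event => {
  event.respondWith(
    fetch(event.request).catch(error => {
      console.warn('Fetch failed for', event.request.url, error);
      // Fall back to the cache (which may well be empty for third-party requests).
      return caches.match(event.request);
    })
  );
});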

So if you’ve been trying to debug a service worker issue, and you can’t seem to figure out what the problem might be, it’s not necessarily an issue with your code, or even an issue with the browser.

From now on when I’m troubleshooting service worker quirks, I’m going to introduce a step zero, before I even start reproducing or isolating the bug. I’m going to ask myself, “Are there any browser extensions installed?”

I realise that sounds as basic as asking “Are you sure the computer is switched on?” but there’s nothing wrong with having a checklist of basic questions to ask before moving on to the more complicated task of debugging.

I’m going to make a checklist. Then I’m going to use it …every time.