



My site has been behaving strangely recently. It was nothing that I could put my finger on—it just seemed to be acting oddly. When I checked to see if everything was okay, I was told that everything was fine, but still, I sensed that something was amiss.

I’ve just realised what it was. Last week on the 30th of September, I didn’t do or say anything special. That was the problem. I had forgotten my blog’s anniversary.

I’m so sorry, adactio.com! Honestly, I had been thinking about it for all of September but then on the day, one thing led to another, I was busy, and it just completely slipped my mind.

So this is a bit late, but anyway …happy fifteenth anniversary to this journal!

We’ve been through a lot together in those fifteen years, haven’t we, /journal? Oh, the places we’ve been and the things we’ve seen!

I remember where we were on our tenth anniversary: Bologna. Remember we were there for the first edition of the From The Front conference? Now, five years on, we’ve just been to the final edition of that same event—a bittersweet occasion.

Like I said five years ago:

It has been a very rewarding, often cathartic experience so far. I know that blogging has become somewhat passé in this age of Twitter and Facebook but I plan to keep on keeping on right here in my own little corner of the web.

I should plan something special for September 30th, 2021 …just to make sure I don’t forget.


In the latest issue of Justin’s excellent Responsive Web Design weekly newsletter, he includes a segment called “The Snippet Show”:

This is what tells all our browsers on all our devices to set the viewport to be the same width of the current device, and to also set the initial scale to 1 (not scaled at all). This essentially allows us to have responsive design consistently.

<meta name="viewport" content="width=device-width, initial-scale=1">

The viewport value for the meta element was invented by Apple when the iPhone was released. Back then, it was a safe bet that most websites were wider than the iPhone’s 320-pixel-wide display—most of them were 960 pixels wide …because reasons. So mobile Safari would automatically shrink those sites down to fit within the display. If you wanted to override that behaviour, you had to use the meta viewport gubbins that they made up.

That was nine years ago. These days, if you’re building a responsive website, you still need to include that meta element.

That seems like a shame to me. I’m not suggesting that the default behaviour should switch to assuming a fluid layout, but maybe the browser could just figure it out. After all, the CSS will already be parsed by the time the HTML is rendering. Perhaps a quick test for the presence of a horizontal scrollbar could be used to trigger the shrinking behaviour. No scrollbar, no shrinking.

Maybe someday the assumption behind the current behaviour could be flipped—assume a website is responsive unless the author explicitly requests the shrinking behaviour. I’d like to think that could happen soon, but I suspect that a depressingly large number of sites are still fixed-width (I don’t even want to know—don’t tell me).

There are other browser default behaviours that might someday change. Right now, if I type example.com into a browser, it will first attempt to contact http://example.com rather than https://example.com. That means the example.com server has to do a redirect, costing the user valuable time.
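That redirect is usually one small piece of server configuration. On Apache, for instance, it can be an entire port-80 virtual host whose only job is to bounce visitors over to HTTPS. A generic sketch:

<VirtualHost *:80>
    ServerName example.com
    # Send everyone to the HTTPS version of the same URL.
    Redirect permanent / https://example.com/
</VirtualHost>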

You can mitigate this by putting your site on the HSTS preload list but wouldn’t it be nice if browsers first checked for HTTPS instead of HTTP? I don’t think that will happen anytime soon, but someday …someday.
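For the record, qualifying for the preload list involves serving a Strict-Transport-Security header with the preload directive. On Apache with mod_headers enabled, that looks something like this (the max-age value here is one year):

Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"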

Indie Web Camp Brighton 2016

Indie Web Camp Brighton 2016 is done and dusted. It’s hard to believe that it’s already in its fifth(!) year. As with previous years, it was a lot of fun.


The first day—the discussions day—covered a lot of topics. I led a session on service workers, where we brainstormed offline and caching strategies for personal websites.

There was a design session looking at alternatives to simply presenting everything in a stream. Some great ideas came out of that. And there was a session all about bookmarking and linking. That one really got my brain whirring with ideas for the second day—the making/coding day.

I’ve learned from previous Indie Web Camps that a good strategy for the second day is to have two tasks to tackle: one that’s really easy (so you’ve at least got that to demo at the end), and one that’s more ambitious. This time, I put together a list of potential goals, and then ordered them by difficulty. By the end of the day, I managed to get a few of them done.

First off, I added a small bit of code to my bookmarking flow, so that any time I link to something, I send a ping to the Internet Archive to grab a copy of that URL. So here’s a link I bookmarked to one of Remy’s blog posts, and here it is in the Wayback Machine—see how the date of storage matches the date of my link.

The code to do that was pretty straightforward. I needed to hit the Wayback Machine’s save endpoint:

https://web.archive.org/save/{url}
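Any GET request to that endpoint does the job. From the command line, for instance (the URL being saved here is purely illustrative):

curl -s "https://web.archive.org/save/https://example.com/"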


I also updated my bookmarklet for posting links so that, if I’ve highlighted any text on the page I’m linking to, that text is automatically pasted into the description.

I tweaked my webmentions a bit so that if I receive a webmention that has a type of bookmark-of, that is displayed differently to a comment, or a like, or a share. Here’s an example of Aaron bookmarking one of my articles.

The more ambitious plan was to create an over-arching /tags area for my site. I already have tag-based navigation for my journal and my links:

But until this weekend, I didn’t have the combined view:

I didn’t get around to adding pagination. That’s something I should definitely add, because some of those pages get veeeeery long. But I did spend some time adding sparklines. They can be quite revealing, especially on topics that were hot ten years ago but have faded over time, or topics that have become more and more popular with each year.

All in all, a very productive weekend.

European tour

I’m recovering from an illness that laid me low a few weeks back. I had a nasty bout of man-flu which then led to a chest infection for added coughing action. I’m much better now, but alas, this illness meant I had to cancel my trip to Chicago for An Event Apart. I felt very bad about that. Not only was I reneging on a commitment, but I also missed out on an opportunity to revisit a beautiful city. But it was for the best. If I had gone, I would have spent nine hours in an airborne metal tube breathing recycled air, and then stayed in a hotel room with that special kind of air conditioning that hotels have that always seems to give me the sniffles.

Anyway, no point regretting a trip that didn’t happen—time to look forward to my next trip. I’m about to embark on a little mini tour of some lovely European cities:

  • Tomorrow I travel to Stockholm for Nordic.js. I’ve never been to Stockholm. In fact I’ve only set foot in Sweden on a day trip to Malmö to hang out with Emil. I’m looking forward to exploring all that Stockholm has to offer.
  • On Saturday I’ll go straight from Stockholm to Berlin for the View Source event organised by Mozilla. Looks like I’ll be staying in the east, which isn’t a part of the city I’m familiar with. Should be fun.
  • Alas, I’ll have to miss out on the final day of View Source, but with good reason. I’ll be heading from Berlin to Bologna for the excellent From The Front conference. Ah, I remember being at the very first one five years ago! I’ve made it back every second year since—I don’t need much of an excuse to go to Bologna, one of my favourite places …mostly because of the food.

The only downside to leaving town for this whirlwind tour is that there won’t be a Brighton Homebrew Website Club tomorrow. I feel bad about that—I had to cancel the one two weeks ago because I was too sick for it.

But on the plus side, when I get back, it won’t be long until Indie Web Camp Brighton on Saturday, September 24th and Sunday, September 25th. If you haven’t been to an Indie Web Camp before, you should really come along—it’s for anyone who has their own website, or wants to have their own website. If you have been to an Indie Web Camp before, you don’t need me to convince you to come along; you already know how good it is.

Sign up for Indie Web Camp Brighton here. It’s free and it’s a lot of fun.

The importance of owning your data is getting more awareness. To grow it and help people get started, we’re meeting for a bar-camp like collaboration in Brighton for two days of brainstorming, working, teaching, and helping.

Save the dates for Indie Web Camp Brighton 2016

September 24th and 25th—those are the dates you should put in your diary. That’s when this year’s Indie Web Camp Brighton is happening.

Once again it’ll be at 68 Middle Street, home to Clearleft. You can register for free now, and then add your name to the list of participants on the wiki.

If you haven’t been to an Indie Web Camp before, it’s a very straightforward proposition. The idea is that you should have your own website. That’s it. Everything else is predicated on that. So while there’ll be plenty of discussions, demos, and designs, they’re all in service to that fundamental premise.

The first day of an Indie Web Camp is like a BarCamp. We make a schedule grid at the start of the day and people organise topics by room and time slot. It sounds chaotic. It is chaotic. But it works surprisingly well. The discussions can be about technologies, or interfaces, or ideas, or just about anything really.

The second day is for making. After the discussions from the previous day, most people will have a clear idea at this point for something they might want to do. It might involve adding some new technology to their website, or making some design changes, or helping build a tool. For people starting from scratch, this is the perfect time for them to build and launch a basic website.

At the end of the second day, everyone demos what they’ve done. I’m always amazed by how much people can accomplish in just one weekend. There’s something about having other people around to help you that makes it super productive.

You might be thinking “but I’m not a coder!” Don’t worry—there’ll be plenty of coders there so you can get their help on whatever you might decide to do. If you’re a designer, your skills will be in high demand by those coders. It’s that mish-mash of people that makes it such a fun gathering.

Last year’s Indie Web Camp Brighton was lots of fun. Let’s make Indie Web Camp Brighton 2016 even better!

Indie Web Camp Brighton group photo

Sticky headers

I made a little tweak to The Session today. The navigation bar across the top is “sticky” now—it doesn’t scroll with the rest of the content.

I made sure that the stickiness only kicks in if the screen is both wide and tall enough to warrant it. Vertical media queries are your friend!
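The gist of it looks something like this; the breakpoint values and class name are illustrative rather than my actual code:

@media all and (min-width: 40em) and (min-height: 35em) {
    .site-navigation {
        /* Pin the navigation to the top of the viewport. */
        position: fixed;
        top: 0;
        left: 0;
        width: 100%;
    }
    body {
        /* Push the content down so nothing starts out
           hidden behind the now-fixed navigation. */
        padding-top: 3em;
    }
}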

But it’s not enough to just put some position: fixed CSS inside a media query. There are some knock-on effects that I needed to mitigate.

I use the space bar to paginate through long pages. It drives me nuts when sites with sticky headers don’t accommodate this. I made use of Tim Murtaugh’s sticky pagination fixer. It makes sure that page-jumping with the keyboard (using the space bar or page down) still works. I remember when I linked to this script two years ago, thinking “I bet this will come in handy one day.” Past me was right!

The other “gotcha!” with having a sticky header is making sure that in-page anchors still work. Nicolas Gallagher covers the options for this in a post called Jump links and viewport positioning. Here’s the CSS I ended up using:

:target:before {
    /* Insert an invisible block above the target element
       so that it isn't hidden behind the sticky header. */
    content: '';
    display: block;
    height: 3em;
    margin: -3em 0 0;
}

I also needed to check any of my existing JavaScript to see if I was using scrollTo anywhere, and adjust the calculations to account for the newly-sticky header.
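The adjustment boils down to subtracting the height of the header from wherever you were going to scroll to. A sketch, using the same illustrative class name as above:

var header = document.querySelector('.site-navigation');
var headerHeight = header ? header.offsetHeight : 0;

function scrollToElement(element) {
    // Scroll so the element lands just below the sticky header,
    // rather than disappearing underneath it.
    var top = element.getBoundingClientRect().top + window.pageYOffset;
    window.scrollTo(0, top - headerHeight);
}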

Anyway, just a few things to consider if you’re going to make a navigational element “sticky”:

  1. Use min-height in your media query,
  2. Take care of keyboard-initiated page scrolling,
  3. Adjust the positioning of in-page links.

A little progress

I’ve got a fairly simple posting interface for my notes. A small textarea, an optional file upload, some checkboxes for syndicating to Twitter and Flickr, and a submit button.

Notes posting interface

It works fine although sometimes the experience of uploading a file isn’t great, especially if I’m on a slow connection out and about. I’ve been meaning to add some kind of Ajax-y progress type thingy for the file upload, but never quite got around to it. To be honest, I thought it would be a pain.

But then, in his excellent State Of The Gap hit parade of web technologies, Remy included a simple file upload demo. Turns out that all the goodies that have been added to XMLHttpRequest have made this kind of thing pretty easy (and I’m guessing it’ll be easier still once we have fetch).

I’ve made a little script that adds a progress bar to any forms that are POSTing data.

Feel free to use it, adapt it, and improve it. It isn’t using any ES6iness so there are some obvious candidates for improvement there.
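The core of the technique is small enough to sketch here. This isn’t the script itself, just the general pattern of intercepting the submission, sending it with XMLHttpRequest, and listening for progress events on the upload:

// Grab every form that POSTs and enhance it.
var forms = document.querySelectorAll('form[method="post"]');
Array.prototype.forEach.call(forms, function (form) {
    form.addEventListener('submit', function (event) {
        event.preventDefault();
        // Show a progress element while the form data uploads.
        var progress = document.createElement('progress');
        form.appendChild(progress);
        var xhr = new XMLHttpRequest();
        xhr.open('POST', form.action);
        xhr.upload.addEventListener('progress', function (e) {
            if (e.lengthComputable) {
                progress.max = e.total;
                progress.value = e.loaded;
            }
        });
        xhr.addEventListener('load', function () {
            // Carry on to wherever the server would normally send us.
            window.location.href = xhr.responseURL || form.action;
        });
        xhr.send(new FormData(form));
    });
});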

It’s working a treat on my little posting interface. Now I can stare at a slowly-growing progress bar when I’m out and about on a slow connection.

Owning my words

When I wrote a few words about progressive enhancement recently, I linked to Karolina’s great article The Web Isn’t Uniform. I was a little reluctant to link to it, not because of the content—which is great—but because of its location on Ev’s blog. I much prefer to link directly to people’s own websites (I have a hunch that those resources tend to last longer too) but I understand that Medium offers a nice low barrier to publishing.

That low barrier comes at a price. It means you have to put up with anyone and everyone weighing in with their own hot takes. The way the site works is that anyone who writes a comment on your article is effectively writing their own article—you don’t get to have any editorial control over what kind of stuff appears together with your words. There is very little in the way of community management once a piece is published.

Karolina’s piece attracted some particularly unsavoury snark—tech bros disagreeing in their brash bullying way. I linked to a few comments, leaving out the worst of the snark, but I couldn’t resist editorialising:

Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.

I knew even when I was writing it that it was unproductive, itself a snarky remark. Two wrongs don’t make a right. But I wanted to acknowledge that not only was bad behaviour happening, but that I was seeing it, and I wasn’t ignoring it. I guess it was mostly intended for Karolina—I wanted to extend some kind of acknowledgment that the cumulative weight of those sneering drive-by reckons is a burden that no one should have to put up with.

I knew that when I wrote about Medium being “where the opinions of self-entitled dudes flow like rain from the tech heavens” that I would (rightly) get pushback, and sure enough, I did …on Medium. Not on Twitter or anywhere else, just Medium.

I syndicate my posts to Ev’s blog, so the free-for-all approach to commenting doesn’t bother me that much. The canonical URL for my words remains on my site under my control. But for people posting directly to Medium and then having to put up with other people casually shitting all over their words, it must feel quite disempowering.

I have a similar feeling with Twitter. I syndicate my notes there and if the service disappeared tomorrow, I wouldn’t shed any tears. There’s something very comforting in knowing that any snarky nasty responses to my words are only being thrown at copies. I know a lot of my friends are disheartened about the way that Twitter has changed in recent years. I wish I could articulate how much better it feels to only use Twitter (or Medium or Facebook) as a syndication tool, like RSS.

There is an equal and opposite reaction too. I think it’s easier to fling off some thoughtless remarks when you’re doing it on someone else’s site. I bet you that the discourse on Ev’s blog would be of a much higher quality if you could only respond from your own site. I find I’m more careful with my words when I publish here on adactio.com. I’m taking ownership of what I say.

And when I do lapse and write snarky words like “Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.”, at least I’m owning my own snark. Still, I will endeavour to keep my snark levels down …but that doesn’t mean I’m going to turn a blind eye to bad behaviour.

A web for everyone

I gave the closing talk at the Render conference in Oxford a few weeks back. It was a very smoothly-run event, the spiritual successor to jQuery UK.

In amongst the mix of talks there were a few emerging themes. Animation was covered from a few different angles by Val and Sara. Bruce, Jake, Ola, and I talked about Service Workers and offline functionality. But there were also some differences of opinion.

In her great talk—I’m Offline, Cool! Now What?—Ola outlined the many and varied offline use cases that drove the creation and philosophy of Hoodie. She described all the reasons why people need the web; for communication, for access to information, for empowerment, and for love. “Hell, yes!” I thought.

But then she said:

So since when is helping people to fulfil a basic need, progressive enhancement?

And even more forcefully:

This is why I think, putting offline first in the progressive enhancement slot is pure bullshit.

Strong words indeed! And I have to say I was a little puzzled by them.

Ola had demonstrated again and again just how fragile the network could be. That is absolutely correct. All too often, we make the assumption that people using our sites have a decent network connection. That’s not a safe assumption to make.

But the suggested solution—to rely on technologies like local storage, Service Workers, or other APIs—assumes a certain level of JavaScript capabilities in the devices and browsers out there. That’s an unsafe assumption to make.

I remember discussing this with Alex from Hoodie a while back. I was confused by the cognitive dissonance I was observing. It seems to me that, laudable as Hoodie’s offline-first goals are, they are swapping out one unstable dependency—the network—for a different unstable dependency—a set of JavaScript APIs.

(I remember Alex pointed out that Hoodie was intended primarily for web apps rather than web sites, and my response—predictably enough—was to say “Define web app”.)

I think I understand why Ola reacted so strongly to the suggestion that offline functionality should be added as an enhancement. I’ve seen the same reaction when I’ve said that beautiful typography on the web is an enhancement. I think that when I say something is an enhancement, what people hear is that something is just an enhancement. It sounds belittling. That’s not my intention, but I can understand how it could come across like that. Perhaps this is one reason why some people have a real issue with the term “progressive enhancement”.

I wish we could make offline functionality a requirement. But the reality is that not everyone is using a browser that supports the necessary technology. I wish we could make beautiful typography a requirement. But, again, the reality is that there will always be some browsers or devices that won’t be capable of executing that typography. Accepting these facets of reality might seem like admissions of defeat, but I actually find it quite liberating.

In her brilliant talk at Render, Ashley G. Williams channeled Carl Sagan, quoting from his book The Demon-Haunted World:

It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring.

That’s how I feel we should approach building for the web. Let’s accept that network connections are unevenly distributed. Let’s also accept that browser features are unevenly distributed. Pretending that millions of Opera Mini users don’t exist isn’t a viable strategy. They too are people who want to communicate, to access information, to be empowered, and to love.

Pointing out that you can’t always rely on client-side JavaScript shouldn’t be taken as an admonishment. It’s an opportunity.

Karolina Szczur wrote a wonderful piece on Ev’s blog called The Web Isn’t Uniform. She noticed how many sites—Facebook, AirBnB, Basecamp—fail to render even some useful information if the JavaScript fails to load. It’s a situation that many of us—with our fast connections, capable browsers, and modern devices—might never even notice.

It’s a privilege to be able to use bleeding edge technologies and devices, but let’s not forget basic accessibility and progressive enhancement. Ultimately, we’re building for the users, not for our own tastes or preferences.

Karolina asks that we, as makers of the web, have a little more empathy. If the comments on her article are anything to go by, that’s a tall order. All the usual tropes are rolled out—there’s the misunderstanding that progressive enhancement means making sure everything works without JavaScript (it doesn’t; it’s about the core functionality), and the evergreen argument that as soon as you’re building a web “app”, best practices, good engineering, and empathy can go out the window…

I strongly disagree that this has anything to do at all about empathy. Instead, it’s all about resources and priorities. Making a JS app is already hard enough, duplicating all that work so that it also works without JS is quite often just not practical.—Sacha Greif

But requiring that a site be functional when JavaScript is disabled, may not be a valid requirement anymore. HTML and CSS were originally created and designed for documents, not applications. Many websites these days should be considered apps rather then docs.—Dan Shappir

What you’re suggesting is that all these companies should write all their software twice, once in javascript and again in good ol’ html with forms, to cater to that point-whatever-percentage that has decided to break their own web browser by turning one of the three fundamental web technologies off. In what universe is this a reasonable request?—Erlend Halvorsen

JavaScript is as important as HTML. This is modern internet. If someone doesn’t have JavaScript, they should not be using the new applications that were possible because of JavaScript.—HarshaL

I am a web developer. I build web applications not web sites. What you say may be true for web sites with static pages displaying images and text.—R. Fancsiki

Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.

While they were so busy defending the lack of basic functionality in all the examples that Karolina listed, they failed to notice the most important development:

Let’s build a web that works for everyone. That doesn’t mean everyone has to have the same experience. Let’s accept that there are all sorts of people out there accessing the web with all sorts of browsers on all sorts of devices.

What a fantastic opportunity!

Indie Web Camp Düsseldorf

Indie Web Camp Düsseldorf took place last weekend and it was—no surprise—really excellent.

It felt really good to have one in Germany again so soon after the last one in Nuremberg. Lots of familiar faces showed up as well as plenty of newcomers.

I’m blown away by how much gets done in two short days, especially from people who start the weekend without a personal website and end it with something to call their own. Like Julie’s new site for example (and once again she took loads of great photos).

My own bit of hacking was quite different to what I got up to in Nuremberg. At that event, I was concentrating on the interface, adding sparklines and a bio to my home page. This time round I concentrated more on the plumbing. I finally updated some of the code that handles webmentions. I first got it working a few years back at an Indie Web Camp here in Brighton, but I hadn’t really updated the code in a while. I’m much happier with the way it’s working now.

I also updated the way I’m syndicating my notes to Twitter, specifically how I send photos. Previously I was using the API method /statuses/update_with_media.

When I was at the Mobile @Scale event at Facebook’s London office a while back, Henna Kermani gave a talk about the new way that Twitter handles file uploads. There’s a whole new part of the API for handling that. When she got off stage, I mentioned to her that I was still using the old API method and asked how long it would be until it was switched off. She looked at me incredulously and said “It’s still working‽ I thought it had been turned off already!”

That’s why I spent most of my time at Indie Web Camp Düsseldorf updating my PHP. Switching over to the TwitterOAuth library made it a bit less painful—thanks to Bea for helping me out there.

When it came time to demo, I didn’t have much to show. On the surface, my site looked no different. But I feel pretty good about finally getting around to changing the wiring under the hood.

Besides, there were plenty of other great demos. There was even some more sparklining. Check out this fantastic visualisation of the Indie Web Camp IRC logs made by Kevin …who wasn’t even in Düsseldorf; he participated remotely.

If you get the chance to attend an Indie Web Camp I highly, highly recommend it. In the meantime you can start working on your personal site. Here’s a quick primer I wrote a while back on indie web building blocks. Have fun!

Machine supplying

I wrote a little something recently about some inspiring projects that people are working on. Like Matt’s Machine Supply project. There’s a physical side to that project—a tweeting book-vending machine in London—but there’s also the newsletter, 3 Books Weekly.

I was honoured to be asked by Matt to contribute three book recommendations. That newsletter went out last week. Here’s what I said…

The Victorian Internet by Tom Standage

A book about the history of telegraphy might not sound like the most riveting read, but The Victorian Internet is both fascinating and entertaining. Techno-utopianism, moral panic, entirely new ways of working, and a world that has been utterly transformed: the parallels between the telegraph and the internet are laid bare. In fact, this book made me realise that while the internet has been a great accelerator, the telegraph was one of the few instances where a technology could truly be described as “disruptive.”

Ancillary Justice by Ann Leckie

After I finished reading the final Iain M. Banks novel I was craving more galaxy-spanning space opera. The premise of Ancillary Justice with its description of “ship minds” led me to believe that this could be picking up the baton from the Culture series. It isn’t. This is an entirely different civilisation, one where song-collecting and tea ceremonies have as much value as weapons and spacecraft. Ancillary Justice probes at the deepest questions of identity, both cultural and personal. As well as being beautifully written, it’s also a rollicking good revenge thriller.

The City & The City by China Miéville

China Miéville’s books are hit-and-miss for me, but this one is a direct hit. The central premise of this noir-ish tale defies easy description, so I won’t even try. In fact, one of the great pleasures of this book is to feel the way your mind is subtly contorted by the author to accept a conceit that should be completely unacceptable. Usually when a book is described as “mind-altering” it’s a way of saying it has drug-like properties, but The City & The City is mind-altering in an entirely different and wholly unique way. If Borges and Calvino teamed up to film The Maltese Falcon, the result would be something like this.

When I sent off my recommendations, I told Matt:

Oh man, it was so hard to narrow this down! So many books I wanted to mention: Station 11, The Peripheral, The Gone-Away World, Glasshouse, Foucault’s Pendulum, Oryx and Crake, The Wind-up Girl …this was so much tougher than I thought it was going to be.

And Matt said:

Tell you what — if you’d be up for writing recommendations for another 3 books, from those ones you mentioned, I’d love to feature those in the machine!


Station Eleven by Emily St. John Mandel

Station Eleven made me think about the purpose of art and culture. If art, as Brian Eno describes it, is “everything that you don’t have to do”, what happens to art when the civilisational chips are down? There are plenty of post-pandemic stories of societal collapse. But there’s something about this one that sets it apart. It doesn’t assume that humanity will inevitably revert to an existence that is nasty, brutish and short. It’s also a beautifully-written book. The opening chapter completely sucker-punched me.

Glasshouse by Charles Stross

On the face of it, this appears to be another post-Singularity romp in a post-scarcity society. It is, but it’s also a damning critique of gamification. Imagine the Stanford prison experiment if it were run by godlike experimenters. Stross’s Accelerando remains the definitive description of an unfolding Singularity, but Glasshouse is the one that has stayed with me.

The Gone-Away World by Nick Harkaway

This isn’t an easy book to describe, but it’s a very easy book to enjoy. A delightful tale of a terrifying apocalypse, The Gone-Away World has plenty of laughs to balance out the existential dread. Try not to fall in love with the charming childhood world of the narrator—you know it can’t last. But we’ll always have mimes and ninjas.

I must admit, it’s a really lovely feeling to get notified on Twitter when someone buys one of the recommended books.


I was in Nuremberg last weekend for Indie Web Camp. It was great.

At some point I really should stop being surprised by just how much gets done in one weekend, but once again, I was blown away by the results.

On the first day we had very productive BarCamp-like discussion sessions, and on the second day it was heads-down hacking. But it was hacking with help. Being in the same room as other people who each have their own areas of expertise is so useful. It really turbo-charges the amount that you can get accomplished.

For example, I was helping Tom turn his website into a progressive web app with the addition of a service worker and a manifest file. Meanwhile Tom was helping somebody else get a WordPress site up and running.
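The scaffolding for that is pleasingly small. A minimal sketch (the file names are whatever you choose to call them):

<link rel="manifest" href="/manifest.json">
<script>
// Register a service worker if the browser supports it;
// everyone else simply gets the regular website.
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/serviceworker.js');
}
</script>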

Actually, that was what really blew me away: two people began the second day of Indie Web Camp Nuremberg without websites and by the end of the day, they both had their own sites up and running. For me, that’s the real spirit of the indie web—I know we tend to go on about the technologies like h-card, h-entry, webmentions, micropub, and IndieAuth, but really it’s not about the technologies; it’s about having your own place on the web so that you have control over what you put out in the world.

For my part, I was mostly making some cosmetic changes to my site. There was a really good discussion on the first day about home pages. What’s the purpose of a home page? For some, it’s about conveying information about the person. For others, it’s a stream of activity.

My site used to have a splash-like homepage; just a brief bio and a link to the latest blog post. Then I changed it into a stream a few years ago. But that means that the home page of my site doesn’t feel that different from sections of the site like the journal or the link list.

During the discussion at Indie Web Camp, we started looking at how silos design their profile pages to see what we could learn from them. Looking at my Twitter profile, my Instagram profile, my Untappd profile, or just about any other profile, it’s a mixture of bio and stream, with the addition of stats showing activity on the site—signs of life.

I decided I’d add signs of life to my home page. Once again, I reached for my favourite little data visualisation helper: sparklines.

A sparkline is a small intense, simple, word-sized graphic with typographic resolution.

I’ve already got sparklines on Huffduffer and on The Session so I suppose it was only a matter of time before they showed up here.

Small Screen Sparklines Large Screen Sparklines

I’ve been tweaking them ever since I got back from Germany. Now I’ve added in a little h-card bio as well.

Bio and sparklines Bio And Sparklines (large screen)

Initially I was using the fantastic little scripted SVG that Stuart made, the same one that I’m using on Huffduffer and The Session. But Kevin pointed out that a straightforward polyline would be more succinct. And in the case of my own site, there are only four sparklines so it wouldn’t be a huge overhead to hard-code the values straight into the SVGs.
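A hard-coded sparkline is just an SVG with a single polyline inside. Something like this, with made-up data points:

<svg viewBox="0 0 100 20" xmlns="http://www.w3.org/2000/svg">
    <!-- Each x,y pair is one data point; the y values are
         inverted because SVG coordinates start at the top. -->
    <polyline fill="none" stroke="currentColor" points="0,18 20,14 40,15 60,9 80,11 100,4"/>
</svg>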

Yesterday was the first day of Render Conference in Oxford (I’ll be speaking later today). Sara gave a blisteringly great talk on (what else?) SVGs and I got so inspired I started refactoring my code right there and then. I’m pretty happy with how the sparklines are working now, although I’m sure I’ll continue to play around with them some more.

There’s another activity visualisation that I’m eager to play around with. I really like the calendar heatmap on my GitHub profile. I could imagine using something like that for an archive view on my own site.

Luckily for me, I’ll have a chance to play around with my website a bit more very soon. There’s going to be another Indie Web Camp in Germany very soon.

Indie Web Camp Düsseldorf will take place on May 7th and 8th, right before Beyond Tellerrand. Last year’s event was really inspiring. If there’s any chance you can make it, you should come along. You won’t regret it.

Mistakes on a plane

I’m in Seattle. An Event Apart just wrapped up here and it was excellent as always. The venue was great and the audience even greater so I was able to thoroughly enjoy myself when it was time for me to give my talk.

I’m going to hang out here for another few days before it’s time for the long flight back to the UK. The flight over was a four-film affair—that’s how I measure the duration of airplane journeys. I watched:

  1. Steve Jobs,
  2. The Big Short,
  3. Spectre, and
  4. Joy.

I was very glad that I watched Joy after three back-to-back Bechdel failures. Spectre in particular seems to have been written by a teenage boy, and I couldn’t get past the way that The Big Short used women as narrative props.

I did enjoy Steve Jobs. No surprise there—I enjoy most of Danny Boyle’s films. But there was a moment that took me out of the narrative flow…

The middle portion of the film centres around the launch of the NeXT cube. In one scene, Michael Fassbender’s Jobs refers to another character as “Rain Man”. I immediately started to wonder if that was an anachronistic comment. “When was Rain Man released?” I thought to myself.

It turns out that Rain Man was released in 1988 and the NeXT introduction was also in 1988 but according to IMDB, Rain Man was released in December …and the NeXT introduction was in October.

The jig is up, Sorkin!

The voice of MOL

The latest issue of Spaceflight—the magazine of the British Interplanetary Society—dropped through my door, adding to my weekend reading list. This issue contains a “whatever happened to” article about the military personnel who were supposed to crew the never-realised MOL project.

Before Salyut, Skylab, Mir, or the ISS, the Manned Orbital Laboratory was the first proposed space station. It would use a Gemini capsule and a Titan propellant tank.

Manned Orbital Laboratory

But this wasn’t to be a scientific endeavour. The plan was to use the MOL as a crewed spy satellite—human eyes in the sky watching the enemy below.

The MOL was cancelled (because uncrewed satellites were getting better at that sort of thing), so that particular orbital panopticon never came to pass.

I remember that when I first heard of the MOL and looked it up on Wikipedia, this little nugget of information stood out to me:

The MOL was planned to use a helium-oxygen atmosphere.

That’s right: instead of air (21% oxygen, 79% nitrogen), the spies in the sky would be breathing heliox (21% oxygen, 79% helium). Considering the effect that helium has on the human voice, I can only imagine that the grave nature of the mission would have been somewhat compromised.

Independently published

Jessica writes about The Heroine’s Journey.

Remy explains Why I love working with the web.

Ludwig dreams of designers and developers working Together.

Charlotte documents her technique Teaching the order of margins in CSS.

Craig field-tests The Leica Q.

Robin thinks about The New Web Typography.

Michael dives deep into A Complete History of the Millennium Falcon.

What do they all have in common? Nothing …other than the fact that each person chose to write on their own website. I’m grateful for that. These are all wonderful pieces of writing—they deserve a long life.

Homebrew header hardening

I’m at Homebrew Website Club. I figured I’d use this time to document some tweaking I’ve been doing to the back end of my website.

securityheaders.io is a handy site for testing whether your website’s server is sending sensible headers. Think of it as being like SSL Test, but for a few other nitty-gritty details.

adactio.com was initially scoring very low, but the accompanying guide to hardening your HTTP headers meant I was able to increase my ranking to an acceptable level.

My site is running on an Apache server on an Ubuntu virtual machine on Digital Ocean. If you’ve got a similar set-up, this might be useful…

I ssh’d into my server and went to this folder in the Apache directory:

cd /etc/apache2/sites-available

There’s a file called default-ssl.conf that I need to edit (my site is being served up over HTTPS; if your site isn’t, you should edit 000-default.conf instead). I type:

nano default-ssl.conf

Depending on your permissions, you might need to type:

sudo nano default-ssl.conf

Now I’m inside nano. It’s like any other text editor you might be used to using, if you imagined what it would be like to remove all the useful features from it.

Within the <Directory /var/www/> block, I add a few new lines:

<IfModule mod_headers.c>
  Header always set X-Xss-Protection "1; mode=block"
  Header always set X-Frame-Options "SAMEORIGIN"
  Header always set X-Content-Type-Options "nosniff"
</IfModule>

Those are all no-brainers:

  • Enable protection against cross-site scripting.
  • Don’t allow your site to be put inside a frame.
  • Don’t allow anyone to change the content-type headers of your files after they’ve been sent from the server.

If you’re serving your site over HTTPS, and you’re confident that you don’t have any mixed content (a mixture of HTTPS and HTTP), you can add this line as well:

Header always set Content-Security-Policy "default-src https: data: 'unsafe-inline' 'unsafe-eval'"

To really up your paranoia (and let’s face it, that’s what security is all about; justified paranoia), you can throw this in too:

Header unset Server
Header unset X-Powered-By

That means that your server will no longer broadcast its intimate details. Of course, I’ve completely reversed that benefit by revealing to you in this blog post that my site is running on Apache on Ubuntu.

I’ll tell you something else too: it’s powered by PHP. There’s some editing I did there too. But before I get to that, let’s just finish up that .conf file…

Hit ctrl and o, then press enter. That writes out the file you’ve edited. Now you can leave nano: press ctrl and x.

You’ll need to restart Apache for those changes to take effect. Type:

service apache2 restart

Or, if permission is denied:

sudo service apache2 restart

Now, about that PHP thing. Head over to a different directory:

cd /etc/php5/fpm

Time to edit the php.ini file. Type:

nano php.ini

Or, if you need more permissions:

sudo nano php.ini

It’s a long file, but you’re really only interested in one line. A shortcut to finding that line is to hit ctrl and w (for “where is?”), type expose, and hit enter. That will take you to the right paragraph. If you see a line that says:

expose_php = On

Change it to:

expose_php = Off

Save the file (ctrl and o, enter) then exit nano (ctrl and x).

Restart Apache:

service apache2 restart

Again, you might need to preface that with sudo.

Alright, head on back to securityheaders.io and see how your site is doing now. You should be seeing a much better score.

There’s one more thing I should be doing that’s preventing me from getting a perfect score. That’s Public Key Pinning. It sounds a bit too scary for a mere mortal like me to attempt. Or rather, the consequences of getting it wrong (which I probably would), sound too scary.

The Force Awakens

You can listen to an audio version of The Force Awakens.

I’d like to talk about The Force Awakens (I mean, really, how can I not?) so there will be inevitable spoilers. Bail now if you haven’t seen the film.

Star Wars was a big part of my childhood. By extension—and because I’ve never really grown up—Star Wars has always been part of my identity, at least in the shallow sense of what I’d list under “hobbies and interests” on a theoretical form. Still, I could relate to Michael’s feelings in the run-up to the new film’s release:

Despite much evidence to the contrary, I don’t hang too many of my wants and needs on Star Wars or its continuing life as a franchise. I’m the fan-equivalent of a deep history archeologist, not a pundit or an evangelist.

While I’ve always been a big fan of Star Wars: The Films, I’ve never cared much about Star Wars: The Franchise. When my local pub quiz for nerds—The Geekest Link—has a Star Wars night, I enter with a prayer of “please no ‘Expanded Universe’, please no ‘Expanded Universe’.”

When I heard that Lucasfilm had been sold to Disney, I was intrigued—this could get interesting! When I heard that J.J. Abrams would be directing Episode VII, I was pretty happy—I like his work, and he’s a safe pair of hands. But I didn’t want to get too excited. Partly that’s because I’ve been burnt before—although I’m something of a prequels apologist in comparison to the hatred they inspired in most people. Mostly though, it’s because I’m aware that when it comes to something that doesn’t yet exist—whether it’s a Star Wars film, a forthcoming album, or an upcoming project at work—the more hope you place on its shoulders, the more unlikely it is to be able to fulfil those over-inflated expectations.

But as The Force Awakens drew closer and closer, despite my best intentions, I couldn’t help but get excited. Jessica and I watched and re-watched the trailers. The day that tickets went on sale, the website for my cinema of choice crashed, so I picked up the phone and waited in a queue to secure seats for the minute-past-midnight first showing (if you know how much I dislike telephonic communication, you’ll appreciate how unusual that action was for me).

I began to literally count down the days. In the final week, Jessica and I re-watched the Star Wars films in Machete Order, which I can highly recommend. That culminated on the evening of December 16th with a gathering ‘round at Andy’s to eat some food, watch Return Of The Jedi, and then head to the cinema before midnight. By the time I was sitting in my seat surrounded by equally enthusiastic fans, I was positively aquiver with excitement.

When the fanfare blasted and the Star Wars logo appeared, I was grinning from ear to ear. Then I experienced something really wonderful: I had no idea what was going to happen next. Going into this film with no knowledge of plot details or twists was the best possible way to experience it.

I didn’t know what the words of the opening crawl would be. I didn’t know who any of the characters were. I didn’t know what anybody was going to say. I know that sounds like a weird thing to fixate on—after all, didn’t we get that with the prequel films too? Well, not really. Because they were all backstory, there were clearly-delineated constraints on what could and couldn’t happen in those films. But with these new films, anything is possible.

I really, really, really enjoyed watching The Force Awakens. But in order to truly evaluate the film on its own merits, I knew I’d have to see it again in more normal circumstances (and who am I kidding? I didn’t need much of an excuse to see it again).

I’ve seen it three times now. I loved it every time. If anything, the things that slightly bothered me on first seeing the film have diminished with subsequent viewings. It stands up to repeat watching, something that isn’t necessarily true of other J.J. Abrams films—I enjoyed Star Trek Into Darkness when I first saw it, but with every time I see it again, it grows a little weaker.

As I said, there were things that slightly bothered me and I’ll get to those, but my overwhelming feelings about this film are very, very positive. I think the world-building is really good. I think the film itself is superbly crafted, as described in this excellent point-by-point analysis by Chris Dickinson. But above all, what I love the most about The Force Awakens are the characters.

Rey. What can I say? She is quite simply a wonderfully-written character brought to life by an astonishingly good performance. And of course I’m going to join in the chorus of people who are glad that we finally get a lead role for a woman in this galaxy. Granted, Star Wars: The Force Awakens isn’t exactly Mad Max: Fury Road, but still, how great is it that 2015 has given us both Rey and Furiosa?

(You know what it is? It’s a good start.)

Likewise with Finn: great character; great performance. Throw in Kylo Ren, Poe Dameron and even BB8 …I’m sold. I’m invested in their stories now. I want to know what happens next. I want to spend time with them.

But The Force Awakens wasn’t just about new heroes and villains. As audacious as it would be to start from an entirely clean slate, it also needed to tie in to the beloved original films. On the whole, I think this film did a good job of balancing the past and the future.

Paul came along to that midnight viewing; a ticket became available at the last minute. But he was prepared not to enjoy it, or even understand it, given that he’s never really watched Star Wars.

“Actually”, I said, “I’d be really interested to find out what you think of it.”

I’m too close to the source material; I can’t objectively judge whether the new film could stand on its own, as opposed to being the latest episode in an existing saga.

As it turned out, Paul really enjoyed it. Sure, there was stuff he was aware he was missing out on, but interestingly, there was even more stuff that we were all missing out on: the script is filled with references to events that happened in the intervening decades between the old films and the new. I liked that a lot. It helped solidify this as being simultaneously a brand new chapter and also just one sliver of a larger ongoing narrative.

The Force Awakens is very much a bridging piece between the old and the new. The torch was passed on with dignity, and surprisingly, it was Harrison Ford’s Han Solo that made it a convincing handover.

I say “surprisingly” because remember, we had just watched Return Of The Jedi before The Force Awakens and it is so clear that Harrison Ford really didn’t want to be in that film. I know Han Solo is supposed to be somewhat sarcastic, but it was dialled up to 11 for Jedi, and I’m pretty sure it was a very, shall we say, “naturalistic” performance. But here he is over thirty years later, really breathing life into the character.

Through the stewardship of Harrison Ford, we were lovingly taken from the original films that we know so well into a new story. Han Solo picked up the audience like it was a child that had fallen asleep in the car, and he gently tucked us into our familiar childhood room where we can continue to dream. And then, with a tender brush of his hand across the cheek, he left us.

In many ways, Han Solo in The Force Awakens is Ben Kenobi in Star Wars …but with a much more fleshed-out history and a more interesting personal journey. Now he’s the one saying that the Force is real (and he does it in the very spot where he originally ridiculed Kenobi). It’s as if Scully were to slowly come around to Mulder’s worldview and finally intone “I want to believe.”

The biggest gripe that other people have with The Force Awakens is how much the plot resembles that of the original Star Wars. It’s undeniable. The question is how much that matters, and as a result, how much it bothers you. It really bothered Khoi. It somewhat bothered Andy. It didn’t bother me much, but it was definitely an aspect that prevented the film from being a complete triumph. But it’s also one of those issues that diminishes with repeated viewing.

Those bothered by the echoes between Star Wars and The Force Awakens are going to be really pissed off when they find out about World War One and World War Two. “Britain and America fight Germany again? Really!?” (Probably best not to even mention any of the Gulf wars).

I get the feeling though that the people who are bothered by the plot are perhaps overplaying the similarities and underplaying the differences.

So yes, in one sense Rey in The Force Awakens is like Luke in Star Wars—a young person on a desert planet far from the action. But then there are the differences: where Luke was whining about his situation, Rey is mastering hers. And of course there’s the fact that he in 1977 is now she in 2015. “That doesn’t make any difference!” you may cry, and you’d be exactly right: it shouldn’t make any difference …so why has it taken us four decades to get to this?

The casting of Rey and Finn is simultaneously unimportant and monumental. It’s unimportant in that it makes no difference to the story whether Rey is a woman or Finn is black. It’s monumental in that they are the main characters in what everyone knew would be the biggest film of the century so far.

One of the other complaints that people have with The Force Awakens is the unclear political background. Here’s Michael again:

The rebels killed the Emperor and won, but now they’re ‘the resistance’? Why? They’re backed by the republic, so why aren’t they just the armed forces of the republic? The First Order strikes against the republic (looked like Coruscant, but apparently wasn’t). How big is the First Order? Big enough to build Starkiller Base, but what does that mean? Do they control systems? Do they have support inside the republic? Is this like a separatists thing? How long have they been around? How are they funded?

This certainly bugged me. It was the kind of issue that could have been fixed with one explanatory scene. Sure enough, it turns out that such a scene was shot but then cut from the film. Mostly that was to keep the film’s running time down, but I suspect that after the dull talkiness of the prequels, there may also have been some overcompensating course-correction away from anything with even a whiff of politics. Alas, that phobia of trade routes and senators resulted in an unclear backstory. It wasn’t until my third viewing that I realised that Hux’s speech is the closest thing to a blackboard scene for the galactic geopolitics: there’s a proxy war between wannabe extremists looking to set up a caliphate (think ISIS) and a resistance (think the Kurds) being funded by the dominant power (think America) …up until The First Order carry out a 9-11/Pearl Harbour/Vulcan scale attack, leaving the balance of power wide open—the next film could take it in any direction.

One of the most impressive achievements of The Force Awakens is that after seeing it, I didn’t want to think about how it tied back to the original films, as I expected I would want to do. Instead, I was entirely preoccupied with questions of what’s going to happen next.

Everyone is talking about Rey. Where is she from? What is her parentage? The most popular theories are currently:

  1. She is Luke Skywalker’s daughter.
  2. She is Han and Leia’s daughter, the secret sister of Kylo Ren.
  3. She is Ben Kenobi’s granddaughter.

Personally, I’d like it if her parentage were unremarkable. Maybe it’s the socialist in me, but I’ve never liked the idea that the Force is based on eugenics; a genetic form of inherited wealth for the lucky 1%. I prefer to think of the Force as something that could potentially be unlocked by anyone who tries hard enough.

But there are too many hints at Rey’s origins for her parentage to go unexplained. All the signs point to her having some kind of connection to existing bloodlines. Unless…

Lawrence Kasdan has been dropping hints about how odd Episode VIII is going to be, mostly because it has Rian Johnson at the helm. He gave us the terrific Looper. One of the most unsettling aspects of that film was the presence of a child with buried potential for destruction through telekinetic powers. For everyone’s safety, the child is kept far from civilisation.

Okay, I know it’s a stretch but what if Rey is on Jakku for similar reasons? Her parents aren’t Skywalkers or Kenobis, they’re just scared by the destructive episodes they’ve experienced with their Force-sensitive infant. With enormous reluctance—but for the greater good—they deposit her on a faraway world.


Okay, well, if you don’t like that theory, you’re going to hate this one:

What if Rey is the daughter of Luke and Leia?

Eww! I know, I know. But, hey, you can’t say the signs weren’t there all along. And the shame of an incestuous union could be the reason for the child’s secret exile.

It’s preposterous of course. Even in a post-Game Of Thrones landscape, that would be going too far, even for Rian Johnson …or would it?

Now I’ve planted the idea in your head. Sorry about that.

Still, how great is it that we’re all talking about what’s going to happen next?

Some people have asked me where I think The Force Awakens ranks in comparison to the other Star Wars films, and I wasn’t prepared for the question. I honestly haven’t been thinking about it in the context of the original films. Instead I’ve been thinking about the new characters and the new storyline. As Maz Kanata would say:

The belonging you seek is not behind you, it is ahead.

Where to start?

A lot of the talks at this year’s Chrome Dev Summit were about progressive web apps. This makes me happy. But I think the focus is perhaps a bit too much on the “app” part and not enough on “progressive”.

What I mean is that there’s an inevitable tendency to focus on technologies—Service Workers, HTTPS, manifest files—and not so much on the approach. That’s understandable. The technologies are concrete, demonstrable things, whereas approaches, mindsets, and processes are far more nebulous in comparison.

Still, I think that the most important facet of building a robust, resilient website is how you approach building it rather than what you build it with.

Many of the progressive web app demos use server-side and client-side rendering, which is great …but that aspect tends to get glossed over:

Browsers without service worker support should always be served a fall-back experience. In our demo, we fall back to basic static server-side rendering, but this is only one of many options.

I think it’s vital to not think in terms of older browsers “falling back” but to think in terms of newer browsers getting a turbo-boost. That may sound like a nit-picky semantic subtlety, but it’s actually a radical difference in mindset.

Many of the arguments I’ve heard against progressive enhancement—like Tom’s presentation at Responsive Field Day—talk about the burdensome overhead of having to bolt on functionality for older or less-capable browsers (even Jake has done this). But the whole point of progressive enhancement is that you start with the simplest possible functionality for the greatest number of users. If anything gets bolted on, it’s the more advanced functionality for the newer or more capable browsers.

So if your conception of progressive enhancement is that it’s an added extra, I think you really need to turn that thinking around. And that’s hard. It’s hard because you need to rewire some well-engrained pathways.

There is some precedent for this though. It was really, really hard to convince people to stop using tables for layout and start using CSS instead. That was a tall order—completely change the way you approach building on the web. But eventually we got there.

When Ethan came out with Responsive Web Design, it was an equally difficult pill to swallow, not because of the technologies involved—media queries, percentages, etc.—but because of the change in thinking that was required. But eventually we got there.

These kinds of fundamental changes are inevitably painful …at first. After years of building websites using tables for layout, creating your first CSS-based layout was demoralisingly difficult. But the second time was a bit easier. And the third time, easier still. Until eventually it just became normal.

Likewise with responsive design. After years of building fixed-width websites, trying to build in a fluid, flexible way was frustratingly hard. But the second time wasn’t quite as hard. And the third time …well, eventually it just became normal.

So if you’re used to thinking of the all-singing, all-dancing version of your site as the starting point, it’s going to be really, really hard to instead start by building the most basic, accessible version first and then work up to the all-singing, all-dancing version …at first. But eventually it will just become normal.

For now, though, it’s going to take work.

The recent redesign of Google+ is a true case study in building a performant, responsive, progressive site:

With server-side rendering we make sure that the user can begin reading as soon as the HTML is loaded, and no JavaScript needs to run in order to update the contents of the page. Once the page is loaded and the user clicks on a link, we do not want to perform a full round-trip to render everything again. This is where client-side rendering becomes important — we just need to fetch the data and the templates, and render the new page on the client. This involves lots of tradeoffs; so we used a framework that makes server-side and client-side rendering easy without the downside of having to implement everything twice — on the server and on the client.

This took work. Had they chosen to rely on client-side rendering alone, they could have built something quicker. But I think it was worth laying that solid foundation. And the next time they need to build something this way, it’s going to be less work. Eventually it just becomes normal.

But it all starts with thinking of the server-side rendering as the default. Server-side rendering is not a fallback; client-side rendering is an enhancement.
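
To make that mindset concrete, here’s a minimal sketch of my own (an illustration, not the Google+ team’s actual code). Every link works as a regular page load; client-side rendering is layered on top only when the browser is capable:

// A hypothetical enhancement layer: links work as normal page loads,
// and client-side rendering is added on top where it's supported.
// (A real version would fetch a page fragment, not a whole document.)
if ('fetch' in window && 'pushState' in history) {
  Array.prototype.forEach.call(document.querySelectorAll('main a'), function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      fetch(link.href)
        .then(function (response) {
          return response.text();
        })
        .then(function (html) {
          document.querySelector('main').innerHTML = html;
          history.pushState(null, '', link.href);
        })
        .catch(function () {
          window.location.href = link.href; // fall back to a full page load
        });
    });
  });
}

The crucial part is that final catch: if the enhancement fails for any reason, the browser just does what it would have done anyway.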

That’s exactly the kind of mindset that enables Jack Franklin to build robust, resilient websites:

Now we’ll build the React application entirely on the server, before adding the client-side JavaScript right at the end.

I had a chance to chat briefly with Jack at the Edge conference in London and I congratulated him on the launch of a GoCardless site that used exactly this technique. He told me that the decision to flip the switch and make it act as a single page app came right at the end of the project. Server-side rendering was the default; client-side rendering was added later.

The key to building modern, resilient, progressive sites doesn’t lie in browser technologies or frameworks; it lies in how we think about the task at hand; how we approach building from the ground up rather than the top down. Changing the way we fundamentally think about building for the web is inevitably going to be challenging …at first. But it will also be immensely rewarding.

My first Service Worker

I’ve made no secret of the fact that I’m really excited about Service Workers. I’m not alone. At the Coldfront conference in Copenhagen, pretty much every talk mentioned Service Workers.

Obviously I’m excited about what Service Workers enable: offline caching, background processes, push notifications, and all sorts of other goodies that allow the web to compete with native. But more than that, I’m really excited about the way that the Service Worker spec has been designed. Instead of being an all-or-nothing technology that you have to bet the farm on, it has been deliberately crafted to be used as an enhancement on top of existing sites (oh, how I wish that web components would follow a similar path).

I’ve got plenty of ideas on how Service Workers could be used to enhance a community site like The Session or the kind of events sites that we produce at Clearleft, but to begin with, I figured it would make sense to use my own personal site as a playground.

To start with, I’ve already conquered the first hurdle: serving my site over HTTPS. Service Workers require a secure connection. But you can play around with running a Service Worker locally if you run a copy of your site on localhost.

That’s how I started experimenting with Service Workers: serving on localhost, and stopping and starting my local Apache server with apachectl stop and apachectl start on the command line.

That reminds me of another interesting use case for Service Workers: it’s not just about the user’s network connection failing (say, going into a train tunnel); it’s also about your web server not always being available. Both scenarios are covered equally.

I would never have even attempted to start if it weren’t for the existing examples from people who have been generous enough to share their work:

Also, I knew that Jake was coming to FF Conf so if I got stumped, I could pester him. That’s exactly what ended up happening (thanks, Jake!).

So if you decide to play around with Service Workers, please, please share your experience.

It’s entirely up to you how you use Service Workers. I figured for a personal site like this, it would be nice to:

  1. Explicitly cache resources like CSS, JavaScript, and some images.
  2. Cache the homepage so it can be displayed even when the network connection fails.
  3. For other pages, have a fallback “offline” page to display when the network connection fails.

So now I’ve got a Service Worker up and running on adactio.com. It will only work in Chrome, Android, Opera, and the forthcoming version of Firefox …and that’s just fine. It’s an enhancement. As more and more browsers start supporting it, this Service Worker will become more and more useful.

How very future friendly!

The code

If you’re interested in the nitty-gritty of what my Service Worker is doing, read on. If, on the other hand, code is not your bag, now would be a good time to bow out.

If you want to jump straight to the finished code, here’s a gist. Feel free to take it, break it, copy it, improve it, or do anything else you want with it.

To start with, let’s establish exactly what a Service Worker is. I like this definition by Matt Gaunt:

A service worker is a script that is run by your browser in the background, separate from a web page, opening the door to features which don’t need a web page or user interaction.


From inside my site’s global JavaScript file—or I could do this from a script element inside my pages—I’m going to do a quick bit of feature detection for Service Workers. If the browser supports it, then I’m going to register my Service Worker by pointing to another JavaScript file, which sits at the root of my site:

if (navigator.serviceWorker) {
  navigator.serviceWorker.register('/serviceworker.js', {
    scope: '/'
  });
}

The serviceworker.js file sits in the root of my site so that it can act on any requests to my domain. If I put it somewhere like /js/serviceworker.js, then it would only be able to act on requests to the /js directory.
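
By the way, register returns a promise, so if you want to confirm that the Service Worker really is acting on the scope you intended, you could log the outcome (an optional extra; nothing on my site depends on it):

navigator.serviceWorker.register('/serviceworker.js', { scope: '/' })
  .then(function (registration) {
    // The browser accepted the Service Worker
    console.log('Registered with scope:', registration.scope);
  })
  .catch(function (error) {
    // Something went wrong (or the connection isn't secure)
    console.log('Registration failed:', error);
  });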

Once that file has been loaded, the installation of the Service Worker can begin. That means the script will be installed in the user’s browser …and it will live there even after the user has left my website.
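
Because it lives on like that, it’s useful to know how to get rid of it again while you’re experimenting. Here’s a little sketch you could run from the browser console (assuming a browser that supports getRegistrations):

// Remove every Service Worker registered for this site
navigator.serviceWorker.getRegistrations()
  .then(function (registrations) {
    registrations.forEach(function (registration) {
      registration.unregister();
    });
  });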


I’m making the installation of the Service Worker dependent on a function called updateStaticCache that will populate a cache with the files I want to store:

self.addEventListener('install', function (event) {
  event.waitUntil(updateStaticCache());
});

That updateStaticCache function will be used for storing items in a cache. I’m going to make sure that the cache has a version number in its name, exactly as described in the Guardian’s use case. That way, when I want to update the cache, I only need to update the version number.

var staticCacheName = 'static';
var version = 'v1::';

Here’s the updateStaticCache function that puts the items I want into the cache. I’m storing my JavaScript, my CSS, some images referenced in the CSS, the home page of my site, and a page for displaying when offline.

function updateStaticCache() {
  return caches.open(version + staticCacheName)
    .then(function (cache) {
      return cache.addAll([
        // my CSS, my JavaScript, some images, the home page, and the offline page
      ]);
    });
}

Because those items are part of the return statement for the Promise created by caches.open, the Service Worker won’t install until all of those items are in the cache. So you might want to keep them to a minimum.

You can still put other items in the cache, and not make them part of the return statement. That way, they’ll get added to the cache in their own good time, and the installation of the Service Worker won’t be delayed:

function updateStaticCache() {
  return caches.open(version + staticCacheName)
    .then(function (cache) {
      cache.addAll([ /* these items can arrive in their own good time */ ]);
      return cache.addAll([ /* these must be cached before installation completes */ ]);
    });
}

Another option is to use completely different caches, but I’ve decided to just use one cache for now.
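
If you did want to go down the separate-caches route, it might look something like this (a hypothetical sketch; the cache names are my own invention):

// Hypothetical: one cache for static files, another for pages
var filesCacheName = version + 'files';
var pagesCacheName = version + 'pages';

caches.open(filesCacheName).then(function (cache) {
  return cache.addAll([ /* CSS, JavaScript, images */ ]);
});
caches.open(pagesCacheName).then(function (cache) {
  return cache.addAll([ /* the home page, the offline page */ ]);
});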


When the activate event fires, it’s a good opportunity to clean up any caches that are out of date (by looking for anything that doesn’t match the current version number). I copied this straight from Nicolas’s code:

self.addEventListener('activate', function (event) {
  event.waitUntil(
    caches.keys()
      .then(function (keys) {
        return Promise.all(keys
          .filter(function (key) {
            return key.indexOf(version) !== 0;
          })
          .map(function (key) {
            return caches.delete(key);
          })
        );
      })
  );
});


The fetch event is fired every time the browser is going to request a file from my site. The magic of Service Worker is that I can intercept that request before it happens and decide what to do with it:

self.addEventListener('fetch', function (event) {
  var request = event.request;

POST requests

For a start, I’m going to just back off from any requests that aren’t GET requests:

if (request.method !== 'GET') {
  event.respondWith(fetch(request));
  return;
}

That’s basically just replicating what the browser would do anyway. But even here I could decide to fall back to my offline page if the request doesn’t succeed. I do that using a catch clause appended to the fetch statement:

if (request.method !== 'GET') {
  event.respondWith(
    fetch(request)
      .catch(function () {
        return caches.match('/offline');
      })
  );
  return;
}

HTML requests

I’m going to treat requests for pages differently to requests for files. If the browser is requesting a page, then here’s the order I want:

  1. Try fetching the page from the network first.
  2. If that doesn’t work, try looking for the page in the cache.
  3. If all else fails, show the offline page.

First of all, I need to test to see if the request is for an HTML document. I’m doing this by sniffing the Accept headers, which probably isn’t the safest method:

if (request.headers.get('Accept').indexOf('text/html') !== -1) {

Now I try to fetch the page from the network:

fetch(request, { credentials: 'include' })

If the network is working fine, this will return the response from the site and I’ll pass that along.

But if that doesn’t work, I’m going to look for a match in the cache. Time for a catch clause:

.catch(function () {
  return caches.match(request);
})

So now the whole event.respondWith statement looks like this:

event.respondWith(
  fetch(request, { credentials: 'include' })
    .catch(function () {
      return caches.match(request);
    })
);

Finally, I need to take care of the situation when the page can’t be fetched from the network and it can’t be found in the cache.

Now, I first tried to do this by adding a catch clause to the caches.match statement, like this:

return caches.match(request)
  .catch(function () {
    return caches.match('/offline');
  });

That didn’t work and for the life of me, I couldn’t figure out why. Then Jake set me straight. It turns out that caches.match will always return a response …even if that response is undefined. So a catch clause will never be triggered. Instead I need to return the offline page if the response from the cache is falsey:

return caches.match(request)
  .then(function (response) {
    return response || caches.match('/offline');
  });

With that cleared up, my code for handling HTML requests looks like this:

fetch(request, { credentials: 'include' })
  .catch(function () {
    return caches.match(request)
      .then(function (response) {
        return response || caches.match('/offline');
      });
  })

Actually, there’s one more thing I’m doing with HTML requests. If the network request succeeds, I stash the response in the cache.

Well, that’s not exactly true. I stash a copy of the response in the cache. That’s because you’re only allowed to read the value of a response once. So if I want to do anything with it, I have to clone it:

var copy = response.clone();
caches.open(version + staticCacheName)
  .then(function (cache) {
    cache.put(request, copy);
  });

I do that right before returning the actual response. Here’s how it fits together:

if (request.headers.get('Accept').indexOf('text/html') !== -1) {
  event.respondWith(
    fetch(request, { credentials: 'include' })
      .then(function (response) {
        // Stash a copy of this page in the cache
        var copy = response.clone();
        caches.open(version + staticCacheName)
          .then(function (cache) {
            cache.put(request, copy);
          });
        return response;
      })
      .catch(function () {
        return caches.match(request)
          .then(function (response) {
            return response || caches.match('/offline');
          });
      })
  );
  return;
}

Okay. So that’s requests for pages taken care of.

File requests

I want to handle requests for files differently to requests for pages. Here’s my list of priorities:

  1. Look for the file in the cache first.
  2. If that doesn’t work, make a network request.
  3. If all else fails, and it’s a request for an image, show a placeholder.

Step one: try getting the file from the cache:

caches.match(request)

Step two: if that didn’t work, go out to the network. Now remember, I can’t use a catch clause here, because caches.match will always return something: either a response or undefined. So here’s what I do:

.then(function (response) {
  return response || fetch(request);
})

Now that I’m back to dealing with a fetch statement, I can use a catch clause to take care of the third and final step: if the network request doesn’t succeed, check to see if the request was for an image, and if so, display a placeholder:

.catch(function () {
  if (request.headers.get('Accept').indexOf('image') !== -1) {
    return new Response('<svg>...</svg>', { headers: { 'Content-Type': 'image/svg+xml' }});
  }
})

I could point to a placeholder image in the cache, but I’ve decided to send an SVG on the fly using a new Response object.

Here’s how the whole thing looks:

event.respondWith(
  caches.match(request)
    .then(function (response) {
      return response || fetch(request)
        .catch(function () {
          if (request.headers.get('Accept').indexOf('image') !== -1) {
            return new Response('<svg>...</svg>', { headers: { 'Content-Type': 'image/svg+xml' }});
          }
        });
    })
);

The overall shape of my code to handle fetch events now looks like this:

self.addEventListener('fetch', function (event) {
  var request = event.request;
  // Non-GET requests
  if (request.method !== 'GET') {
    // ...fetch the request, falling back to the offline page
    return;
  }
  // HTML requests
  if (request.headers.get('Accept').indexOf('text/html') !== -1) {
    // ...try the network first, then the cache, then the offline page
    return;
  }
  // Non-HTML requests
  // ...try the cache first, then the network, then an image placeholder
});

Feel free to peruse the code.

Next steps

The code I’m running now is fine for a first stab, but there’s room for improvement.

Right now I’m stashing any HTML pages the user visits into the cache. I don’t think that will get out of control—I imagine most people only ever visit a handful of pages on my site. But there’s the chance that the cache could get quite bloated. Ideally I’d have some way of keeping the cache nice and lean.

I was thinking: maybe I should have a separate cache for HTML pages, and limit the number in that cache to, say, 20 or 30 items. Every time I push something new into that cache, I could pop the oldest item out.

I could imagine doing something similar for images: keeping a cache of just the most recent 10 or 20.

If you fancy having a go at coding that up, let me know.
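
In case it’s a useful starting point, here’s an untested sketch of how that trimming might work (the trimCache function is my own invention):

// Hypothetical: keep a cache down to a maximum number of items
// by deleting the oldest entries first.
function trimCache(cacheName, maxItems) {
  caches.open(cacheName)
    .then(function (cache) {
      cache.keys()
        .then(function (keys) {
          if (keys.length > maxItems) {
            // keys come back in insertion order, so delete the first one
            cache.delete(keys[0])
              .then(function () {
                trimCache(cacheName, maxItems);
              });
          }
        });
    });
}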

Lessons learned

There were a few gotchas along the way. I already mentioned the fact that caches.match will always return something so you can’t use catch clauses to handle situations where a file isn’t found in the cache.

Something else worth noting is that this:

fetch(request)

…is functionally equivalent to this:

fetch(request)
  .then(function (response) {
    return response;
  })

That’s probably obvious but it took me a while to realise. Likewise:

caches.match(request)

…is the same as:

caches.match(request)
  .then(function (response) {
    return response;
  })

Here’s another thing… you’ll notice that sometimes I’ve used:

fetch(request);

…but sometimes I’ve used:

fetch(request, { credentials: 'include' });

That’s because, by default, a fetch request doesn’t include cookies. That’s fine if the request is for a static file, but if it’s for a potentially-dynamic HTML page, you probably want to make sure that the Service Worker request is no different from a regular browser request. You can do that by passing through that second (optional) argument.

But probably the trickiest thing is getting your head around the idea of Promises. Writing JavaScript is generally a fairly procedural affair, but once you start dealing with then clauses, you have to get to grips with the fact that the contents of those clauses resolve asynchronously. So statements written after the then clause will probably execute before the code inside the clause. It’s kind of hard to explain, but if you find problems with your Service Worker code, check to see if that’s the cause.
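
Here’s a contrived illustration of that ordering (the /example URL is just a stand-in):

console.log('one');
fetch('/example')
  .then(function (response) {
    // This runs later, once the response has arrived
    console.log('two');
  });
// This runs straight away, before the then clause above
console.log('three');
// The console shows: one, three, two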

And remember, please share your code and your gotchas: it’s early days for Service Workers so every implementation counts.


I got some very useful feedback from Jake after I published this…

Expires headers

By default, JavaScript files on my server are cached for a month. But a Service Worker script probably shouldn’t be cached at all (or cached for a very, very short time). I’ve updated my .htaccess rules accordingly:

<FilesMatch "serviceworker.js">
  ExpiresDefault "now"
</FilesMatch>

If a request is initiated by the browser, I don’t need to say:

fetch(request, { credentials: 'include' });

It’s enough to just say:

fetch(request);

I set the scope parameter of my Service Worker to be “/” …but because the Service Worker is sitting in the root directory anyway, I don’t really need to do that. I could just register it with:

if (navigator.serviceWorker) {
  navigator.serviceWorker.register('/serviceworker.js');
}

If, on the other hand, the Service Worker file were sitting in a folder, but I wanted it to act on the whole site, then I would need to specify the scope:

if (navigator.serviceWorker) {
  navigator.serviceWorker.register('/path/to/serviceworker.js', {
    scope: '/'
  });
}

…and I’d also need to send a special header. So it’s probably easiest to just put Service Worker scripts in the root directory.
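
For the record, that special header is Service-Worker-Allowed. In Apache, the rule might look something like this (a sketch, assuming mod_headers is enabled):

<FilesMatch "serviceworker.js">
  Header set Service-Worker-Allowed "/"
</FilesMatch>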


It was only last week that Ellen and I were brainstorming ideas for a combined workshop. Our enthusiasm got the better of us, and we said “Let’s just do it!” Before we could think better of it, the room was booked, and the calendar invitations were sent.


The topic was “story.”

No wait, maybe it was …”narrative.”

That’s not quite right either.

“Content,” perhaps?

Basically, here’s the issue: at some point everyone at Clearleft needs to communicate something by telling a story. It might be a blog post. It might be a conference talk. It might be a proposal for a potential client. It might be a case study of our work. Whatever form it might take, it involves getting a message across …using words. Words are hard. We wanted to make them a little bit easier.

We did two workshops. Ellen’s was yesterday. Mine was today. They were both just about two hours in length.

Get out of my head!

Ellen’s workshop was all about getting thoughts out of your head and onto paper. But before we could even start to do that, we had to confront our first adversary: the inner critic.

You know the inner critic. It’s that voice inside you that says “You’ve got nothing new to say”, or “You’re rubbish at writing.” Ellen encouraged each of us to drag this inner critic out into the light—much like Paul Ford did with his AnxietyBox.

Each of us drew a cartoon of our inner critic, complete with speech bubbles of things our inner critic says to us.

Drawing our inner critics

In a bizarre coincidence, Chloe and I had exactly the same inner critic, complete with top hat and monocle.

With that foe vanquished, we proceeded with a mind map. The idea was to just dump everything out of our heads and onto paper, without worrying about what order to arrange things in.

I found it to be an immensely valuable exercise. Whenever I’ve tried to do this before, I’d open up a blank text file and start jotting stuff down. But because of the linear nature of a text file, there’s still going to be an order to what gets jotted down; without meaning to, I’ve imposed some kind of priority onto the still-unformed thoughts. Using a mind map allowed me to get everything down first, and then form the connections later.

mind mapping

There were plenty of other exercises, but the other one that really struck me was a simple framework of five questions. Whatever it is that you’re trying to say, write down the answers to these questions about your audience:

  1. What are they sceptical about?
  2. What problems do they have?
  3. What’s different now that you’ve communicated your message?
  4. Paint a pretty picture of life for them now that you’ve done that.
  5. Finally, what do they need to do next?

They’re straightforward questions, but the answers can really help to make sure you’re covering everything you need to.

There were many more exercises, and by the end of the two hours, everyone had masses of raw material, albeit in an unstructured form. My workshop was supposed to help them take that content and give it some kind of shape.

The structure of stories

Ellen and I have been enjoying some great philosophical discussions about exactly what a story is, and how it differs from a narrative structure, or a plot. I really love Ellen’s working definition: Narrative. In Space. Over Time.

This led me to think that there’s a lot that we can borrow from the world of storytelling—films, novels, fairy tales—not necessarily about the stories themselves, but the kind of narrative structures we could use to tell those stories. After all, the story itself is often the same one that’s been told time and time again—The Hero’s Journey, or some variation thereof.

So I was interested in separating the plot of a story from the narrative device used to tell the story.

To start with, I gave some examples of well-known stories with relatively straightforward plots:

  • Star Wars,
  • Little Red Riding Hood,
  • Your CV,
  • Jurassic Park, and
  • Ghostbusters.

I asked everyone to take a story (either one from that list, or another one of their choosing) and write the plot down on post-it notes, one plot point per post-it. Before long, the walls were covered with post-its detailing the plot lines of:

  • Robocop,
  • Toy Story,
  • Back To The Future,
  • Elf,
  • E.T.,
  • The Three Little Pigs, and
  • Pretty Woman.

Okay. That’s plot. Next we looked at narrative frameworks.

Narrative frameworks as Oblique Strategies.

Flashback

Begin at a crucial moment, then back up to explain how you ended up there.

e.g. Citizen Kane “Rosebud!”

Dialogue

Instead of describing the action directly, have characters tell it to one another.

e.g. The Dialogues of Plato …or The Breakfast Club (or one of my favourite sci-fi short stories).

In Media Res

Begin in the middle of the action. No exposition allowed, but you can drop hints.

e.g. Mad Max: Fury Road (or Star Wars, if it didn’t have the opening crawl).

Backstory

Begin with a looooong zooooom into the past before taking up the story today.

e.g. 2001: A Space Odyssey.

Distancing Effect

Just the facts with no embellishment.

e.g. A police report.

You get the idea.

In a random draw, everyone received a card with a narrative device on it. Now they had to retell the story they chose using that framing. That led to some great results:

  • Toy Story, retold as a conversation between Andy and his psychiatrist (dialogue),
  • E.T., retold as a missing person’s report on an alien planet (distancing effect),
  • Elf, retold with an introduction about the very first Christmas (backstory),
  • Robocop, retold with Murphy already a cyborg, remembering his past (flashback),
  • The Three Little Pigs, retold with the wolf already at the door and no explanation as to why there’s straw everywhere (in media res).

Once everyone had the hang of it, I asked them to revisit their mind maps and other materials from the previous day’s workshop. Next, they arranged the “chunks” of their story into a linear narrative …but without worrying about getting it right—it wasn’t going to stay linear for long. Then everyone was once again given a narrative structure, and tried rearranging and restructuring their story to use that framework. If something valuable came out of that, great! If not, well, it was still a fun creative exercise.

And that was pretty much it. I had no idea what I was doing, but it didn’t matter. It wasn’t really about me. It was about helping others take their existing material and play with it.

That said, I’m glad I finally got this process out into the world in some kind of semi-formalised way. I’ve been preparing talks and articles using these narrative exercises for a while, but this workshop was just the motivation I needed to put some structure on the process.

I think I might try to create a proper deck of cards—along the lines of Brian Eno’s Oblique Strategies or Stephen Anderson’s Mental Notes—so that everyone has the option of injecting a random narrative structural idea into the mix whenever they’re stuck.

At the very least, it would be a distraction from listening to that pesky inner critic.