Tags: thesession



Friday, August 3rd, 2018

Greater expectations

I got an intriguing email recently from someone who’s a member of The Session, the community website about Irish traditional music that I run. They said:

When I recently joined, I used my tablet to join. Somewhere I was able to download The Session app onto my tablet.

But there is no native app for The Session. Although, as it’s a site that I built, it is, of course, a progressive web app.

They went on to say:

I wanted to put the app on my phone but I can’t find the app to download it. Can I have the app on more than one device? If so, where is it available?

I replied saying that yes, you can absolutely have it on more than one device:

But you don’t find The Session app in the app store. Instead you go to the website https://thesession.org and then add it to your home screen from your browser.

My guess is that this person had added The Session to the home screen of their Android tablet, probably following the “add to home screen” prompt. I recently added some code to use the window.beforeinstallprompt event so that the “add to home screen” prompt would only be shown to visitors who sign up or log in to The Session—a good indicator of engagement, I reckon, and it should reduce the chance of the prompt being dismissed out of hand.
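The gist of that code, sketched below. The `isLoggedIn` helper is hypothetical; substitute however your own site tracks signed-in members.

```javascript
// Defer the "add to home screen" prompt until the visitor has shown
// some engagement (signing up or logging in).

let deferredPrompt = null;

function isLoggedIn() {
  // Hypothetical check: e.g. look for a session cookie.
  const cookies = typeof document !== 'undefined' ? document.cookie : '';
  return /\blogged_in=1\b/.test(cookies);
}

function handleBeforeInstallPrompt(event) {
  // Stop the browser from showing the prompt straight away…
  event.preventDefault();
  deferredPrompt = event;
  // …and only re-trigger it for engaged (logged in) visitors.
  if (isLoggedIn()) {
    deferredPrompt.prompt();
  }
}

// Only attach the listener in a browser context.
if (typeof window !== 'undefined') {
  window.addEventListener('beforeinstallprompt', handleBeforeInstallPrompt);
}
```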

So this person added The Session to their home screen—probably as a result of being prompted—and then used it just like any other app. At some point, they didn’t even remember how the app got installed:

Success! I did it. Thanks. My problem was I was looking for an app to download.

On the one hand, this is kind of great: here’s an example where, in the user’s mind, there’s literally no difference between the experience of using a progressive web app and using a native app. Win!

But on the other hand, the expectation is still that apps are to be found in an app store, not on the web. This expectation is something I wrote about recently (and Justin wrote a response to that post). I finished by saying:

Perhaps the inertia we think we’re battling against isn’t such a problem as long as we give people a fast, reliable, engaging experience.

When this member of The Session said “My problem was I was looking for an app to download”, I responded by saying:

Well, I take that as a compliment—the fact that once the site is added to your home screen, it feels just like a native app. :-)

And they said:

Yes, it does!

Tuesday, July 31st, 2018

abc to SVG | CSS-Tricks

Aw, this is so nice! Chris points to the way that The Session generates sheet music from abc text:

The SVG conversion is made possible entirely in JavaScript by an open source library. That’s the progressive enhancement part. Store and ship the basic format, and let the browser enhance the experience, if it can (it can).

Here’s another way of thinking of it: I was contacted by a blind user of The Session who hadn’t come across abc notation before. Once they realised how it worked, they said it was like having alt text for sheet music! 🤯
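A sketch of how that enhancement fits together, assuming a library like abcjs (which exposes a `renderAbc` function); the tune itself and the element id are invented for illustration:

```javascript
// Store and ship the plain abc text; let the browser enhance it
// into SVG sheet music if it can.

const abcSource = [
  'X:1',
  'T:Example Reel', // an invented tune, just to show the shape
  'M:4/4',
  'L:1/8',
  'K:D',
  'DFAF d2cd|BGBd cAFA|',
].join('\n');

if (typeof window !== 'undefined' && window.ABCJS) {
  // Enhance: render the abc text as SVG sheet music into an
  // element with the (assumed) id "notation".
  window.ABCJS.renderAbc('notation', abcSource);
}
// If the library (or JavaScript itself) never arrives, the abc text
// is still there for anyone, or any screen reader, to read.
```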

Saturday, March 31st, 2018

Sessions Map

This is nifty—a map of all the Irish music sessions and events happening around the world, using the data from TheSession.org.

If you’re interested in using data from The Session, there’s a read-only API and regularly-updated data dumps.

Thursday, February 11th, 2016

Banjos and Discrete Technologies | stevebenford

An examination of how sites like The Session are meshing with older ideas of traditional Irish music:

There is a very interesting tension at play here – one that speaks directly to the design of new technologies. On the one hand, Irish musicians appear to be enthusiastically adopting digital media to establish a common repertoire of tunes, while on the other the actual performance of these tunes in a live session is governed by a strong etiquette that emphasizes the importance of playing by ear.

There’s an accompanying paper called Supporting Traditional Music-Making: Designing for Situated Discretion (PDF).

Sunday, February 7th, 2016


I’ve spent the last week implementing a new feature over at The Session. I had forgotten how enjoyable it is to get completely immersed in a personal project, thinking about everything from database structures right through to CSS animations.

I won’t bore you with the details of this particular feature—which is really only of interest if you play traditional Irish music—but I thought I’d make note of one little bit of progressive enhancement.

One of the interfaces needed for this feature was a form to re-order items in a list. So I thought to myself, “what’s the simplest technology to enable this functionality?” I came up with a series of select elements within a form.


It’s not the nicest of interfaces, but it works pretty much everywhere. Once I had built that—and the back-end functionality required to make it all work—I could think about how to enhance it.

I brought it up at the weekly Clearleft front-end pow-wow (featuring special guest Jack Franklin). I figured that drag’n’drop would be the obvious enhancement, but I didn’t know if there were any “go-to” libraries for implementing it; I haven’t paid much attention to the state of drag’n’drop since the old IE implementation was added to HTML5.

Nobody had any particular recommendations so I did a bit of searching. I came across Dragula, which looked pretty solid. It’s made by the super-smart Nicolás Bevacqua, who I know shares my feelings about progressive enhancement. To my delight, I was able to get it working within minutes.

Drag and drop

There’s a little bit of mustard-cutting going on: does the dragula object exist, and does the browser understand querySelector? If so, the select elements are hidden and the drag’n’drop is enabled. Then, whenever an item in the list is dragged and dropped, the corresponding (hidden) select element is updated …so that time I spent making the simpler non-drag’n’drop interface was time well spent: I didn’t need to do anything extra on the server to handle the data from the updated interface.
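A sketch of what that enhancement might look like. Dragula’s API is `dragula(containers)`, returning a “drake” that emits a `drop` event; the markup assumptions here (a `.sortable` list of `li` elements, each containing a hidden select) are invented for illustration.

```javascript
function positionsAfterDrop(itemIds) {
  // Pure helper: given item ids in their new visual order, return the
  // (1-based) position each item's select should now hold.
  const positions = {};
  itemIds.forEach(function (id, index) {
    positions[id] = index + 1;
  });
  return positions;
}

// Cut the mustard: only enhance if the library loaded and the browser
// understands querySelector.
if (typeof window !== 'undefined' && window.dragula && document.querySelector) {
  const list = document.querySelector('.sortable');
  list.className += ' dragging-enabled'; // CSS can now hide the selects

  window.dragula([list]).on('drop', function () {
    const items = Array.prototype.slice.call(list.querySelectorAll('li'));
    const positions = positionsAfterDrop(items.map(function (li) {
      return li.id;
    }));
    // Update each hidden select, so the server-side code needs no changes.
    items.forEach(function (li) {
      li.querySelector('select').value = positions[li.id];
    });
  });
}
```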

It’s a simple example but it demonstrates the benefits of starting with the simpler universal interface before upgrading to the smoother experience.

Sunday, January 24th, 2016

Words of welcome

For a while now, The Session has had some little on-boarding touches to make sure that new members are eased into the culture of this traditional Irish music community.

First off, new members are encouraged to add a little bit about themselves so that there’s some context when they start making contributions.

Welcome! You are now a member of The Session. Now, how about sharing a bit more about yourself: where you're from, what instrument(s) you play, etc.

Secondly, new members can’t kick off a brand new discussion straight away.

Woah there! I appreciate your eagerness to post your first discussion, but seeing as you just joined The Session, maybe it would be better if you wait a little bit first. Take a look around at the existing discussions, have a read of the house rules and get a feel for how things work around here.

Likewise, they can’t post a comment straight away. They need to wait an hour between signing up and posting their first comment. Instead of seeing a comment form, they see a countdown.

Welcome to The Session, Testy McTest! You'll be able to add your first comment in forty-seven minutes.

Finally, when they do make their first submission—whether it’s a discussion, an event, a session, or a tune—the interface displays a few extra messages of encouragement and care.

Add a tune, Step 1 of 4: Tune Details. As this is your first tune submission, please take extra care. First, provide some basic details about the tune you want to add.

But I realised that all of these custom messages were very one-sided. They were always displayed to the new member. It’s equally important that existing members treat any newcomers with respect.

Now on some discussions, an extra message is displayed to existing members right before the comment form. The logic is straightforward:

  1. If this is a discussion added by a new member,
  2. who hasn’t yet added any comments anywhere,
  3. and this discussion has no responses so far,
  4. and anyone other than that member is viewing the page,
  5. then display a message asking for help making this new member feel welcome.
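That logic could be sketched as a single function; the property names here are invented, not The Session’s actual schema.

```javascript
// Should an existing member see the "please welcome this newcomer"
// message above the comment form?
function shouldShowWelcomeNudge(discussion, viewer) {
  return (
    discussion.author.isNewMember &&        // 1. added by a new member,
    discussion.author.commentCount === 0 && // 2. who hasn't commented anywhere,
    discussion.responseCount === 0 &&       // 3. with no responses so far,
    viewer.id !== discussion.author.id      // 4. viewed by somebody else.
  );
}
```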

This is the first ever post by FourCourseChaos. Please help in making them feel welcome here at The Session.

It’s a small addition, but it makes a difference.

No intricate JavaScript; no smooth animations; just some words on a screen encouraging a human connection.

Thursday, August 13th, 2015

The Infinite Trad Session

Okay, this is kind of nuts: some researchers have seeded a neural network with all the tunes from The Session. Some of the results are surprisingly okay. It’s certainly a fascinating project.

Monday, May 25th, 2015

The Long Web

A presentation on long-term thinking and the web, from An Event Apart 2013.

Hello Austin! Good morning!

Good morning!

It is an absolute pleasure to be here. I love coming to Austin. Who here is from Austin? Nice, excellent; I like your town a lot. I’ve been here many times but usually it’s been SXSW so that doesn’t really count, so I’m very happy to be here when SXSW isn’t on, so I get to actually go to all the places I want to go to.

Anyway, so, today I want to talk to you about the web because I’m a big fan. I like the web a lot. It’s this wonderful, giant, big, huge mess, sprawling mess of a web, this beautiful chaotic thing. It’s almost too much to grasp, I think, the sheer scale of it. So what I’m going to do is I’m going to just zoom in on one particular website to try and talk about the web in general by focusing in on one website in particular, and this website we’re going to look at, it’s made up of a number of formats, like most websites are; a number of different files and those files are different formats, and some of these formats will be familiar to you, some of them may be less familiar. But what’s interesting about all of these particular formats is that these are all text formats. They’re not binary formats. They’re made of text; human-readable text files, and if we’re going to look at text files on the web, I think it’s interesting to look at where text came from, as in: text that we use to communicate with.

Text really started here with cuneiform clay tablets, these scratches and markings. These are maybe about four thousand years old. If you ever get the chance to see these, they’re beautifully intricate but quite small. The Pergamon Museum in Berlin has a wonderful collection. Quite different from any text we would use today, but this evolved; we got hieratic text, later demotic and that led to Greek and you can see how, as it evolved, it starts to get closer to the text that we would recognise today, we start to get closer to the letter forms that we would use for the Western alphabet, for example, in these kind of texts. So this is something from about 330AD and at this point, knowledge is being stored in texts and this is the amazing thing about text: it’s kind of the next step on from language.

Language is amazing because it allows us to communicate ideas from brain to brain. This way that I have an idea in my head and by flapping the meat in my mouth and passing air over my vocal cords, I can vibrate the air and as long as you can decode the format that I’m speaking in, in this case English, then I can get an idea from my brain into your brain. That’s pretty remarkable. And what text allows us to do is to do that over time, so I can get an idea out from my brain onto a text format and then you could come along, read that text format, again, once you can decode it, once you can decode the codebase, in this case it would be English again. You could get that idea transferred to your brain, but the difference being that you could be reading this months, years, decades or centuries after it was written. And this allows us to store and pass on ideas over time, which is remarkable.

So, about this time, ideas are being stored in text and the texts themselves are being stored in these remarkable repositories of information. Like, for example, the Library of Alexandria, which is where a lot of the world’s knowledge was stored. Libraries: wonderful things. Right up there with the web in Things I Am A Fan Of. But the Library of Alexandria is most famous for one thing, and it’s not so much the texts that were stored there; it’s famous for the fact that it burned down. We’re not even clear exactly when it burned down. Maybe around 391AD. Who knows? The fact that it’s famous for being destroyed means all the knowledge about it is also destroyed, but we don’t know what was stored there. We don’t know how much knowledge was lost to humankind, whether if the Library of Alexandria still existed, would we have colonies on Mars at this point? Who knows? That timeline was shut off to us.

It was a pretty bad time for knowledge, for text files, for Europe in general, although off the western coast of Europe, Ireland was where some monks were starting to store knowledge, store some of these text files, mostly because it was just such an inaccessible place that marauding hordes were unlikely to visit. This has been written about and talked about and maybe over-exaggerated. There was a book called How the Irish saved civilisation by Thomas Cahill, which may have exaggerated the point but still, what did get saved tended to get saved here on the western coast of Europe.

Now, Ireland itself was beset with its own problems centuries later. In the mid-nineteenth century came the Irish Famine, which had an absolutely devastating effect around 1846, 1847. Millions died and millions also left the country; they started this exodus, which never really stopped. The population of Ireland has been decreasing since the middle of the nineteenth century: it’s one of the few countries in Europe where there has been a net loss of population. As you know, they all come here, right? They come to America, they go to other countries. New York, Philadelphia, Boston, Chicago: these kinds of places are where the Irish ended up.

Like this guy. This is Francis O’Neill and he comes from County Cork, which is where I’m from as well. He was born the same time as the famine was going on, 1848, and like many others, he left the country later. He had all sorts of adventures: he was shipwrecked, travelled across the Wild West before settling in Chicago, and it was in Chicago that he became Chief of Police. He was also a fan of traditional Irish music and I can kind of relate to this; maybe he wasn’t so much a fan when he was back in Ireland, but once you leave the country, suddenly you’re like the most Irish person ever! You know what I’m talking about, Boston! He wanted to capture some of the Irish music, so what he ended up doing was, if you were in Chicago around this time in the nineteenth century, and you could play the pipes or flute or fiddle, you were pretty much guaranteed a job on the Chicago Police Force. You know the clichéd old movies, where…”Oh, it’s always the Irish Coppers on the beat.” There’s actually some truth to that: it was pretty much entirely made up of Irish policemen.

So he started, he didn’t play music himself, but he started transcribing the tunes that these musicians/police officers would play to him, and he collected these tunes. He collected them into a book: O’Neill’s 1001: jigs, reels, hornpipes, the dance music of Ireland. This was published about the start of the twentieth century. And I remember when I was learning Irish music, this book was still very much the place you’d go to, to find sheet music for tunes. In fact, we just called it The Book. If somebody played a tune and you wanted to learn it, you might ask, “Oh, is that tune in The Book?” and they would know what you mean. You meant, this book.

So what ended up happening was actually back home in Ireland, the music was kind of dying out, because people were leaving the country and the fact that the knowledge got stored in this book meant that the music could be revived, because it had been noted down. It contributed to a revival.

Now, the music was stored in sheet music, which is an interesting format, because it’s not exactly a text format: it’s a symbolic set in the same way as the alphabet is, but it’s kind of more like a graphic than a text format, I think: I picture it that way. So this is how sheet music looks, which has its own history of being noted down; it goes back to medieval times essentially.

There was an interesting development with the transmission of music when the internet came along, or when the web came along, but even before that, email lists, bulletin boards; people want to transfer music from one computer to another, but you can’t send an image file, that’s going to be way too big. You certainly can’t send an audio file, even a midi file; that’s certainly way too big to be sending over the kind of networks we had back then, modems with really slow dial-up. So there was this new format that was created in 1990, 1991 by John Chambers and it’s called ABC where it is purely text which is really nice and lightweight and cheap to send. And you can see it’s almost JSON-like to begin with, with the metadata: what’s the title of the tune, information about how it should be played. And then the notes themselves are literally just letters of the alphabet, like any other text format. It’s re-using the Western alphabet, where lower case and upper case mean different octaves; we’ve got things like the vertical pipe symbol to denote bar lines, so I’ll try and demonstrate what this tune is that we’re looking at here, if you’ll bear with me…
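To make that concrete, here’s what an ABC file looks like; this is an invented example (not the tune from the talk), and `%` marks a comment in abc:

```
X:1                  % index number
T:An Example Jig     % title (the metadata is almost JSON-like)
M:6/8                % metre
L:1/8                % default note length
K:G                  % key
GAG GAB|c2c BAB|     % the notes: letters of the alphabet, with
g2g faf|e2e d3|      % lower and upper case for different octaves,
                     % and | marking the bar lines
```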

Let’s see, so this is Chief O’Neill’s favourite….(plays mandolin)….So, that’s a hornpipe. (Applause.) Thank you! Thank you very much. That’s a hornpipe, and all the information (the notes of the tune at least, maybe not the nuances of how it would be played) is encoded into this text file, which is pretty remarkable.

So, what you could do over email lists, bulletin boards and then later the web was you could transmit just text files like this and the person at the other end could unpack that, could decode it, either by looking directly at it or using software on their machine that could convert the ABC file into sheet music, or maybe even a Midi file.

When I was first getting into the web, I was living in Germany in the nineties and like I was saying, it’s once you leave Ireland that you start to get really into the…oh, I’m so Irish…so I’m getting really into Irish folk music because I’m not living in Ireland any more and I decided I wanted to put together a site about Irish music with tunes, so I made the site called thesession.org and the idea would be that I would put on a tune every week, in ABC format, but also converted to sheet music and so on. So, I launched it in, gosh, ‘98, ‘97: I can’t remember, and initially it looked like this, please forgive me, this is fifteen years old now. And it was nice and it actually started to get a bit of a following because it had this weekly structure, this weekly release cycle.

Of course, there was an issue here and that was with scaling it, because I’m putting out a tune every week but there’s only so many tunes I know, so eventually I hit that wall of that’s it: I’ve run out of tunes I know. So, I decided I was going to completely revamp the site to make it much more of a write/read kind of site where people could contribute tunes in ABC format and other things like location of Irish music events or sessions and recordings and stuff. So, I re-launched the site in 2001 as much more of a community site where everything was being contributed by the people. So, people could contribute tunes in ABC format and then I would convert it into sheet music, so I was a bit of a bottleneck there; I was doing it manually: every time somebody contributed a tune I had to convert the whole lot.

I was really pleased with this site initially; I was really, really proud of it because it was really becoming the go-to place on the web for Irish music. But like I said, that was 2001 and I kind of let it stagnate, which is a real shame. So as browsers evolved and the web evolved, there was so much more I could have been doing with the site and I wasn’t doing it and The Session was simultaneously my proudest achievement and my most shameful achievement because I was like, “Oh, it could have been so much better!”

For years I was saying, I really need to re-launch The Session, I need to re-vamp it, but it was this huge monolithic task and I really wasn’t looking forward to it. I think for two years running, on my New Year’s Resolution I had “Re-launch The Session”. But I finally got round to it last year, so this is after ten years of the previous design, the previous way that things were done, I re-launched the site. And this is what I wanted to look at: how I approached that from the long-term view, as in, it’s a site that’s been online for over a decade. It’s a site that will be online for hopefully much longer than a decade. And how I evaluated technologies and how I evaluated approaches to building a site for the long term, not just for the here and now.

Of course, one of the things when you’re building a website today that’s clear is I can no longer rely on the fact that somebody’s just going to be looking at it on a desktop or a laptop computer, which would have been a safe bet maybe ten years ago. Because these days, of course, the site has got to go everywhere because people will use whatever device they have to hand to view the site, to use the site. Exactly what Karen was talking about yesterday.

Luckily, I’ve got a whole bunch of devices I can test on because at Clearleft we’ve got this Device Lab which I was able to use to do some testing; it’s really handy. This Device Lab also, it’s open to the public. Anybody can come by and test on these devices. So, we’re in Brighton in the UK and this started because some friends of mine in Portland had this idea that they were going to have this Open Device Lab where people could come and use it and I thought, that is a great idea, and every time I saw them I’d say to them, “Oh, how’s the Open Device Lab thing going?” and they’d say “Great. We just signed the paperwork to get funded as a non-profit and next we’re sending our lawyer to do this…” It’s never going to actually launch, is it?

So, at Clearleft, I was gathering together a few devices, a handful of crappy devices from second-hand shops and I had them on a table for testing responsive designs with, but I felt really bad that they were just sitting on the table most of the time, not being used: it feels wasteful. And I thought of the idea of the Open Device Lab and I thought yeah, they had all that paperwork to do and all the insurance stuff I guess we’d have to cover and I thought, you know what? Screw it. So I wrote a blogpost and I tweeted out, I said, “Hey, anybody in Brighton who wants to come and use these devices: help yourself.” I didn’t worry about insurance, I didn’t worry about liability, any of that, I just did it. And what I didn’t expect was that straight away, other designers and developers in Brighton responded with tweets like, “Oh, I’ve got this phone lying on my desk and I feel really bad about it being there. Do you want to take it?” or “I’ve got a phone in a drawer that’s just sitting there gathering dust, can I bring it by and drop it off?” and I was like, “Hell, yeah!” So within twenty-four hours, the number of devices had doubled. And since then, we’ve got forty or fifty devices there and hardly any of them are actually mine. They’ve all been contributed by other people. And this idea of an Open Device Lab has taken off, so if you go to opendevicelab.com you can find out if there’s an Open Device Lab near you where you can come by and test on devices. And if there isn’t, you can get advice on starting one, which I highly recommend you do.

This is really handy; these days you’re trying to test your work on so many different devices, and I will point out: this is testing, not optimising. It’s not like I’m trying to optimise for every possible device; that doesn’t scale. But testing on every device does.

It was clear, re-launching websites today as opposed to ten years ago, I’m going to go mobile first. I’m sure you’ve all…who’s read the book by Luke, Mobile First? Great book. Excellent. And of course, one of the things he talks about is the fact that by going mobile first, you have to prioritise; you have to be pretty ruthless about figuring out what’s the most important thing on this page. And so for me, the mobile first, when you follow it to its conclusion, it’s content first; it’s figuring out what’s most important. And when I say content, I don’t necessarily mean copy or images. Just as Luke was saying, this is about tasks. The content could be adding something to a shopping cart; the content could be very much actions as opposed to consuming it.

If you really want to take this content-first approach to its ultimate limit, something I like to do, if I ever get the chance to do this on products, is to start with the URLs; really bring it down to the most basic webiness of what you’re building: what is the URL structure? It’s something I think people don’t think about enough. And yet, URLs are so, so important. Some people treat them like an implementation detail of the web, like, “oh yeah, we’ve got native, we’ve got the web, web has URLs, whatever,” whereas I think it’s the most powerful part of the web. In fact, once you have the name of something and once you have the address that you can pull up on any device, as long as it’s connected to a network, that is amazing. That immediately makes it part of this huge, big, chaotic mess of a web. It was Tim Berners-Lee who said, when you have a URL, it’s part of the web, it’s part of the discourse of humanity, this giant Library of Alexandria that we’re all collectively building.

URL design as a skill is something I feel we’re losing, which is a real shame because I will admit, I’m a URL fetishist. I love a good URL. But I think, rightly so, because they are this fundamental unit of the web. Kyle Neath who works at GitHub—where they have beautiful URLs—he said:

URLs are universal. They work in Firefox, Chrome, Safari, Internet Explorer, cURL, wget, your iPhone, Android and even written down on sticky notes, they are the one universal syntax of the web.

That’s so important to remember: written down on sticky notes, written on a Post-it. They’re for humans. URLs are for humans. Yes, they’re used by machines to fetch a resource, but they’re very much for humans to use. URLs should be hackable, guessable, readable.

On The Session I’ve got this kind of structure with my URLs, it’s sort of RESTful, you can drill down and this is repeated throughout the site, this URL structure, and in a way this is kind of almost like an API for the content; the content just happens to be in an HTML format. In fact, there is a read-only API for The Session and rather than have it as a separate sub-domain or a completely different URL, the API is the same URLs, just adding on a query string to say, I’ll take this in RSS or I’ll take this in JSON.
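So the same resource can be asked for in different formats at the same address; something along these lines, where the path and the exact parameter name are illustrative rather than copied from the real API documentation:

```
https://thesession.org/tunes/popular               (HTML, for humans)
https://thesession.org/tunes/popular?format=json   (same resource as JSON)
https://thesession.org/tunes/popular?format=rss    (same resource as RSS)
```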

Thinking about your content that way, almost like an API first approach is really good for stopping yourself thinking too much about the appearance, thinking about how things are going to look, where things are positioned. This idea of content first, literally the content devoid of where it’s going to appear. Whether it’s even going to be showing up on a visual medium at all.

Something else I do to drive home the content-first approach is try and break things down into their fundamental units. Instead of thinking about layout and how things fit together, which is important, but I feel like that needs to come later, I try to break things down into the building blocks. I do this at work, but I decided to also do this on a personal project like The Session.

Here I’ve broken things down into, we’ve got buttons, we’ve got feedback messages, form fields, headings, all this kind of stuff, breaking it down into individual units, and I have this one document where you’ve got, this is what the unit looks like and here’s the mark-up that’s generating that pattern. And I’ve thrown this up on GitHub. I call it a pattern primer; just a little PHP script that looks in a folder full of little HTML snippets. It’s been ported to Ruby and Python and other languages too, so go ahead and take it, use it, whatever you like.

I just find this so useful to think in terms of the building blocks rather than thinking of the whole picture to start with because then you tend to have more robust units, and it forces you and your CSS as well to not rely on context when you don’t know whether a building block is going to appear in the main column or a sidebar or what that even means these days.

There’s a corollary to the content-first approach which is, navigation second. If you’re going to have your content first: navigation second. And again, this is an idea that I first saw from Luke; his previous start-up, Bagcheck, he had this content first, navigation second approach and I’ve shamelessly ripped it off for The Session.

On The Session you can see there’s this trigger at the top and that brings up the navigation and the way it’s working is, that trigger, that little downward arrow, is a hyperlink; that’s all it is. It’s a hyperlink pointing to a fragment identifier that’s at the bottom of the page, which is the navigation. And the back-to-top link is just another hyperlink. And the great thing about this pattern is that this will work everywhere. And I mean, everywhere: any browser connected to the internet understands hyperlinks, so it’s a very robust pattern. And there’s all sorts of other patterns you can use, off-canvas and overlays and progressive disclosure …and the nice thing about this is, you could start with this as your baseline, as your default, and enhance up to using any of those other patterns, but if anything ever goes wrong, you’ve got this great fallback which is this content-first, navigation-second approach I really like.
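Stripped to its essentials, the pattern might look something like this (the ids and links are shortened for illustration):

```
<!-- Top of the page: the "trigger" is just a hyperlink
     pointing at a fragment identifier further down. -->
<a href="#navigation" id="top">menu</a>

…page content…

<!-- Bottom of the page: the navigation itself, plus a
     back-to-top link, which is just another hyperlink. -->
<nav id="navigation">
  <a href="/tunes">Tunes</a>
  <a href="/sessions">Sessions</a>
  <a href="#top">back to top</a>
</nav>
```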

And of course, once we get more screen real estate to play with, I can start putting the navigation further up; I can put it back up to the top of the page, using CSS we’ve got absolute positioning, we’ve got display: table, we’ve got flexbox, we’ve got all sorts of ways that we can now move things around, regardless of their source order. CSS has gotten really good at that.

You’ll notice one of the other things that changed as I got more real estate: it wasn’t just that the navigation was changing but the logo as well; the logo starts being this kind of strip and then becomes more like this tag off to the side. I’m not swapping out images there because the logo is actually just text; it’s just CSS and markup, which is obviously nice and lightweight and CSS is great and of course, not every browser’s going to get these styles. You know what? I’m fine with that. That’s OK, it’s an enhancement.

Something else I wanted to make sure I really got right for The Session, because as I said, everything on The Session these days is contributed by somebody else: it’s not me putting the content in; it’s somebody else. I wanted to make sure that that input was as good as I could get it. And it could be better; I think there’s always room for improvement, but I’d learned a few things from previous work I’d done, and I decided I’d try and apply them.

There’s this site Huffduffer that Jeffrey mentioned that I built a few years back, and on the Huffduffer log-in, I provided a little toggle, you can see there, to show your password. So it’s input type="password", but if you click the checkbox then you can reveal the password so you can see what you’re typing. Just a little usability enhancement, the same way that works in operating systems when you’re connecting to wifi and stuff like that. Nice little pattern.

That was a few years ago, and for The Session, I decided I was actually going to reverse it. I decided I would make the password field input type="text" by default, and if you want to hide it, then you have to tick the box. Now I knew this was going to be interesting from a psychological perspective, how people would react to this. But honestly, I believed the whole password thing with the dots where you’re obscuring what’s written, it’s security theatre. I don’t think it’s helping anyone. When you’re typing in…people looking over your shoulder…and also, if you are worried about that, I’ve got the toggle. So I put this out there, and I did receive emails from people saying, “Oh, you’ve got the password in input type="text": you shouldn’t do that” and I responded, genuinely curious, “Why not?” I really wanted to know. Please tell me why not. And nobody could tell me why not. All they could say was, no other site does that. Yeah, I know that, but why not? Why shouldn’t I do this? And they brought up situations like, “Oh, somebody could be looking over my shoulder”, I’m like, “Yes, that’s why I’ve provided a toggle.” So there was a little bit of pushback, but on the whole not that much and let’s face it, this makes it a heck of a lot easier to input something on a mobile device, which is very much something where you’re not going to have people looking over your shoulder and heck, if they are, there’s a toggle.
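A minimal sketch of that reversed pattern (the IDs and wording are made up; note that some very old browsers wouldn't let you change an input's type dynamically):

```html
<!-- Plain text by default; tick the box to obscure it -->
<label for="password">Password</label>
<input type="text" id="password" name="password">
<label>
  <input type="checkbox" id="hide"> hide password
</label>
<script>
  document.getElementById('hide').addEventListener('change', function () {
    document.getElementById('password').type =
      this.checked ? 'password' : 'text';
  });
</script>
```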

I think this is going to take a while to percolate, but I think we will drop the security theatre of obscuring inputs, which gives this illusion of security that isn’t there. It will be interesting to hear your thoughts on that, whether I’ve pushed it too far, but I think the time is right. The world is ready for this!

As well as input type="text" and password and all that stuff, I’m going crazy with all these great new input types we’ve got now with HTML5. These are wonderful. The great thing about all of these is that if you use them, you might get rewarded in the browsers that understand them, but if they’re read by a browser that doesn’t, that’s absolutely fine: it will just treat them as input type="text", because of the way HTML works. If you type input type="bar", a browser will see that and go, “I don’t understand that; I’m defaulting to input type="text"”, which is great. It means we can start using the new shiny stuff, safe in the knowledge that it’s not going to break in any browser. So I can use input type="number": it’s going to be just like input type="text" for older browsers, but newer browsers that understand it will use it, like on an iOS device, where you get this keypad with the numbers already displayed, along with the symbols and stuff.
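The fallback behaviour means markup like this is safe everywhere (the field names are illustrative):

```html
<!-- Browsers that understand these types reward you with better
     keyboards and widgets; everyone else just sees a text input -->
<input type="number" name="members">
<input type="email"  name="email">
<input type="url"    name="website">
<input type="date"   name="eventdate">
<input type="bar"    name="nonsense"> <!-- unknown type: treated as text -->
```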

It’s not quite a numeric keypad because input type="number" can still accept commas, it can accept decimal points and stuff. If you want a numeric keypad, there is an attribute called “inputmode”, and you can specify that an input should be in numeric input mode. This isn’t supported in any browsers yet; it’s in HTML5, but it’s not supported yet. It’s on the way, though, so you can just start adding this stuff now because, again, the way browsers work is: if they see something they don’t understand, like an attribute like this, they just ignore it. It’s like you’re putting something in there for future browsers.

Also, we do have a little hack we can use today, if you want to get the same effect, which is the pattern attribute. Using the pattern attribute, you provide a regular expression (now you’ve got two problems) which specifies what you would expect from the input. In this case I’m saying I only want numbers here. Be careful with this: think about whether your input truly is just numbers, because a lot of things that look like numbers actually take commas, decimal points or spaces. Or minus signs, things like that. But if you truly only need the digits zero to nine, then go ahead and use this pattern. Pattern zero to nine, and you’ll get that numeric keypad, because iOS is smart enough to realise it doesn’t need to use one of the other keyboards, so that’s handy.
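Side by side, the hack and the declarative version look like this (a sketch; the field name is made up):

```html
<!-- The pattern hack that works today: iOS sees digits-only input
     and shows the numeric keypad -->
<input type="text" name="year" pattern="[0-9]*">

<!-- The declarative future: browsers that don't understand the
     attribute simply ignore it -->
<input type="text" name="year" inputmode="numeric">
```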

One of the other new things in HTML5 that I’ve decided to use on The Session is the datalist element, which I really like. I don’t see that many people using it yet, but I really, really like it. This is a way of turning a regular input into a combo-box, which is like a cross between a text input and a select. Somebody can type in the text, but if they start typing something that’s in a list of options, then that will come up like an auto-complete suggestion. So, you’ve got a regular input, and you’ve got this list attribute that points to an ID. The ID is on the datalist element, and then you’ve got a bunch of options, and it’s the value in the options that counts. As the user is typing, what they get is the ability to choose from the selection or, if what they’re typing isn’t in the selection, they can just keep on typing, OK? This is perfect for those situations where you’ve got a select list like, “How did you hear about us?”, where the last option is “Other”, and then the next form field is, “If other, please specify”, with a text field. This is “If other, please specify” built in. You’ve given them a choice, but you’ve also given them the option to type whatever they want. A really, really handy little pattern using the datalist element.
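In markup, the pattern looks something like this (the question and options are illustrative):

```html
<label for="source">How did you hear about us?</label>
<!-- The list attribute points at the datalist's ID -->
<input type="text" id="source" name="source" list="sources">
<datalist id="sources">
  <option value="Search engine">
  <option value="A friend told me">
  <option value="Twitter">
</datalist>
```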

Another attribute we can use these days is placeholder. Again, something like datalist we could do before, but it required JavaScript, probably a jQuery plug-in, that’s more stuff to download. Same with this placeholder pattern, where when there’s nothing in the form field, you want some kind of greyed-out text that shows an example of the input that is expected, we could do this using JavaScript. Now we can do it in HTML; we just say, here’s the placeholder text.

And I noticed that, for anything using the datalist element, because the options in the datalist are effectively possible inputs, and what you’re supposed to put into the placeholder is a possible input, then for any input that has a datalist associated, you could programmatically grab any one of those options and use it as the placeholder. So that’s what I’ve done there: it just randomly grabs one of those suggestions and puts it in as the placeholder text for the input, and I’ve put that JavaScript up online as well, so feel free to grab that and drop it into your site and it’ll just work straight away.
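This isn't the script from the talk, just a sketch of the same idea: pick one of the datalist's option values at random and use it as the placeholder.

```javascript
// Pick a random suggestion to use as placeholder text
function randomPlaceholder(values) {
  return values[Math.floor(Math.random() * values.length)];
}

// In a browser, you'd wire it up something like this:
// document.querySelectorAll('input[list]').forEach(function (input) {
//   var datalist = document.getElementById(input.getAttribute('list'));
//   var values = Array.prototype.map.call(datalist.options, function (o) {
//     return o.value;
//   });
//   input.placeholder = randomPlaceholder(values);
// });

console.log(randomPlaceholder(['jigs', 'reels', 'hornpipes']));
```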

It won’t work in every browser; not every browser supports datalist; not every browser supports placeholder. Do you know what? That’s absolutely fine, because the core content here, the task, is inputting something and hitting that button and going to the next screen. And none of this affects that, other than to make it a bit better, to enhance it. And this is the approach I’ve been taking with The Session, it’s the approach I’ve been taking online for as long as I’ve been making websites, this idea of progressive enhancement.

There are some myths about progressive enhancement: I think there’s this idea that progressive enhancement means designing for the lowest common denominator, which isn’t true. Progressive enhancement is about starting from the lowest common denominator and then building up, but there’s no limit to where you can go.

The idea is, progressive enhancement in a nutshell is, you’ve got these layers of technology and you begin with your HTML and you structure your HTML well and it works as a standalone thing; the hyperlinks are proper hyperlinks, the forms work as forms. Then you can add your CSS, you’re enhancing using styles. And then you can add your JavaScript, add in all sorts of whizzy goodness: that’s great. But, if the JavaScript fails or the CSS fails, that’s fine, you’ve still got well-structured content, well-structured markup.

But progressive enhancement goes further than that because at each level, at each part of the stack, HTML, CSS, JavaScript, you can use progressive enhancement. placeholder attribute, datalist element, new input types: I can use those in HTML today as an enhancement, and that’s great. That allows us to evolve the language. And the reason I can do that, as I said, is that browsers won’t choke on that stuff; older browsers won’t throw an error. If a browser sees an element it doesn’t recognise or an attribute it doesn’t recognise, it doesn’t halt parsing of the page and throw an error and say the user can’t read the page, it just goes, “Nah, don’t understand it, going to carry on.” That’s actually really, really powerful, that kind of error handling.

It’s the same with CSS; if a parser sees a selector it doesn’t understand or a value or a property, it just moves onto the next one. That’s enormously powerful, because that allows us to keep on expanding CSS. This is how we can keep adding stuff to CSS; you can start using the new stuff today, even when it’s only in one or two browsers, because you’re safe in the knowledge that no browser’s going to choke on it; it’s not going to break any other browser.

Now, it’s a bit different with JavaScript because with JavaScript if you use a property or a method that the browser doesn’t understand, it will throw an error; it has a different error handling model, so you have to be a bit more careful in JavaScript and always test: “do you understand this property?” like geolocation or something. If so, do this stuff. But once you do that, you can apply progressive enhancement at the JavaScript level too, but that error handling of JavaScript can be tricky.
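The test-first pattern boils down to something like this (the helper name is made up; in a browser you'd pass in `navigator`):

```javascript
// Only use an API after checking that it exists; the check itself
// can't throw in environments that lack the feature
function supports(host, feature) {
  return Boolean(host) && feature in host;
}

// In a browser:
// if (supports(navigator, 'geolocation')) {
//   navigator.geolocation.getCurrentPosition(showNearbySessions);
// }

console.log(supports({ geolocation: {} }, 'geolocation')); // true
console.log(supports({}, 'geolocation'));                  // false
```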

So, progressive enhancement, the idea is that it’s about being robust; it’s about catering for the situations you can’t imagine where something goes wrong that you haven’t foreseen.

One of the descriptions that’s always given for progressive enhancement; it’s like escalators: they can never break, they can just become stairs. Although in this case, clearly they didn’t get the memo because I can’t climb these now-stairs. But it’s kind of a good way of thinking about how we should use JavaScript. It’s like electricity: it should be used to enhance things, so an escalator is an electric stairs or a moving walkway is just a floor. How can a floor be out of service? I don’t know. A moving walkway is just a progressively enhanced floor. Or here we are: an electric toothbrush is just a progressively enhanced toothbrush; it doesn’t break, it just goes back to being an old-fashioned toothbrush.

I think with JavaScript, that’s definitely the way we should approach it. Enhancements. There’s no limit to what we can do with JavaScript but I think from a philosophical point of view, we need to treat it as a technology that’s used to enhance what’s already there which is in the markup, which is where the content is.

And yet, I see sites that use JavaScript for literally everything. For the markup, for the content, for the core tasks; they rely on JavaScript, which I think is dangerous. This is Squarespace if you have JavaScript turned off, go to squarespace.com without JavaScript, this is what you get: the page has finished loading at this point. This is my Facebook screen without JavaScript. A nice big white blank expanse; although they have a bug report for this saying, what do I do when my homepage is blank? They’ve got a fix for that: it’s turn on JavaScript!

But this actually isn’t the point. It isn’t about people turning off JavaScript or browsers that don’t support JavaScript. Frankly, it’s actually getting harder and harder to switch off JavaScript in most browsers; that’s not what progressive enhancement is about. It’s about how you use the JavaScript you’re using. It’s about not relying on the JavaScript. And you might think, well, I’ve looked at my audience and I know the kind of devices they’re using; all these devices are JavaScript-capable. Yeah, but stuff happens that you can’t predict, and progressive enhancement allows you to be ready for the unpredictable.

As Scott Jehl said:

Every user is a non-JavaScript user while a page is still loading.

You want to make sure your page works; maybe it’s not a great experience without JavaScript; that’s fine, but as long as it works and then you use JavaScript to make it a great experience.

This is the page for downloading Chrome from Google. I took this screenshot I think it was last year. For two hours, nobody could download Chrome, it didn’t work, because there was an error in the JavaScript in a big JavaScript file and because of that error, that button didn’t work because that button, it is a link but that’s the href value of the link: instead of being a link to an actual file or a page or something like a proper link, it’s this JavaScript pseudo-protocol, so effectively it’s not really a link at all. Because there was one error, completely unrelated, somewhere in the JavaScript file, this link did not work as a link and for two hours, nobody in the world could download Google Chrome. I’m pretty sure heads probably rolled for that one.
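The difference between the fragile version and a robust one looks something like this (the URLs are illustrative, not Google's actual markup):

```html
<!-- Fragile: the href isn't really a link at all, so one unrelated
     JavaScript error leaves the button doing nothing -->
<a href="javascript:void(0)">Download Chrome</a>

<!-- Robust: a real destination as the baseline, which JavaScript
     can then enhance -->
<a href="/chrome/download" id="download-link">Download Chrome</a>
```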

My friend Andy Hume said:

Progressive enhancement is more about dealing with technology failing than dealing with technology not being supported.

And I think that’s very true actually; like I said, it’s preparing for the unexpected.

I see this kind of stuff all the time, this assumption that JavaScript will be available; this reliance on JavaScript which, like I said, just from a purely engineering perspective doesn’t make sense because of the error handling in HTML and CSS. If you make a mistake in your HTML, the browser’s going to be very forgiving. If you make a mistake in your CSS, the browser’s going to be very forgiving. They just ignore stuff they don’t understand. If you make a mistake in your JavaScript, the browser is not going to be forgiving; it will throw an error and stop executing the script at that point. So just from an engineering perspective, if you want a robust page, it makes sense not to rely on JavaScript.

Don’t get me wrong: I’m not saying, don’t use JavaScript. I love JavaScript. It’s just how you use it, how you deploy it, as an enhancement, not this or this on Flickr, this JavaScript pseudo-protocol. Or this on Foursquare. It’s not even a link or a button: it’s a span where they’ve got styles to make it look like a link, and they’ve probably got JavaScript to make it act exactly like a link. Just use a link! Or a button. This stuff makes me angry, it really does!

On The Session, I wanted to make sure I was using progressive enhancement, and probably the most complex part of The Session now is converting the ABC to sheet music. As I said, I used to be the bottleneck there; I used to have to do this by hand, I wanted this to be automated. By default you’ve got your ABC file there and there’s a button to turn it into sheet music and that is a proper button inside a form, and by default it just generates something on the server side; it generates a gif. This is actually on a third party server; I should really bring it in-house and do it on The Session itself but by default, without JavaScript, without any technology, this still works, and it will generate an image file like that.

But if you have JavaScript available, then what I did is I capture that click on the button and instead of going off to another page, we stay on the page we’re on, we use a little bit of Ajax and we bring in the sheet music into the page, so this is an enhancement. Everyone will be able to look at sheet music; it’s just the experience is much nicer if you’ve got JavaScript and if you support the technologies.
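A hedged sketch of that enhancement (the URL, IDs and the use of fetch are my assumptions, not The Session's actual code): the form round-trips to the server by default, and JavaScript intercepts the submission to pull the notation into the current page.

```html
<form action="/abc-to-notation" method="post" id="abc-form">
  <textarea name="abc"></textarea>
  <button type="submit">Show me the sheet music</button>
</form>
<div id="notation"></div>
<script>
  document.getElementById('abc-form').addEventListener('submit', function (e) {
    e.preventDefault(); // stay on this page…
    fetch(this.action, { method: 'POST', body: new FormData(this) })
      .then(function (response) { return response.text(); })
      .then(function (markup) {
        // …and bring the sheet music into it
        document.getElementById('notation').innerHTML = markup;
      });
  });
</script>
```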

One of the ways that this is much nicer, I think, is that this sheet music that’s generated here isn’t a gif: this is SVG so it’s nice and crisp because some genius out there has written an ABC to SVG converter in JavaScript, which is just insane. Atwood’s law states that anything that can be written in JavaScript will be written in JavaScript! I think that’s very true. Somebody has ported UNIX to JavaScript; it’s getting pretty crazy.

But SVG I love. I love SVG; it’s so crisp; you can bump up the font size and it still stays crisp. Look at it on any device: it’s automatically responsive; it just fits the container. I’m using SVG somewhere else as well, actually: on the member profiles I have these sparklines.

I love sparklines. Sparklines, it’s a term coined by Edward Tufte and he describes them as:

A small, intense, simple, word-sized graphic with typographic resolution.

Again, it’s that kind of enhancement: the information is already there in text, and you just add this nice little visual flourish with a sparkline.

I’d used sparklines before on Huffduffer, where I was using the Google Charts API, because you could use that API like an image: src equals this Google Charts URL, and you pass it all the parameters and it’ll generate a chart for you; it could generate a sparkline for you. It was a really, really useful service, a really good API. So, Google’s shutting it down. Because that’s what Google do.

When you’ve been making websites long enough, you learn to trust no one. Especially when it comes to APIs being available. No, no, no. Over a long enough timescale, all APIs disappear.

So I did not want to rely on any third party service for this, so OK, I’ll figure out how to make sparklines myself. I thought somebody must have done it: I’m looking on GitHub, I’m looking on Stack Overflow. Well, on Stack Overflow, of course, all the answers are, “use this jQuery plug-in.” Not knocking jQuery, but jQuery is kind of like the spam of Stack Overflow: if you’re trying to solve any problem and you don’t want to use jQuery, good luck!

I had to write it myself, so I wrote this little script that generates sparklines, it generates a little canvas element with a sparkline and I put it up on GitHub and I wrote a blogpost saying, here’s this thing I wrote to make sparklines using canvas.

But I finished the blogpost by saying, this doesn’t feel right; I don’t think canvas is the right element here because it’s not a dynamic image, it’s not going to move or anything; it’s just staying still and actually, SVG would be better, so if someone wanted to make an SVG version, that would be great. I swear to God, two hours later, somebody had converted it to SVG. I love the web! I love the fact that this kind of sharing and evolution…I love that kind of stuff. And with GitHub now, especially, it makes sharing so much easier and people should improve other people’s code: I love it.

Anyway, somebody else had basically made a service out of this where you can just pass in the numbers and get back an SVG. So now I’ve got sparklines nice and crisp using SVG and this is the crazy thing: every sparkline, no matter how different it looks, is the same SVG file, and I just pass in different numbers.
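To give a flavour of how little it takes, here's a from-scratch sketch (not the actual script or service from the talk) that turns a list of numbers into a sparkline as an SVG string:

```javascript
// Scale the numbers to fit the box and join them into a polyline;
// remember the y axis is flipped in SVG, so 0 is the top
function sparkline(numbers, width, height) {
  var max = Math.max.apply(null, numbers);
  var step = width / (numbers.length - 1);
  var points = numbers.map(function (n, i) {
    return (i * step) + ',' + (height - (n / max) * height);
  }).join(' ');
  return '<svg xmlns="http://www.w3.org/2000/svg"' +
         ' viewBox="0 0 ' + width + ' ' + height + '">' +
         '<polyline fill="none" stroke="currentColor"' +
         ' points="' + points + '"/></svg>';
}

console.log(sparkline([0, 2, 1, 4, 3], 100, 20));
```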

Now wait a minute: SVG, that’s an image format, right? No, not quite. SVG is simultaneously an image, with the output that you see, and text, because it’s a markup document. It’s a form of XML. You can view source on an SVG. And because it’s a markup document, that means you can put scripts inside the SVG file, which is what’s happening here. It’s looking at the query string and figuring out what to draw; it’s like the call is coming from inside the SVG file! You can put style declarations inside SVG files too. So, you know I showed earlier how I made my logo look different at different sizes? I was using media queries there, but you could put media queries inside a style sheet inside an SVG file to make an automatically responsive logo, or any other image that will respond to its containing element. SVG is awesome! I love it!

Initially when I was adding these SVG sparklines on the member profiles, it was computationally kind of expensive, so it was holding up the loading of the page, so I didn’t download them initially; I waited ‘til the page was loaded and then I fired off a new request using Ajax to grab those sparklines and put them into the page because I wanted to get better performance. This idea of having your core content loaded and then you do some loading afterwards, for the enhancements for the nice to have stuff, it’s this idea of conditional loading, where maybe you load in the stuff, maybe you won’t, by doing a test in JavaScript. Conditional loading I feel is something really, really important, especially for responsive design and it doesn’t get talked about enough, in my opinion.

A lot of people kind of pooh-pooh responsive design because they make the mistake of thinking that you’re serving up exactly the same thing to every browser, but using conditional loading, what you can do is serve up your core content to every browser, your basic core content, and then after the DOM is ready, then you can do some testing and say okay, if the screen is wide enough or some other parameters, then I want to load in this other content as well. Not core content, but nice to have content. Like on the front page of The Session, this is just the content, it’s loaded in, it’s grabbed from the server but down at the bottom you’d see a bunch of links off to Twitter or Facebook, Flickr, because there’s a Flickr pool of photos of Irish music and stuff. So, after the page has already loaded, then I use some JavaScript and I say, you know what, if the screen is wider than a certain width, I’ve got enough room, let’s pull in some of those photos—this is conditional loading—and then just display them in-line. But it’s not going to hold up the displaying of the page. This will happen after page load, the user can already carry on with what they’re doing.
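The decision itself is just a test that runs after the core content has loaded (the function name, threshold and URL here are made up):

```javascript
// Core content has already loaded; this only decides whether the
// nice-to-have extras are worth fetching
function shouldLoadExtras(viewportWidth, threshold) {
  return viewportWidth >= threshold;
}

// In a browser, after page load:
// window.addEventListener('load', function () {
//   if (shouldLoadExtras(document.documentElement.clientWidth, 640)) {
//     fetch('/latest-photos').then(/* …insert the photos in-line */);
//   }
// });

console.log(shouldLoadExtras(1024, 640)); // true
console.log(shouldLoadExtras(320, 640));  // false
```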

In fact, this is kind of the perfect place to use conditional loading because it’s third party content. I’m not now relying on Flickr’s servers to be up all the time to render my page. If something goes wrong with the third party service, that’s okay. So a great place to use conditional loading is if you have those buttons somewhere, like saying “Like” or “+1” or “Tweet this” or “Follow me”; all that kind of stuff. First of all, it makes you look really desperate—just sayin’—but secondly, you are relying on a third party service for the rendering of your document. The way that most of those little widgets work is they say, “Oh, just insert this script element into the middle of your page.” A script element, which will block rendering until the source has been retrieved. So you might think, that’s okay, Twitter’s servers are always going to be up, right? And then someone tries to look at your page in China and your page never finishes loading because that script element never finishes loading because Twitter is blocked in China.

That’s just one example, but again, like I said, you’ve been in this business long enough to get paranoid; you don’t want to rely on a third party service for anything. Conditional loading allows us to have the best of both worlds; I’m going to get the value from these third party services, but I’m not going to be reliant on them. Conditional loading: really handy for that.

To speed things up a little when you know you’re going to be making a request out to some other server, in the head you can do a DNS pre-fetch. So this is a rel value in the link element, DNS pre-fetch and then you point to the third party service you’re going to be using and this is just a hint to the browser that you’re going to be getting something from this domain, so you might want to do the DNS look-up for that domain when you’ve got a chance, when the browser’s ready and has a moment.

There’s a whole bunch of these kind of rel values that help you squeeze a little bit more performance from the browser. You can pre-fetch, if you’re pretty sure that the user is going to go to a specific URL next, you’re pretty confident about that, you can tell the browser that it might want to pre-fetch that page in the background, when it’s ready, when it’s got time. It’s not a command to the browser, it’s a suggestion to the browser; it’s still left to the discretion of the browser. And if you’re very confident that the user is going to visit a particular page next, you can even use pre-render, though I would say be very careful with this one. Be careful because that’s a big assumption to be making, but could result in very, very snappy loading in the browsers that support this.
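All of these hints sit in the head as link elements (the URLs are illustrative):

```html
<!-- Suggestions, not commands: the browser acts on them at its discretion -->
<link rel="dns-prefetch" href="//farm8.staticflickr.com">
<link rel="prefetch" href="/tunes/popular">
<link rel="prerender" href="/tunes/popular"> <!-- use with care -->
```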

I find myself trying to squeeze every bit of performance out of the browsers; the performance is so, so important. It always was: performance always was important, but somehow we got lazy there for a while, like I remember back in my day …in the nineties, we were making the smallest image files we possibly could, we only had 216 colours to work with. Tell that to the young people today: they wouldn’t believe you. And we would keep our page sizes really low as well, right? These days we’ve got page sizes in the megabytes. What happened to us? Some time in the last ten years, a memo went around saying, “Everybody’s on broadband now; everybody’s got great big monitors now, we do not have to worry about optimising anything, it’s all taken care of: good job.” I missed that memo. We all got really lazy and thank heaven for responsive design and the rise of the mobile and all these other devices because now, “Oh no, I have to optimise everything and make it really snappy.” And you think, hell yeah, you should’ve been doing that anyway. That was always a skill of being a web designer and somehow we lost that skill and now we’re getting it back. Now, performance is so, so important, you can’t ignore it. It’s the single most important part of the user’s experience. If you are a UX designer and you’re not thinking about performance, you’re not doing your job.

I like the nerdiness of trying to squeeze every bit of performance out. I was really disappointed, actually, that Google Page Speed has taken the score out of the reports it gives you when you hit the page speed thing; I used to love trying to get that number as high as I could. It’s gamification, I know, but I used to love trying to get the highest score I possibly could. Now they don’t give you a score any more, which is a shame. But still: performance. I can’t repeat it enough.

And this is the interesting thing: with the timescales I’m concerned with here, with this website which has been online for over a decade and is going to be online for much longer than that, I hope, I’ve got these two different timescales I’m looking at. One is decades, maybe even centuries, and at the other end, nanoseconds, microseconds, the really, really short: how fast does this load? How quickly can the user accomplish their task? These two very different timescales; they’re what interest me. And all the stuff in the middle I kind of don’t care about as much: days, weeks, months. And yet, in our work, those are the timescales we tend to concentrate on: when is this shipping? It’s shipping next month. When is the deadline? Get those files to me by Friday. These are the timescales we’re thinking of. This website needs to be on by this particular date this year. We’re thinking in pretty short timescales, and actually, if you want to think long term, if you really want to be prepared for what’s coming (and we don’t know what’s coming; we don’t know what kind of devices will be out there), then thinking in terms of the past can actually be your best bet.

It’s this idea of being future-friendly, trying to somehow prepare for the future. You can’t be future-proof: nobody can predict the future, but there will be more variation in devices, there will be more variation in the ways that people will be accessing your content, using your content. You can’t predict that, but you can prepare for that, and the best way to be future-friendly is to be backwards compatible. Stick to the robust technologies, starting with well-formed markup, adding your CSS as an enhancement. Remember that, every time you write a style declaration, you are not telling the browser what to do: you are suggesting something to the browser; it’s important to remember that. And don’t rely on JavaScript, or any other technology; don’t put all your eggs in one basket, particularly a basket with error handling as fragile as JavaScript’s.

Progressive enhancement is a way to be future friendly, in my opinion. And there are people thinking about the long term view here; like The Long Now Foundation. Who’s familiar with The Long Now Foundation? Okay, one or two people. Any members of The Long Now Foundation here? Not today. It’s kind of an obscure thing, but they concentrate on long-term timescales; they have projects that are thinking in the hundreds and thousands of years. Probably the most famous one is the Clock of the Long Now. This is a clock that will tell time for ten thousand years. It’s a scale-free clock, it’s being built inside a disused mine in Nevada. It’s being built; this isn’t some theoretical thing, they’re actually building something that’s going to tell time for ten thousand years, which I really like because it gets you thinking about those kind of timescales. You’re thinking about the engineering problems, you’re thinking about long-term problems.

Looking at the web, all these different formats, like I said at the start, that I’m using on this one particular website: they’re text formats, so I think that increases their longevity, because binary formats tend to last for shorter periods of time. Image formats, video formats, which concerns me a lot; we’re putting so much of our collective culture up online in binary formats. Whether the JPEG format will still be supported in a few decades’ time is not clear, and when it comes to video, yeah, that’s even more of a mess. But text formats, because they’re human-readable, probably stand a better chance of surviving. It’s slightly better.

But the most important thing is to just be thinking about this stuff. Next time somebody says to you, “The internet never forgets”, just call bullshit on that. It’s absolute bollocks! Look at the data. The internet forgets all the time. The average lifespan of a web page is months, and yet people are like, “Oh, you’ve got to be careful what you put online, it’ll be there forever: Facebook never forgets, Google never forgets.” No, I would not entrust our collective culture, our society’s memory to some third party servers we don’t even know. Certainly not to the Cloud, whatever that means, the Cloud. What a bullshit term! I mean, it’s just…(applause) it’s just another word for somebody else’s server. Next time somebody talks about the Cloud, just substitute Somebody Else’s Server. It’s on a hard-drive somewhere. What I do is I mentally substitute the word “Moon” when someone says “Cloud” and it makes just as much sense but it’s way more entertaining!

But like I said, just thinking about it, thinking about how long stuff is going to be online, thinking about what formats you’re going to use. These formats, I don’t know whether they’ll last. A lot of these formats will probably disappear, although this one I have high hopes for: HTML, because they are thinking about the long-term picture when it comes to HTML. You know, when Tim Berners-Lee was first creating the web and URLs and HTTP and HTML, Håkon Wium Lie, the co-founder of Opera Software and one of the creators of CSS, placed a bet that HTML would be around in fifty years. Fifty years! That’s a ridiculously long timescale for a computer format. If you know anything about the history of computing, you’ll know that formats die off all the time. Fifty years: a crazy timescale. Now, I think that bet actually looks pretty safe, and this is not by accident. HTML is in it for the long term, and it’s got to be backwards compatible. Future-friendly and backwards compatible, by design.

I found an old email from Ian Hickson to a mailing list, years ago; he was talking about why he got into HTML, he’s the editor of the HTML spec at the WHATWG, and he said:

The original reason I got involved in this work is that I realised that the human race has written literally billions of electronic documents but without ever actually saying how they should be processed. I decided that, for the sake of our future generations, we should document exactly how to process today’s documents so that when they look back, they can still re-implement HTML browsers and get our data back.

That is thinking about our culture, about our society, about our preserving what we’re putting online, and that’s kind of all I ask of you, is to think about The Long Web, to think about the long term consequences of what we’re doing because I don’t think we do it enough.

It isn’t just about what we’re doing today. We are building something greater than the Library of Alexandria could ever have been and that is an awesome—in the true sense of the word—an awesome responsibility.

You’re going to be hearing about more technologies today; you’ve heard about technologies yesterday, techniques, processes. And as you’re evaluating all of the things you’re learning over these two days, I want you to just think also about the longevity, the consequences and the long term effects of what we’re building and think about The Long web.

Thank you.


This presentation is licenced under a Creative Commons attribution licence. You are free to:

Copy, distribute and transmit this presentation.
Adapt the presentation.

Under the following conditions:

You must attribute the presentation to Jeremy Keith.

Tuesday, March 31st, 2015

100 words 009

Last year at An Event Apart in Seattle I was giving a talk about long-term thinking on the web, using The Session as a case study. As a cheap gimmick, I played a tune on my mandolin during the talk.

Chris Coyier was also speaking. He plays mandolin too. Barry—one of the conference attendees—also plays mandolin. So we sat outside, passing my mandolin around.

Barry is back this year and he brought his mandolin with him. I showed him an Irish jig. He showed me a bluegrass tune. Together we played a reel that crossed the Atlantic ocean.

Tuesday, March 10th, 2015

Inlining critical CSS for first-time visits

After listening to Scott rave on about how much of a perceived-performance benefit he got from inlining critical CSS on first load, I thought I’d give it a shot over at The Session. On the chance that this might be useful for others, I figured I’d document what I did.

The idea here is that you can give a massive boost to the perceived performance of the first page load on a site by putting the most important CSS in the head of the page. Then you cache the full stylesheet. For subsequent visits you only ever use the external stylesheet. So if you’re squeamish at the thought of munging your CSS into your HTML (and that’s a perfectly reasonable reaction), don’t worry—this is a temporary workaround just for initial visits.

My particular technology stack here is using Grunt, Apache, and PHP with Twig templates. But I’m sure you can adapt this for other technology stacks: what’s important here isn’t the technology, it’s the thinking behind it. And anyway, the end user never sees any of those technologies: the end user gets HTML, CSS, and JavaScript. As long as that’s what you’re outputting, the specifics of the technology stack really don’t matter.

Generating the critical CSS

Okay. First question: how do you figure out which CSS is critical and which CSS can be deferred?

To help answer that, and automate the task of generating the critical CSS, Filament Group have made a Grunt task called grunt-criticalcss. I added that to my project and updated my Gruntfile accordingly:

    // All my existing Grunt configuration goes here.
    criticalcss: {
        dist: {
            options: {
                url: 'http://thesession.dev',
                width: 1024,
                height: 800,
                filename: '/path/to/main.css',
                outputfile: '/path/to/critical.css'
            }
        }
    }
I’m giving it the name of my locally-hosted version of the site and some parameters to judge which CSS to prioritise. Those parameters are viewport width and height. Now, that’s not a perfect way of judging which CSS matters most, but it’ll do.

Then I add it to the list of Grunt tasks:

// All my existing Grunt tasks go here.

grunt.registerTask('default', ['sass', etc., 'criticalcss']);

The end result is that I’ve got two CSS files: the full stylesheet (called something like main.css) and a stylesheet that only contains the critical styles (called critical.css).

Cache-busting CSS

Okay, this is a bit of a tangent but trust me, it’s going to be relevant…

Most of the time it’s a very good thing that browsers cache external CSS files. But if you’ve made a change to that CSS file, then that feature becomes a bug: you need some way of telling the browser that the CSS file has been updated. The simplest way to do this is to change the name of the file so that the browser sees it as a whole new asset to be cached.

You could use query strings to do this cache-busting but that has some issues. I use a little bit of Apache rewriting to get a similar effect. I point browsers to CSS files like this:

<link rel="stylesheet" href="/css/main.20150310.css">

Now, there isn’t actually a file named main.20150310.css, it’s just called main.css. To tell the server where the actual file is, I use this rewrite rule:

RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]

That tells the server to ignore those numbers in JavaScript and CSS file names, but the browser will still interpret it as a new file whenever I update that number. You can do that in a .htaccess file or directly in the Apache configuration.
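The same pattern can be expressed as a JavaScript regular expression, which is a handy way to sanity-check it. This is just an illustrative sketch (the function name is mine, not production code):

```javascript
// Mirror of the Apache rewrite logic: strip the revision number from a
// cache-busted filename to find the real file on disk.
function resolveAsset(filename) {
  const match = filename.match(/^(.+)\.(\d+)\.(js|css)$/);
  return match ? match[1] + '.' + match[3] : filename;
}
```

So `resolveAsset('/css/main.20150310.css')` gives back `/css/main.css`, while a filename without a revision number passes through untouched.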

Right. With that little detour out of the way, let’s get back to the issue of inlining critical CSS.

Differentiating repeat visits

That number that I’m putting into the filenames of my CSS is something I update in my Twig template, like this (although this is really something that a Grunt task could do, I guess):

{% set cssupdate = '20150310' %}

Then I can use it like this:

<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">

I can also use JavaScript to store that number in a cookie called csscached so I’ll know if the user has a cached version of this revision of the stylesheet:

document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';

The absence or presence of that cookie is going to be what determines whether the user gets inlined critical CSS (a first-time visitor, or a visitor with an out-of-date cached stylesheet) or whether the user gets a good ol’ fashioned external stylesheet (a repeat visitor with an up-to-date version of the stylesheet in their cache).
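The actual check happens server-side in Twig, but the decision boils down to something like this JavaScript sketch (the function name is illustrative, not code from The Session):

```javascript
// Inline the critical CSS unless the visitor's csscached cookie
// matches the current stylesheet revision.
function shouldInlineCriticalCSS(cookieHeader, cssupdate) {
  const match = /(?:^|;\s*)csscached=([^;]*)/.exec(cookieHeader || '');
  return !match || match[1] !== cssupdate;
}
```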

Here are the steps I’m going through:

First of all, set the Twig cssupdate variable to the last revision of the CSS:

{% set cssupdate = '20150310' %}

Next, check to see if there’s a cookie called csscached that matches the value of the latest revision. If there is, great! This is a repeat visitor with an up-to-date cache. Give ‘em the external stylesheet:

{% if _cookie.csscached == cssupdate %}
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">

If not, then dump the critical CSS straight into the head of the document:

{% else %}
<style>{% include '/css/critical.css' %}</style>

Now I still want to load the full stylesheet but I don’t want it to be a blocking request. I can do this using JavaScript. Once again it’s Filament Group to the rescue with their loadCSS script:

    // include loadCSS here...
    loadCSS('/css/main.{{ cssupdate }}.css');

While I’m at it, I store the value of cssupdate in the csscached cookie:

    document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';

Finally, consider the possibility that JavaScript isn’t available and link to the full CSS file inside a noscript element:

<noscript><link rel="stylesheet" href="/css/main.{{ cssupdate }}.css"></noscript>
{% endif %}

And we’re done. Phew!

Here’s how it looks all together in my Twig template:

{% set cssupdate = '20150310' %}
{% if _cookie.csscached == cssupdate %}
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">
{% else %}
<style>{% include '/css/critical.css' %}</style>
<script>
// include loadCSS here...
loadCSS('/css/main.{{ cssupdate }}.css');
document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';
</script>
<noscript><link rel="stylesheet" href="/css/main.{{ cssupdate }}.css"></noscript>
{% endif %}

You can see the production code from The Session in this gist. I’ve tweaked the loadCSS script slightly to match my preferred JavaScript style but otherwise, it’s doing exactly what I’ve outlined here.

The result

According to Google’s PageSpeed Insights, I done good.

Optimising https://thesession.org/

Friday, January 30th, 2015

The Long Web by Jeremy Keith – An Event Apart Video on Vimeo

This is a talk I gave at An Event Apart about eighteen months ago, all about Irish music, the web, long-term thinking, and yes, you guessed it—progressive enhancement.

The Long Web by Jeremy Keith – An Event Apart Video

Tuesday, January 27th, 2015

A question of timing

I’ve been updating my collection of design principles lately, adding in some more examples from Android and Windows. Coincidentally, Vasilis unveiled a neat little page that grabs one list of principles at random—just keep refreshing to see more.

I also added this list of seven principles of rich web applications to the collection, although they feel a bit more like engineering principles than design principles per se. That said, they’re really, really good. Every single one is rooted in performance and the user’s experience, not developer convenience.

Don’t get me wrong: developer convenience is very, very important. Nobody wants to feel like they’re doing unnecessary work. But I feel very strongly that the needs of the end user should trump the needs of the developer in almost all instances (you may feel differently and that’s absolutely fine; we’ll agree to differ).

That push and pull between developer convenience and user experience is, I think, most evident in the first principle: server-rendered pages are not optional. Now before you jump to conclusions, the author is not saying that you should never do client-side rendering, but instead points out the very important performance benefits of having the server render the initial page. After that—if the user’s browser cuts the mustard—you can use client-side rendering exclusively.

The issue with that hybrid approach—as I’ve discussed before—is that it’s hard. Isomorphic JavaScript (terrible name) can theoretically help here, but I haven’t seen too many examples of it in action. I suspect that’s because this approach doesn’t yet offer enough developer convenience.

Anyway, I found myself nodding along enthusiastically with that first of seven design principles. Then I got to the second one: act immediately on user input. That sounds eminently sensible, and it’s backed up with sound reasoning. But it finishes with:

Techniques like PJAX or TurboLinks unfortunately largely miss out on the opportunities described in this section.

Ah. See, I’m a big fan of PJAX. It’s essentially the same thing as the Hijax technique I talked about many years ago in Bulletproof Ajax, but with the new addition of HTML5’s History API. It’s a quick’n’dirty way of giving the illusion of a fat client: all the work is actually being done in the server, which sends back chunks of HTML that update the interface. But it’s true that, because of that round-trip to the server, there’s a bit of a delay and so you often end up briefly displaying a loading indicator.

I contend that spinners or “loading indicators” should become a rarity

I agree …but I also like using PJAX/Hijax. Now how do I reconcile what’s best for the user experience with what’s best for my own developer convenience?

I’ve come up with a compromise, and you can see it in action on The Session. There are multiple examples of PJAX in action on that site, like pretty much any page that returns paginated results: new tune settings, the latest events, and so on. The steps for initiating an Ajax request used to be:

  1. Listen for any clicks on the page,
  2. If a “previous” or “next” button is clicked, then:
  3. Display a loading indicator,
  4. Request the new data from the server, and
  5. Update the page with the new data.

In one sense, I am acting immediately on user input, because I always display the loading indicator straight away. But because the loading indicator always appears, no matter how fast or slow the server responds, it sometimes only appears very briefly—just for a flash. In that situation, I wonder if it’s serving any purpose. It might even be doing the opposite to its intended purpose—it draws attention to the fact that there’s a round-trip to the server.

“What if”, I asked myself, “I only showed the loading indicator if the server is taking too long to send a response back?”

The updated flow now looks like this:

  1. Listen for any clicks on the page,
  2. If a “previous” or “next” button is clicked, then:
  3. Start a timer, and
  4. Request the new data from the server.
  5. If the timer reaches an upper limit, show a loading indicator.
  6. When the server sends a response, cancel the timer and
  7. Update the page with the new data.
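Stripped of the DOM details, that updated flow can be sketched as a small helper that wraps any request promise; `show` and `hide` are whatever functions toggle your loading indicator. This is an illustration of the approach, not the actual code from The Session:

```javascript
// Only show a loading indicator if the request outlives the threshold;
// hide it again (if it ever appeared) once the response arrives.
function withDelayedIndicator(request, show, hide, threshold = 250) {
  let shown = false;
  const timer = setTimeout(() => { shown = true; show(); }, threshold);
  return request.finally(() => {
    clearTimeout(timer);
    if (shown) hide();
  });
}
```

A fast response cancels the timer before it fires, so the indicator never flashes; a slow one shows the indicator at the threshold and hides it when the data lands.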

Even though there are more steps, there’s actually less happening from the user’s perspective. Where previously you would experience this:

  1. I click on a button,
  2. I briefly see a loading indicator,
  3. I see the new data.

Now your experience is:

  1. I click on a button,
  2. I see the new data.

…unless the server or the network is taking too long, in which case the loading indicator appears as an interim step.

The question is: how long is too long? How long do I wait before showing the loading indicator?

The Nielsen Norman group offers this bit of research:

0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.

So I should set my timer to 100 milliseconds. In practice, I found that I can set it to as high as 200 to 250 milliseconds and keep it feeling very close to instantaneous. Anything over that, though, and it’s probably best to display a loading indicator: otherwise the interface starts to feel a little sluggish, and slightly uncanny. (“Did that click do any—? Oh, it did.”)

You can test the response time by looking at some of the simpler pagination examples on The Session: new recordings or new discussions, for example. To see examples of when the server takes a bit longer to send a response, you can try paginating through search results. These take longer because, frankly, I’m not very good at optimising some of those search queries.

There you have it: an interface that—under optimal conditions—reacts to user input instantaneously, but falls back to displaying a loading indicator when conditions are less than ideal. The result is something that feels like a client-side web thang, even though the actual complexity is on the server.

Now to see what else I can learn from the rest of those design principles.

Tuesday, December 16th, 2014

The Session trad tune machine

Most pundits call it “the Internet of Things” but there’s another phrase from Andy Huntington that I first heard from Russell Davies: “the Geocities of Things.” I like that.

I’ve never had much exposure to this world of hacking electronics. I remember getting excited about the possibilities at a Brighton BarCamp back in 2008:

I now have my own little arduino kit, a bread board and a lucky bag of LEDs. Alas, know next to nothing about basic electronics so I’m really going to have to brush up on this stuff.

I never did do any brushing up. But that all changed last week.

Seb is doing a new two-day workshop. He doesn’t call it Internet Of Things. He doesn’t call it Geocities Of Things. He calls it Stuff That Talks To The Interwebs, or STTTTI, or ST4I. He needed some guinea pigs to test his workshop material on, so Clearleft volunteered as tribute.

In short, it was great! And this time, I didn’t stop hacking when I got home.

First off, every workshop attendee gets a hand-picked box of goodies to play with and keep: an arduino mega, a wifi shield, sensors, screens, motors, lights, you name it. That’s the hardware side of things. There are also code samples and libraries that Seb has prepared in advance.

Getting ready to workshop with @Seb_ly. Unwrapping some Christmas goodies from Santa @Seb_ly.

Now, remember, I lack even the most basic knowledge of electronics, but after two days of fiddling with this stuff, it started to click.

Blinkenlights. Hello, little fella.

On the first workshop day, we all did the same exercises, connected things up, getting them to talk to the internet, that kind of thing. For the second workshop day, Seb encouraged us to think about what we might each like to build.

I was quite taken with the ability of the piezo buzzer to play rudimentary music. I started to wonder if there was a way to hook it up to The Session and have it play the latest jigs, reels, and hornpipes that have been submitted to the site in ABC notation. A little bit of googling revealed that someone had already taken a stab at writing an ABC parser for arduino. I didn’t end up using that code, but it convinced me that what I was trying to do wasn’t crazy.

So I built a machine that plays Irish traditional music from the internet.

Playing with hardware and software, making things that go beep in the night.

The hardware has a piezo buzzer, an “on” button, an “off” button, a knob for controlling the speed of the tune, and an obligatory LED.

The software has a countdown timer that polls a URL every minute or so. The URL is http://tune.adactio.com/. That in turn uses The Session’s read-only API to grab the latest tune activity and then get the ABC notation for whichever tune is at the top of that list. Then it does some cleaning up—removing some of the more advanced ABC stuff—and outputs a single line of notes to be played. I’m fudging things a bit: the device has the range of a tin whistle, and expects tunes to be in the key of D or G, but seeing as that’s at least 90% of Irish traditional music, it’s good enough.
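To give a flavour of the kind of conversion involved—this is an illustrative JavaScript sketch, not the actual Arduino firmware—ABC uses uppercase letters for the octave starting at middle C and lowercase letters for the octave above, and each note maps to a frequency the buzzer can play:

```javascript
// Semitone offsets of the natural notes from C.
const SEMITONES = { C: 0, D: 2, E: 4, F: 5, G: 7, A: 9, B: 11 };

// Map a single ABC note letter to a frequency in Hz.
// Uppercase = octave starting at middle C (C4); lowercase = the octave above.
function noteToFrequency(note) {
  const letter = note.toUpperCase();
  const octave = note === letter ? 4 : 5;
  const midi = 12 * (octave + 1) + SEMITONES[letter]; // MIDI note number
  return 440 * Math.pow(2, (midi - 69) / 12);         // A4 = MIDI 69 = 440Hz
}
```

So the ABC note `A` comes out as 440Hz and `a` as 880Hz—numbers you could feed straight to something like Arduino’s tone function.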

Whenever there’s a new tune, it plays it. Or you can hit the satisfying “on” button to manually play back the latest tune (and yes, you can hit the equally satisfying “off” button to stop it). Being able to adjust the playback speed with a twiddly knob turns out to be particularly handy if you decide to learn the tune.

I added one more lo-fi modification. I rolled up a piece of paper and placed it over the piezo buzzer to amplify the sound. It works surprisingly well. It’s loud!

Rolling my own speaker cone, quite literally.

I’ll keep tinkering with it. It’s fun. I realise I’m coming to this whole hardware-hacking thing very late, but I get it now: it really does feel similar to that feeling you would get when you first figured out how to make a web page back in the days of Geocities. I’ve built something that’s completely pointless for most people, but has special meaning for me. It’s ugly, and it’s inefficient, but it works. And that’s a great feeling.

(P.S. Seb will be running his workshop again on the 3rd and 4th of February, and there will be a limited number of early-bird tickets available for one hour, between 11am and midday this Thursday. I highly recommend you grab one.)

Sunday, February 9th, 2014


When I finally unveiled the redesigned and overhauled version of The Session at the end of 2012, it was the culmination of a lot of late nights and weekends. It was also a really great learning experience, one that I subsequently drew on to inform my An Event Apart presentation, The Long Web.

As part of that presentation, I give a little backstory on the ABC format. It’s a way of notating music using nothing more than ASCII text. It begins with some JSON-like metadata about the tune—its title, time signature, and key—followed by the notes of the tune itself—uppercase and lowercase letters denote different octaves, and numbers can be used to denote length:

X: 1
T: Cooley's
R: reel
M: 4/4
L: 1/8
K: Edor
EBBA B2 EB|B2 AB defg|afec dBAF|DEFD E2:|
|:gf|eB B2 efge|eB B2 gedB|A2 FA DAFA|A2 FA defg|
eB B2 eBgB|eB B2 defg|afec dBAF|DEFD E2:|

On The Session, a little bit of progressive enhancement produces a nice crisp SVG version of the sheet music at the user’s request (the non-JavaScript fallback is a server-rendered bitmap of the sheet music).

ABC notation dates back to the early nineties, a time of very limited bandwidth. Exchanging audio files or even images would have been prohibitively expensive. Having software installed on your machine that could convert ABC into sheet music or audio meant that people could share and exchange tunes through email, BBS, or even the then-fledgling World Wide Web.

In today’s world of relatively fast connections, ABC’s usefulness might seem lessened. But in fact, it’s just as popular as it ever was. People have become used to writing (and even sight-reading) the format, and it has all the resilience that comes with being a text format: easily editable and human-readable. It’s still the format that people use to submit new tune settings to The Session.

A little while back, I came upon another advantage of the ABC format, one that I had never previously thought of…

The Session has a wide range of users, of all ages, from all over the world, from all walks of life, using all sorts of browsers. I do my best to make sure that the site works for just about any kind of user-agent (while still providing plenty of enhancements for the most modern browsers). That includes screen readers. Some active members of The Session happen to be blind.

One of those screen-reader users got in touch with me shortly after joining to ask me to explain what ABC was all about. I pointed them at some explanatory links. Once the format “clicked” with them, they got quite enthused. They pointed out that if the sheet music were only available as an image, it would mean very little to them. But by providing the ABC notation alongside the sheet music, they could read the music note-for-note.

That’s when it struck me that ABC notation is effectively alt text for sheet music!

There’s one little thing that slightly irks me though. The ABC notation should be read out one letter at a time. But screen readers use a kind of fuzzy logic to figure out whether a set of characters should be spoken as a word:

Screen readers try to pronounce acronyms and nonsensical words if they have sufficient vowels/consonants to be pronounceable; otherwise, they spell out the letters. For example, NASA is pronounced as a word, whereas NSF is pronounced as “N. S. F.” The acronym URL is pronounced “earl,” even though most humans say “U. R. L.” The acronym SQL is not pronounced “sequel” by screen readers even though some humans pronounce it that way; screen readers say “S. Q. L.”

It’s not a big deal, and the screen reader user can explicitly request that a word be spoken letter by letter:

Screen reader users can pause if they didn’t understand a word, and go back to listen to it; they can even have the screen reader read words letter by letter. When reading words letter by letter, JAWS distinguishes between upper case and lower case letters by shouting/emphasizing the upper case letters.

But still …I wish there were some way that I could mark up the ABC notation so that a screen reader would know that it should be read letter by letter. I’ve looked into using abbr, but that offers no guarantees: if the string looks like a word, it will still be spoken as a word. It doesn’t look like there are any ARIA settings for this use-case either.

So if any accessibility experts out there know of something I’m missing, please let me know.

Update: I’ve added an aural CSS declaration of speak: spell-out (thanks to Martijn van der Ven for the tip), although I think the browser support is still pretty non-existent. Any other ideas?

Wednesday, January 8th, 2014

The Long Web - Jeremy Keith at FOWD NYC 2013 - YouTube

There were some technical difficulties with microphones, and it was a bit weird presenting inside a cinema, but I still had fun yapping on at last year’s Future Of Web Design in New York.

The Long Web - Jeremy Keith at FOWD NYC 2013

Wednesday, August 21st, 2013

The Best Thing I Ever Created by Jeremy Keith on The Shutterstock Blog

Shutterstock are running a series on their blog called “The Best Thing I Ever Created” and they asked me for a contribution. So I wrote about The Session.

Monday, January 21st, 2013

Long time

A few years back, I was on a road trip in the States with my friend Dan. We drove through Maryland and Virginia to the sites of American Civil War battles—Gettysburg, Antietam. I was reading Tom Standage’s magnificent book The Victorian Internet at the time. When I was done with the book, I passed it on to Dan. He loved it. A few years later, he sent me a gift: a glass telegraph insulator.

Glass telegraph insulator from New York

Last week I received another gift from Dan: a telegraph key.

Telegraph key

It’s lovely. If my knowledge of basic electronics were better, I’d hook it up to an Arduino and tweet with it.

Dan came over to the UK for a visit last month. We had a lovely time wandering around Brighton and London together. At one point, we popped into the National Portrait Gallery. There was one painting he really wanted to see: the portrait of Samuel Pepys.


“Were you reading the online Pepys diary?”, I asked.

“Oh, yes!”, he said.

“I know the guy who did that!”

The “guy who did that” is, of course, the brilliant Phil Gyford.

Phil came down to Brighton and gave a Skillswap talk all about the ten-year long project.

The diary of Samuel Pepys: Telling a complex story online on Huffduffer

Now Phil has restarted the diary. He wrote a really great piece about what it’s like overhauling a site that has been online for a decade. Given that I spent a lot of my time last year overhauling The Session (which has been online in some form or another since the late nineties), I can relate to his perspective on trying to choose long-term technologies:

Looking ahead, how will I feel about this Django backend in ten years’ time? I’ve no idea what the state of the platform will be in a decade.

I was thinking about switching The Session over to Django, but I decided against it in the end. I figured that the pain involved in trying to retrofit an existing site (as opposed to starting a brand new project) would be too much. So the site is still written in the very uncool LAMP stack: Linux, Apache, MySQL, and PHP.

Mind you, Marco Arment makes the point in his Webstock talk that there’s a real value to using tried and tested “boring” technologies.

One area where I’ve found myself becoming increasingly wary over time is the use of third-party APIs. I say that with a heavy heart—back at dConstruct 2006 I was talking all about The Joy of API. But Yahoo, Google, Twitter …they’ve all deprecated or backtracked on their offerings to developers.

Anyway, this is something that has been on my mind a lot lately: evaluating technologies and services in terms of their long-term benefit instead of just their short-term hit. It’s something that we need to think about more as developers, and it’s certainly something that we need to think about more as users.

Compared with genuinely long-term projects like the 10,000 year Clock of the Long Now, making something long-lasting on the web shouldn’t be all that challenging. The real challenge is acknowledging that this is even an issue. As Phil puts it:

I don’t know how much individuals and companies habitually think about this. Is it possible to plan for how your online service will work over the next ten years, never mind longer?

As my Long Bet illustrates, I can be somewhat pessimistic about the longevity of our web creations:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

But I really hope I lose that bet. Maybe I’ll suggest to Matt (my challenger on the bet) that we meet up on February 22nd, 2022 at the Long Now Salon. It doesn’t exist yet. But give it time.

Wednesday, January 9th, 2013

Dealing with IE

Laura asked a question on Twitter the other day about dealing with older versions of Internet Explorer when you’ve got your layout styles nested within media queries (that older versions of IE don’t understand):

It’s a fair question. It also raises another question: how do you define “dealing with” Internet Explorer 8 or 7?

You could justifiably argue that IE7 users should upgrade their damn browser. But that same argument doesn’t really hold for IE8 if the user is on Windows XP: IE8 is as high as they can go. Asking users to upgrade their browser is one thing. Asking them to upgrade their operating system feels different.

But this is the web and websites do not need to look the same in every browser. Is it acceptable to simply give Internet Explorer 8 the same baseline experience that any other old out-of-date browser would get? In other words, is it even a problem that older versions of Internet Explorer won’t parse media queries? If you’re building in a mobile-first way, they’ll get linearised content with baseline styles applied.

That’s the approach that Alex advocates in the Q&A after his excellent closing keynote at Fronteers. That’s what I’m doing here on adactio.com. Users of IE8 get the linearised layout and that’s just fine. One of the advantages of this approach is that you are then freed up to use all sorts of fancy CSS within your media query blocks without having to worry about older versions of IE crapping themselves.

On other sites, like Huffduffer, I make an assumption (always a dangerous thing to do) that IE7 and IE8 users are using a desktop or laptop computer and so they could get some layout styles. I outlined that technique in a post about Windows mobile media queries. Using that technique, I end up splitting my CSS into two files:

<link rel="stylesheet" href="/css/global.css" media="all">
<link rel="stylesheet" href="/css/layout.css" media="all and (min-width: 30em)">
<!--[if (lt IE 9) & (!IEMobile)]>
<link rel="stylesheet" href="/css/layout.css" media="all">
<![endif]-->

The downside to this technique is that now there are two HTTP requests for the CSS …even for users of modern browsers. The alternative is to maintain one stylesheet for modern browsers and a separate stylesheet for older versions of Internet Explorer. That sounds like a maintenance nightmare.

Pre-processors to the rescue. Using Sass or LESS you can write your CSS in separate files (e.g. one file for basic styles and another for layout styles) and then use the preprocessor to combine those files in two different ways: one with media queries (for modern browsers) and another without media queries (for older versions of Internet Explorer). Or, if you don’t want to have your media query styles all grouped together, you can use Jake’s excellent method.

When I relaunched The Session last month, I initially just gave Internet Explorer 8 and lower the linearised content—the same layout that small-screen browsers would get. For example, the navigation is situated at the bottom of each page and you get to it by clicking an internal link at the top of each page. It all worked fine and nobody complained.

But I thought that it was a bit of a shame that users of IE8 and IE7 weren’t getting the same navigation that users of other desktop browsers were getting. So I decided to use a preprocesser (Sass in this case) to spit out an extra stylesheet for IE8 and IE7.

So let’s say I’ve got .scss files like this:

  • base.scss
  • medium.scss
  • wide.scss

Then in my standard .scss file that’s going to generate the CSS for all browsers (called global.css), I can write:

@import "base.scss";
@media all and (min-width: 30em) {
 @import "medium";
@media all and (min-width: 50em) {
 @import "wide";

But I can also generate a stylesheet for IE8 and IE7 (called legacy.css) that calls in those layout styles without the media query blocks:

@import "medium";
@import "wide";

IE8 and IE7 will be downloading some styles twice (all the styles within media queries) but in this particular case, that doesn’t amount to too much. Oh, and you’ll notice that I’m not even going to try to let IE6 parse those styles: it would do more harm than good.
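To make that duplication concrete, suppose medium.scss contained a single (illustrative) rule. The two generated stylesheets would then look something like this:

```css
/* global.css — served to all browsers */
@media all and (min-width: 30em) {
    .navigation { float: left; }
}

/* legacy.css — served to IE7 and IE8 on top of global.css:
   the same rule, minus the media query wrapper */
.navigation { float: left; }
```

IE7 and IE8 download both files, hence the duplicated rule; modern browsers only ever see global.css.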

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/legacy.css">

So I did that (although I don’t really have .scss files named “medium” or “wide”—they’re actually given names like “navigation” or “columns” that more accurately describe what they do). I thought I was doing a good deed for any users of The Session who were still using Internet Explorer 8.

But then I read this. It turned out that someone was not only using IE8 on Windows XP, but they had their desktop’s resolution set to 800x600. That’s an entirely reasonable thing to do if your eyesight isn’t great. And, like I said, I can’t really ask him to upgrade his browser because that would mean upgrading the whole operating system.

Now there’s a temptation here to dismiss this particular combination of old browser + old OS + narrow resolution as an edge case. It’s probably just one person. But that one person is a prolific contributor to the site. This situation nicely highlights the problem of playing the numbers game: as a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

The root of the problem lay in my assumption that IE8 or IE7 users would be using desktop or laptop computers with a screen size of at least 1024 pixels. Serves me right for making assumptions.

So what could I do? I could remove the conditional comments and the IE-specific stylesheet and go back to just serving the linearised content. Or I could serve up just the medium-width styles to IE8 and IE7.

That’s what I ended up doing but I also introduced a little bit of JavaScript in the conditional comments to serve up the widescreen styles if the browser width is above a certain size:

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/medium.css">
if (document.documentElement.clientWidth > 800) {
 document.write('<link rel="stylesheet" href="/css/wide.css">');

It works …I guess. It’s not optimal but at least users of IE8 and IE7 are no longer just getting the small-screen styles. It’s a hack, and not a particularly clever one.

Was it worth it? Is it an improvement?

I think this is something to remember when we’re coming up with solutions for “dealing with” older versions of Internet Explorer: whether it’s a dumb solution like mine or a clever solution like Jake’s, we shouldn’t have to do this. We shouldn’t have to worry about IE7 just like we don’t have to worry about Netscape 4 or Mosaic or Lynx; we should be free to build according to the principles of progressive enhancement, safe in the knowledge that older, less capable browsers won’t get all the bells and whistles, but they will be able to access our content. Instead we’re spending time coming up with hacks and polyfills to deal with one particular family of older, less capable browsers simply because of their disproportionate market share.

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

Update: I’ve written a follow-up post to clarify what I’m talking about here.

Sunday, December 9th, 2012

The Session

When I was travelling back from Webstock in New Zealand at the start of this year, I had a brief stopover in Sydney. It coincided with one of John and Maxine’s What Do I Know? events so I did a little stint on five things I learned from the internet.

It was a fun evening and I had a chance to chat with many lovely Aussie web geeks. There was this one guy, Christian, that I was chatting with for quite a bit about all sorts of web-related stuff. But I could tell he wasn’t Australian. The Northern Ireland accent was a bit of a giveaway.

“You’re not from ‘round these parts, then?” I asked.

“Actually,” he said, “we’ve met before.”

I started racking my brains. Which geeky gathering could it have been?

“In Freiburg,” he said.

Freiburg? But that was where I lived in the ’90s, before I was even making websites. I was drawing a complete blank. Then he said his name.

“Christian!” I cried, “Kerry and Christian!”

With a sudden shift of context, it all fit into place. We had met on the streets of Freiburg when I was a busker. Christian and his companion Kerry were travelling through Europe and they found themselves in Freiburg, also busking. Christian played guitar. Kerry played fiddle.

I listened to them playing some great Irish tunes and then got chatting with them. They didn’t have a place to stay so I offered to put them up. We had a good few days of hanging out and playing music together.

And now, all these years later, here was Christian …in Sydney, Australia …at a web event! Worlds were colliding. But it was a really great feeling to have that connection between my past and my present; between my life in Germany and my life now; between the world of Irish traditional music and the world of the web.

One of the other things that connects those two worlds is The Session. I’ve been running that website for about twelve or thirteen years now. It’s the thing I’m simultaneously most proud of and most ashamed of.

I’m proud of it because it has genuinely managed to contribute something back to the tradition: it’s a handy resource for trad players around the world.

I’m ashamed of it because it has been languishing for so long. It has so much potential and I haven’t been devoting enough time or energy to meeting that potential.

At the end of 2009, I wrote:

I’m not going to make a new year’s resolution—that would just give me another deadline to stress out about—but I’m making a personal commitment to do whatever I can for The Session in 2010.

Well, it only took me another two years but I’ve finally done it.

I’ve spent a considerable portion of my spare time this year overhauling the site from the ground up, completely refactoring the code, putting together a new mobile-first design, adding much more location-based functionality and generally tilting at my own personal windmills. Trying to rewrite a site that’s been up and running for over a decade is considerably more challenging than creating a new site from scratch.

Luckily I had some help. Christian, for example, helped geocode all the sessions and events that had been added to the site over the years.

That’s one thing that the worlds of Irish music and the web have in common: people getting together to share and collaborate.

Thursday, December 31st, 2009

The future of the tradition

Drew and Brian did a superb job with this year’s 24 Ways, the advent calendar for geeks. There were some recurring themes: HTML5 from Yaili, Bruce and myself; CSS3 from Drew, Natalie and Rachel; and workflow from Andy and Meagan.

The matter of personal projects was also surprisingly prevalent. Elliot wrote A Pet Project is For Life, Not Just for Christmas and Jina specifically mentioned Huffduffer in her piece, Make Out Like a Bandit. December was the month for praising personal projects: that’s exactly what I was talking about at Refresh Belfast at the start of the month.

If you don’t have a personal project on the go, I highly recommend it. It’s a great way of learning new skills and experimenting with new technology. It’s also a good safety valve that can keep you sane when work is getting you down.

Working on Huffduffer is a lot of fun and I plan to keep iterating on the site whenever I can. But the project that I’ve really invested my soul into is The Session. Over the past decade, the site has built up a large international community with a comprehensive store of tunes and sessions.

Running any community site requires a lot of time and I haven’t always been as hands-on as I could have been with The Session. As a result, the discourse can occasionally spiral downwards into nastiness, prompting me to ask myself, Why do I bother? But then when someone contributes something wonderful to the site, I’m reminded of why I started it in the first place.

My dedication to the site was crystallised recently by a sad event. A long-time contributor to the site passed away. Looking back over the generosity of his contributions made me realise that The Session isn’t a personal project at all: it’s a community project, and I have a duty to enable the people in the community to connect. I also have a duty to maintain the URLs created by the community (are you listening, Yahoo?).

I feel like I’ve been neglecting the site. I could be doing so much more with the collective data, especially around location. The underlying code definitely needs refactoring, and the visual design could certainly do with a refresh (although I think it’s held up pretty well for such a long-running site).

I’m not going to make a new year’s resolution—that would just give me another deadline to stress out about—but I’m making a personal commitment to do whatever I can for The Session in 2010.