Tags: thesession

Enhance’n’drag’n’drop

I’ve spent the last week implementing a new feature over at The Session. I had forgotten how enjoyable it is to get completely immersed in a personal project, thinking about everything from database structures right through to CSS animations.

I won’t bore you with the details of this particular feature—which is really only of interest if you play traditional Irish music—but I thought I’d make note of one little bit of progressive enhancement.

One of the interfaces needed for this feature was a form to re-order items in a list. So I thought to myself, “what’s the simplest technology to enable this functionality?” I came up with a series of select elements within a form.

Reordering

It’s not the nicest of interfaces, but it works pretty much everywhere. Once I had built that—and the back-end functionality required to make it all work—I could think about how to enhance it.

I brought it up at the weekly Clearleft front-end pow-wow (featuring special guest Jack Franklin). I figured that drag’n’drop would be the obvious enhancement, but I didn’t know if there were any “go-to” libraries for implementing it; I haven’t paid much attention to the state of drag’n’drop since the old IE implementation was added to HTML5.

Nobody had any particular recommendations so I did a bit of searching. I came across Dragula, which looked pretty solid. It’s made by the super-smart Nicolás Bevacqua, who I know shares my feelings about progressive enhancement. To my delight, I was able to get it working within minutes.

Drag and drop

There’s a little bit of mustard-cutting going on: does the dragula object exist, and does the browser understand querySelector? If so, the select elements are hidden and the drag’n’drop is enabled. Then, whenever an item in the list is dragged and dropped, the corresponding (hidden) select element is updated …so that time I spent making the simpler non-drag’n’drop interface was time well spent: I didn’t need to do anything extra on the server to handle the data from the updated interface.
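
A rough sketch of how that enhancement might look (this isn’t the production code from The Session; the list selector, the hidden select elements, and their values are assumptions):

if (window.dragula && document.querySelector) {
    var list = document.querySelector('.sortable-list');
    // Hide the fallback select elements; drag'n'drop takes over from here.
    var selects = list.querySelectorAll('select');
    Array.prototype.forEach.call(selects, function (select) {
        select.style.display = 'none';
    });
    // Enable drag'n'drop on the list.
    dragula([list]).on('drop', function () {
        // After each drop, write the new order back into the hidden selects
        // so the form submits exactly the same data as the non-enhanced version.
        var items = list.querySelectorAll('li');
        Array.prototype.forEach.call(items, function (item, index) {
            item.querySelector('select').value = index + 1;
        });
    });
}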

It’s a simple example but it demonstrates the benefits of starting with the simpler universal interface before upgrading to the smoother experience.

Words of welcome

For a while now, The Session has had some little on-boarding touches to make sure that new members are eased into the culture of this traditional Irish music community.

First off, new members are encouraged to add a little bit about themselves so that there’s some context when they start making contributions.

Welcome! You are now a member of The Session. Now, how about sharing a bit more about yourself: where you're from, what instrument(s) you play, etc.

Secondly, new members can’t kick off a brand new discussion straight away.

Woah there! I appreciate your eagerness to post your first discussion, but seeing as you just joined The Session, maybe it would be better if you wait a little bit first. Take a look around at the existing discussions, have a read of the house rules and get a feel for how things work around here.

Likewise, they can’t post a comment straight away. They need to wait an hour between signing up and posting their first comment. Instead of seeing a comment form, they see a countdown.

Welcome to The Session, Testy McTest! You'll be able to add your first comment in forty-seven minutes.

Finally, when they do make their first submission—whether it’s a discussion, an event, a session, or a tune—the interface displays a few extra messages of encouragement and care.

Add a tune, Step 1 of 4: Tune Details. As this is your first tune submission, please take extra care. First, provide some basic details about the tune you want to add.

But I realised that all of these custom messages were very one-sided. They were always displayed to the new member. It’s equally important that existing members treat any newcomers with respect.

Now on some discussions, an extra message is displayed to existing members right before the comment form. The logic is straightforward:

  1. If this is a discussion added by a new member,
  2. who hasn’t yet added any comments anywhere,
  3. and this discussion has no responses so far,
  4. and anyone other than that member is viewing the page,
  5. then display a message asking for help making this new member feel welcome.

This is the first ever post by FourCourseChaos. Please help in making them feel welcome here at The Session.

It’s a small addition, but it makes a difference.

No intricate JavaScript; no smooth animations; just some words on a screen encouraging a human connection.

100 words 009

Last year at An Event Apart in Seattle I was giving a talk about long-term thinking on the web, using The Session as a case study. As a cheap gimmick, I played a tune on my mandolin during the talk.

Chris Coyier was also speaking. He plays mandolin too. Barry—one of the conference attendees—also plays mandolin. So we sat outside, passing my mandolin around.

Barry is back this year and he brought his mandolin with him. I showed him an Irish jig. He showed me a bluegrass tune. Together we played a reel that crossed the Atlantic ocean.

Inlining critical CSS for first-time visits

After listening to Scott rave on about how much of a perceived-performance benefit he got from inlining critical CSS on first load, I thought I’d give it a shot over at The Session. On the chance that this might be useful for others, I figured I’d document what I did.

The idea here is that you can give a massive boost to the perceived performance of the first page load on a site by putting the most important CSS in the head of the page. Then you cache the full stylesheet. For subsequent visits you only ever use the external stylesheet. So if you’re squeamish at the thought of munging your CSS into your HTML (and that’s a perfectly reasonable reaction), don’t worry—this is a temporary workaround just for initial visits.

My particular technology stack here is using Grunt, Apache, and PHP with Twig templates. But I’m sure you can adapt this for other technology stacks: what’s important here isn’t the technology, it’s the thinking behind it. And anyway, the end user never sees any of those technologies: the end user gets HTML, CSS, and JavaScript. As long as that’s what you’re outputting, the specifics of the technology stack really don’t matter.

Generating the critical CSS

Okay. First question: how do you figure out which CSS is critical and which CSS can be deferred?

To help answer that, and automate the task of generating the critical CSS, Filament Group have made a Grunt task called grunt-criticalcss. I added that to my project and updated my Gruntfile accordingly:

grunt.initConfig({
    // All my existing Grunt configuration goes here.
    criticalcss: {
        dist: {
            options: {
                url: 'http://thesession.dev',
                width: 1024,
                height: 800,
                filename: '/path/to/main.css',
                outputfile: '/path/to/critical.css'
            }
        }
    }
});

I’m giving it the name of my locally-hosted version of the site and some parameters to judge which CSS to prioritise. Those parameters are viewport width and height. Now, that’s not a perfect way of judging which CSS matters most, but it’ll do.

Then I add it to the list of Grunt tasks:

// All my existing Grunt tasks go here.
grunt.loadNpmTasks('grunt-criticalcss');

grunt.registerTask('default', ['sass', /* …all my other existing tasks… */ 'criticalcss']);

The end result is that I’ve got two CSS files: the full stylesheet (called something like main.css) and a stylesheet that only contains the critical styles (called critical.css).

Cache-busting CSS

Okay, this is a bit of a tangent but trust me, it’s going to be relevant…

Most of the time it’s a very good thing that browsers cache external CSS files. But if you’ve made a change to that CSS file, then that feature becomes a bug: you need some way of telling the browser that the CSS file has been updated. The simplest way to do this is to change the name of the file so that the browser sees it as a whole new asset to be cached.

You could use query strings to do this cache-busting but that has some issues. I use a little bit of Apache rewriting to get a similar effect. I point browsers to CSS files like this:

<link rel="stylesheet" href="/css/main.20150310.css">

Now, there isn’t actually a file named main.20150310.css; the file is just called main.css. To tell the server where the actual file is, I use this rewrite rule:

RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]

That tells the server to ignore those numbers in JavaScript and CSS file names, but the browser will still interpret it as a new file whenever I update that number. You can do that in a .htaccess file or directly in the Apache configuration.

Right. With that little detour out of the way, let’s get back to the issue of inlining critical CSS.

Differentiating repeat visits

That number that I’m putting into the filenames of my CSS is something I update in my Twig template, like this (although this is really something that a Grunt task could do, I guess):

{% set cssupdate = '20150310' %}

Then I can use it like this:

<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">

I can also use JavaScript to store that number in a cookie called csscached so I’ll know if the user has a cached version of this revision of the stylesheet:

<script>
document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';
</script>

The absence or presence of that cookie is going to be what determines whether the user gets inlined critical CSS (a first-time visitor, or a visitor with an out-of-date cached stylesheet) or whether the user gets a good ol’ fashioned external stylesheet (a repeat visitor with an up-to-date version of the stylesheet in their cache).

Here are the steps I’m going through:

First of all, set the Twig cssupdate variable to the last revision of the CSS:

{% set cssupdate = '20150310' %}

Next, check to see if there’s a cookie called csscached that matches the value of the latest revision. If there is, great! This is a repeat visitor with an up-to-date cache. Give ‘em the external stylesheet:

{% if _cookie.csscached == cssupdate %}
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">

If not, then dump the critical CSS straight into the head of the document:

{% else %}
<style>
{% include '/css/critical.css' %}
</style>

Now I still want to load the full stylesheet but I don’t want it to be a blocking request. I can do this using JavaScript. Once again it’s Filament Group to the rescue with their loadCSS script:

 <script>
    // include loadCSS here...
    loadCSS('/css/main.{{ cssupdate }}.css');

While I’m at it, I store the value of cssupdate in the csscached cookie:

    document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';
</script>

Finally, consider the possibility that JavaScript isn’t available and link to the full CSS file inside a noscript element:

<noscript>
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">
</noscript>
{% endif %}

And we’re done. Phew!

Here’s how it looks all together in my Twig template:

{% set cssupdate = '20150310' %}
{% if _cookie.csscached == cssupdate %}
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">
{% else %}
<style>
{% include '/css/critical.css' %}
</style>
<script>
// include loadCSS here...
loadCSS('/css/main.{{ cssupdate }}.css');
document.cookie = 'csscached={{ cssupdate }};expires="Tue, 19 Jan 2038 03:14:07 GMT";path=/';
</script>
<noscript>
<link rel="stylesheet" href="/css/main.{{ cssupdate }}.css">
</noscript>
{% endif %}

You can see the production code from The Session in this gist. I’ve tweaked the loadCSS script slightly to match my preferred JavaScript style but otherwise, it’s doing exactly what I’ve outlined here.

The result

According to Google’s PageSpeed Insights, I done good.

Optimising https://thesession.org/

A question of timing

I’ve been updating my collection of design principles lately, adding in some more examples from Android and Windows. Coincidentally, Vasilis unveiled a neat little page that grabs one list of principles at random—just keep refreshing to see more.

I also added this list of seven principles of rich web applications to the collection, although they feel a bit more like engineering principles than design principles per se. That said, they’re really, really good. Every single one is rooted in performance and the user’s experience, not developer convenience.

Don’t get me wrong: developer convenience is very, very important. Nobody wants to feel like they’re doing unnecessary work. But I feel very strongly that the needs of the end user should trump the needs of the developer in almost all instances (you may feel differently and that’s absolutely fine; we’ll agree to differ).

That push and pull between developer convenience and user experience is, I think, most evident in the first principle: server-rendered pages are not optional. Now before you jump to conclusions, the author is not saying that you should never do client-side rendering, but instead points out the very important performance benefits of having the server render the initial page. After that—if the user’s browser cuts the mustard—you can use client-side rendering exclusively.

The issue with that hybrid approach—as I’ve discussed before—is that it’s hard. Isomorphic JavaScript (terrible name) can theoretically help here, but I haven’t seen too many examples of it in action. I suspect that’s because this approach doesn’t yet offer enough developer convenience.

Anyway, I found myself nodding along enthusiastically with that first of seven design principles. Then I got to the second one: act immediately on user input. That sounds eminently sensible, and it’s backed up with sound reasoning. But it finishes with:

Techniques like PJAX or TurboLinks unfortunately largely miss out on the opportunities described in this section.

Ah. See, I’m a big fan of PJAX. It’s essentially the same thing as the Hijax technique I talked about many years ago in Bulletproof Ajax, but with the new addition of HTML5’s History API. It’s a quick’n’dirty way of giving the illusion of a fat client: all the work is actually being done on the server, which sends back chunks of HTML that update the interface. But it’s true that, because of that round-trip to the server, there’s a bit of a delay and so you often end up briefly displaying a loading indicator.
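
The broad pattern looks something like this (a simplified sketch rather than The Session’s actual code, using fetch and closest for brevity; the .pagination selector and the #results container are assumptions):

document.addEventListener('click', function (event) {
    var link = event.target.closest('.pagination a');
    if (!link) return;
    event.preventDefault();
    // Ask the server for the next page; the server does all the real work...
    fetch(link.href)
        .then(function (response) { return response.text(); })
        .then(function (html) {
            // ...and sends back HTML, a chunk of which gets swapped into the page.
            var doc = new DOMParser().parseFromString(html, 'text/html');
            document.querySelector('#results').innerHTML =
                doc.querySelector('#results').innerHTML;
            // The History API keeps the URL (and the back button) honest.
            history.pushState(null, '', link.href);
        });
});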

I contend that spinners or “loading indicators” should become a rarity

I agree …but I also like using PJAX/Hijax. Now how do I reconcile what’s best for the user experience with what’s best for my own developer convenience?

I’ve come up with a compromise, and you can see it in action on The Session. There are multiple examples of PJAX in action on that site, like pretty much any page that returns paginated results: new tune settings, the latest events, and so on. The steps for initiating an Ajax request used to be:

  1. Listen for any clicks on the page,
  2. If a “previous” or “next” button is clicked, then:
  3. Display a loading indicator,
  4. Request the new data from the server, and
  5. Update the page with the new data.

In one sense, I am acting immediately on user input, because I always display the loading indicator straight away. But because the loading indicator always appears, no matter how fast or slow the server responds, it sometimes only appears very briefly—just for a flash. In that situation, I wonder if it’s serving any purpose. It might even be doing the opposite to its intended purpose—it draws attention to the fact that there’s a round-trip to the server.

“What if”, I asked myself, “I only showed the loading indicator if the server is taking too long to send a response back?”

The updated flow now looks like this:

  1. Listen for any clicks on the page,
  2. If a “previous” or “next” button is clicked, then:
  3. Start a timer, and
  4. Request the new data from the server.
  5. If the timer reaches an upper limit, show a loading indicator.
  6. When the server sends a response, cancel the timer and
  7. Update the page with the new data.

Even though there are more steps, there’s actually less happening from the user’s perspective. Where previously you would experience this:

  1. I click on a button,
  2. I briefly see a loading indicator,
  3. I see the new data.

Now your experience is:

  1. I click on a button,
  2. I see the new data.

…unless the server or the network is taking too long, in which case the loading indicator appears as an interim step.

The question is: how long is too long? How long do I wait before showing the loading indicator?

The Nielsen Norman group offers this bit of research:

0.1 second is about the limit for having the user feel that the system is reacting instantaneously, meaning that no special feedback is necessary except to display the result.

So I should set my timer to 100 milliseconds. In practice, I found that I can set it to as high as 200 to 250 milliseconds and keep it feeling very close to instantaneous. Anything over that, though, and it’s probably best to display a loading indicator: otherwise the interface starts to feel a little sluggish, and slightly uncanny. (“Did that click do any—? Oh, it did.”)
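
In code, the timer logic might look something like this (a sketch rather than the site’s actual implementation; showLoadingIndicator, hideLoadingIndicator, and updatePage are hypothetical helpers):

function loadPage(url) {
    var indicatorShown = false;
    // Only show the spinner if the server takes longer than ~250 milliseconds.
    var timer = setTimeout(function () {
        indicatorShown = true;
        showLoadingIndicator();
    }, 250);
    fetch(url)
        .then(function (response) { return response.text(); })
        .then(function (html) {
            // The response has arrived: cancel the timer...
            clearTimeout(timer);
            // ...hide the spinner if it had already appeared...
            if (indicatorShown) {
                hideLoadingIndicator();
            }
            // ...and update the page with the new data.
            updatePage(html);
        });
}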

You can test the response time by looking at some of the simpler pagination examples on The Session: new recordings or new discussions, for example. To see examples of when the server takes a bit longer to send a response, you can try paginating through search results. These take longer because, frankly, I’m not very good at optimising some of those search queries.

There you have it: an interface that—under optimal conditions—reacts to user input instantaneously, but falls back to displaying a loading indicator when conditions are less than ideal. The result is something that feels like a client-side web thang, even though the actual complexity is on the server.

Now to see what else I can learn from the rest of those design principles.

The Session trad tune machine

Most pundits call it “the Internet of Things” but there’s another phrase from Andy Huntington that I first heard from Russell Davies: “the Geocities of Things.” I like that.

I’ve never had much exposure to this world of hacking electronics. I remember getting excited about the possibilities at a Brighton BarCamp back in 2008:

I now have my own little arduino kit, a bread board and a lucky bag of LEDs. Alas, I know next to nothing about basic electronics so I’m really going to have to brush up on this stuff.

I never did do any brushing up. But that all changed last week.

Seb is doing a new two-day workshop. He doesn’t call it Internet Of Things. He doesn’t call it Geocities Of Things. He calls it Stuff That Talks To The Interwebs, or STTTTI, or ST4I. He needed some guinea pigs to test his workshop material on, so Clearleft volunteered as tribute.

In short, it was great! And this time, I didn’t stop hacking when I got home.

First off, every workshop attendee gets a hand-picked box of goodies to play with and keep: an arduino mega, a wifi shield, sensors, screens, motors, lights, you name it. That’s the hardware side of things. There are also code samples and libraries that Seb has prepared in advance.

Getting ready to workshop with @Seb_ly. Unwrapping some Christmas goodies from Santa @Seb_ly.

Now, remember, I lack even the most basic knowledge of electronics, but after two days of fiddling with this stuff, it started to click.

Blinkenlights. Hello, little fella.

On the first workshop day, we all did the same exercises: connecting things up, getting them to talk to the internet, that kind of thing. For the second workshop day, Seb encouraged us to think about what we might each like to build.

I was quite taken with the ability of the piezo buzzer to play rudimentary music. I started to wonder if there was a way to hook it up to The Session and have it play the latest jigs, reels, and hornpipes that have been submitted to the site in ABC notation. A little bit of googling revealed that someone had already taken a stab at writing an ABC parser for arduino. I didn’t end up using that code, but it convinced me that what I was trying to do wasn’t crazy.

So I built a machine that plays Irish traditional music from the internet.

Playing with hardware and software, making things that go beep in the night.

The hardware has a piezo buzzer, an “on” button, an “off” button, a knob for controlling the speed of the tune, and an obligatory LED.

The software has a countdown timer that polls a URL every minute or so. The URL is http://tune.adactio.com/. That in turn uses The Session’s read-only API to grab the latest tune activity and then get the ABC notation for whichever tune is at the top of that list. Then it does some cleaning up—removing some of the more advanced ABC stuff—and outputs a single line of notes to be played. I’m fudging things a bit: the device has the range of a tin whistle, and expects tunes to be in the key of D or G, but seeing as that’s at least 90% of Irish traditional music, it’s good enough.
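
The cleaning-up step might look something like this (a guess at the general idea rather than the actual code behind tune.adactio.com):

function simplifyABC(abc) {
    return abc
        .split('\n')
        // Drop header fields like "X: 1", "T: Cooley's", "K: Edor", and so on.
        .filter(function (line) { return !/^[A-Z]:/.test(line); })
        .join(' ')
        // Strip chord symbols (quoted strings), bar lines, and repeat marks.
        .replace(/"[^"]*"/g, '')
        .replace(/[|:\[\]]/g, '')
        // Collapse everything into a single line of notes for the buzzer.
        .replace(/\s+/g, ' ')
        .trim();
}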

Whenever there’s a new tune, it plays it. Or you can hit the satisfying “on” button to manually play back the latest tune (and yes, you can hit the equally satisfying “off” button to stop it). Being able to adjust the playback speed with a twiddly knob turns out to be particularly handy if you decide to learn the tune.

I added one more lo-fi modification. I rolled up a piece of paper and placed it over the piezo buzzer to amplify the sound. It works surprisingly well. It’s loud!

Rolling my own speaker cone, quite literally.

I’ll keep tinkering with it. It’s fun. I realise I’m coming to this whole hardware-hacking thing very late, but I get it now: it really does feel similar to that feeling you would get when you first figured out how to make a web page back in the days of Geocities. I’ve built something that’s completely pointless for most people, but has special meaning for me. It’s ugly, and it’s inefficient, but it works. And that’s a great feeling.

(P.S. Seb will be running his workshop again on the 3rd and 4th of February, and there will be a limited number of early-bird tickets available for one hour, between 11am and midday this Thursday. I highly recommend you grab one.)

ABC

When I finally unveiled the redesigned and overhauled version of The Session at the end of 2012, it was the culmination of a lot of late nights and weekends. It was also a really great learning experience, one that I subsequently drew on to inform my An Event Apart presentation, The Long Web.

As part of that presentation, I give a little backstory on the ABC format. It’s a way of notating music using nothing more than ASCII text. It begins with some JSON-like metadata about the tune—its title, time signature, and key—followed by the notes of the tune itself—uppercase and lowercase letters denote different octaves, and numbers can be used to denote length:

X: 1
T: Cooley's
R: reel
M: 4/4
L: 1/8
K: Edor
|:D2|EBBA B2 EB|B2 AB dBAG|FDAD BDAD|FDAD dAFD|
EBBA B2 EB|B2 AB defg|afec dBAF|DEFD E2:|
|:gf|eB B2 efge|eB B2 gedB|A2 FA DAFA|A2 FA defg|
eB B2 eBgB|eB B2 defg|afec dBAF|DEFD E2:|

On The Session, a little bit of progressive enhancement produces a nice crisp SVG version of the sheet music at the user’s request (the non-JavaScript fallback is a server-rendered bitmap of the sheet music).
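
One possible shape for that enhancement (not necessarily what The Session actually does; abcjs is used here purely as an example of a client-side ABC renderer, and the element IDs are assumptions):

var button = document.querySelector('#show-sheet-music');
if (button && window.ABCJS) {
    button.addEventListener('click', function () {
        // Read the ABC notation that's already in the page...
        var abc = document.querySelector('#abc-notation').textContent;
        // ...and render it as crisp SVG sheet music in place of the bitmap fallback.
        ABCJS.renderAbc('sheet-music', abc);
    });
}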

ABC notation dates back to the early nineties, a time of very limited bandwidth. Exchanging audio files or even images would have been prohibitively expensive. Having software installed on your machine that could convert ABC into sheet music or audio meant that people could share and exchange tunes through email, BBS, or even the then-fledgling World Wide Web.

In today’s world of relatively fast connections, ABC’s usefulness might seem lessened. But in fact, it’s just as popular as it ever was. People have become used to writing (and even sight-reading) the format, and it has all the resilience that comes with being a text format: easily editable and human-readable. It’s still the format that people use to submit new tune settings to The Session.

A little while back, I came upon another advantage of the ABC format, one that I had never previously thought of…

The Session has a wide range of users, of all ages, from all over the world, from all walks of life, using all sorts of browsers. I do my best to make sure that the site works for just about any kind of user-agent (while still providing plenty of enhancements for the most modern browsers). That includes screen readers. Some active members of The Session happen to be blind.

One of those screen-reader users got in touch with me shortly after joining to ask me to explain what ABC was all about. I pointed them at some explanatory links. Once the format “clicked” with them, they got quite enthused. They pointed out that if the sheet music were only available as an image, it would mean very little to them. But by providing the ABC notation alongside the sheet music, they could read the music note-for-note.

That’s when it struck me that ABC notation is effectively alt text for sheet music!

There’s one little thing that slightly irks me though. The ABC notation should be read out one letter at a time. But screen readers use a kind of fuzzy logic to figure out whether a set of characters should be spoken as a word:

Screen readers try to pronounce acronyms and nonsensical words if they have sufficient vowels/consonants to be pronounceable; otherwise, they spell out the letters. For example, NASA is pronounced as a word, whereas NSF is pronounced as “N. S. F.” The acronym URL is pronounced “earl,” even though most humans say “U. R. L.” The acronym SQL is not pronounced “sequel” by screen readers even though some humans pronounce it that way; screen readers say “S. Q. L.”

It’s not a big deal, and the screen reader user can explicitly request that a word be spoken letter by letter:

Screen reader users can pause if they didn’t understand a word, and go back to listen to it; they can even have the screen reader read words letter by letter. When reading words letter by letter, JAWS distinguishes between upper case and lower case letters by shouting/emphasizing the upper case letters.

But still …I wish there were some way that I could mark up the ABC notation so that a screen reader would know that it should be read letter by letter. I’ve looked into using abbr, but that offers no guarantees: if the string looks like a word, it will still be spoken as a word. It doesn’t look like there are any ARIA settings for this use-case either.

So if any accessibility experts out there know of something I’m missing, please let me know.

Update: I’ve added an aural CSS declaration of speak: spell-out (thanks to Martijn van der Ven for the tip), although I think the browser support is still pretty non-existent. Any other ideas?

Long time

A few years back, I was on a road trip in the States with my friend Dan. We drove through Maryland and Virginia to the sites of American Civil War battles—Gettysburg, Antietam. I was reading Tom Standage’s magnificent book The Victorian Internet at the time. When I was done with the book, I passed it on to Dan. He loved it. A few years later, he sent me a gift: a glass telegraph insulator.

Glass telegraph insulator from New York

Last week I received another gift from Dan: a telegraph key.

Telegraph key

It’s lovely. If my knowledge of basic electronics were better, I’d hook it up to an Arduino and tweet with it.

Dan came over to the UK for a visit last month. We had a lovely time wandering around Brighton and London together. At one point, we popped into the National Portrait Gallery. There was one painting he really wanted to see: the portrait of Samuel Pepys.

Pepys

“Were you reading the online Pepys diary?”, I asked.

“Oh, yes!”, he said.

“I know the guy who did that!”

The “guy who did that” is, of course, the brilliant Phil Gyford.

Phil came down to Brighton and gave a Skillswap talk all about the ten-year long project.

The diary of Samuel Pepys: Telling a complex story online on Huffduffer

Now Phil has restarted the diary. He wrote a really great piece about what it’s like overhauling a site that has been online for a decade. Given that I spent a lot of my time last year overhauling The Session (which has been online in some form or another since the late nineties), I can relate to his perspective on trying to choose long-term technologies:

Looking ahead, how will I feel about this Django backend in ten years’ time? I’ve no idea what the state of the platform will be in a decade.

I was thinking about switching The Session over to Django, but I decided against it in the end. I figured that the pain involved in trying to retrofit an existing site (as opposed to starting a brand new project) would be too much. So the site is still written in the very uncool LAMP stack: Linux, Apache, MySQL, and PHP.

Mind you, Marco Arment makes the point in his Webstock talk that there’s a real value to using tried and tested “boring” technologies.

One area where I’ve found myself becoming increasingly wary over time is the use of third-party APIs. I say that with a heavy heart—back at dConstruct 2006 I was talking all about The Joy of API. But Yahoo, Google, Twitter …they’ve all deprecated or backtracked on their offerings to developers.

Anyway, this is something that has been on my mind a lot lately: evaluating technologies and services in terms of their long-term benefit instead of just their short-term hit. It’s something that we need to think about more as developers, and it’s certainly something that we need to think about more as users.

Compared with genuinely long-term projects like the 10,000 year Clock of the Long Now, making something long-lasting on the web shouldn’t be all that challenging. The real challenge is acknowledging that this is even an issue. As Phil puts it:

I don’t know how much individuals and companies habitually think about this. Is it possible to plan for how your online service will work over the next ten years, never mind longer?

As my Long Bet illustrates, I can be somewhat pessimistic about the longevity of our web creations:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

But I really hope I lose that bet. Maybe I’ll suggest to Matt (my challenger on the bet) that we meet up on February 22nd, 2022 at the Long Now Salon. It doesn’t exist yet. But give it time.

Dealing with IE

Laura asked a question on Twitter the other day about dealing with older versions of Internet Explorer when you’ve got your layout styles nested within media queries (that older versions of IE don’t understand):

It’s a fair question. It also raises another question: how do you define “dealing with” Internet Explorer 8 or 7?

You could justifiably argue that IE7 users should upgrade their damn browser. But that same argument doesn’t really hold for IE8 if the user is on Windows XP: IE8 is as high as they can go. Asking users to upgrade their browser is one thing. Asking them to upgrade their operating system feels different.

But this is the web and websites do not need to look the same in every browser. Is it acceptable to simply give Internet Explorer 8 the same baseline experience that any other old out-of-date browser would get? In other words, is it even a problem that older versions of Internet Explorer won’t parse media queries? If you’re building in a mobile-first way, they’ll get linearised content with baseline styles applied.

That’s the approach that Alex advocates in the Q&A after his excellent closing keynote at Fronteers. That’s what I’m doing here on adactio.com. Users of IE8 get the linearised layout and that’s just fine. One of the advantages of this approach is that you are then freed up to use all sorts of fancy CSS within your media query blocks without having to worry about older versions of IE crapping themselves.

On other sites, like Huffduffer, I make an assumption (always a dangerous thing to do) that IE7 and IE8 users are using a desktop or laptop computer and so they could get some layout styles. I outlined that technique in a post about Windows mobile media queries. Using that technique, I end up splitting my CSS into two files:

<link rel="stylesheet" href="/css/global.css" media="all">
<link rel="stylesheet" href="/css/layout.css" media="all and (min-width: 30em)">
<!--[if (lt IE 9) & (!IEMobile)]>
<link rel="stylesheet" href="/css/layout.css" media="all">
<![endif]-->

The downside to this technique is that now there are two HTTP requests for the CSS …even for users of modern browsers. The alternative is to maintain one stylesheet for modern browsers and a separate stylesheet for older versions of Internet Explorer. That sounds like a maintenance nightmare.

Pre-processors to the rescue. Using Sass or LESS you can write your CSS in separate files (e.g. one file for basic styles and another for layout styles) and then use the preprocessor to combine those files in two different ways: one with media queries (for modern browsers) and another without media queries (for older versions of Internet Explorer). Or, if you don’t want to have your media query styles all grouped together, you can use Jake’s excellent method.

When I relaunched The Session last month, I initially just gave Internet Explorer 8 and lower the linearised content—the same layout that small-screen browsers would get. For example, the navigation is situated at the bottom of each page and you get to it by clicking an internal link at the top of each page. It all worked fine and nobody complained.

But I thought that it was a bit of a shame that users of IE8 and IE7 weren’t getting the same navigation that users of other desktop browsers were getting. So I decided to use a preprocessor (Sass in this case) to spit out an extra stylesheet for IE8 and IE7.

So let’s say I’ve got .scss files like this:

  • base.scss
  • medium.scss
  • wide.scss

Then in my standard .scss file that’s going to generate the CSS for all browsers (called global.css), I can write:

@import "base.scss";
@media all and (min-width: 30em) {
 @import "medium";
}
@media all and (min-width: 50em) {
 @import "wide";
}

But I can also generate a stylesheet for IE8 and IE7 (called legacy.css) that calls in those layout styles without the media query blocks:

@import "medium";
@import "wide";

IE8 and IE7 will be downloading some styles twice (all the styles within media queries) but in this particular case, that doesn’t amount to too much. Oh, and you’ll notice that I’m not even going to try to let IE6 parse those styles: it would do more harm than good.

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/legacy.css">
<![endif]-->

So I did that (although I don’t really have .scss files named “medium” or “wide”—they’re actually given names like “navigation” or “columns” that more accurately describe what they do). I thought I was doing a good deed for any users of The Session who were still using Internet Explorer 8.

But then I read this. It turned out that someone was not only using IE8 on Windows XP, but they had their desktop’s resolution set to 800x600. That’s an entirely reasonable thing to do if your eyesight isn’t great. And, like I said, I can’t really ask him to upgrade his browser because that would mean upgrading the whole operating system.

Now there’s a temptation here to dismiss this particular combination of old browser + old OS + narrow resolution as an edge case. It’s probably just one person. But that one person is a prolific contributor to the site. This situation nicely highlights the problem of playing the numbers game: as a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

The root of the problem lay in my assumption that IE8 or IE7 users would be using desktop or laptop computers with a screen size of at least 1024 pixels. Serves me right for making assumptions.

So what could I do? I could remove the conditional comments and the IE-specific stylesheet and go back to just serving the linearised content. Or I could serve up just the medium-width styles to IE8 and IE7.

That’s what I ended up doing but I also introduced a little bit of JavaScript in the conditional comments to serve up the widescreen styles if the browser width is above a certain size:

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/medium.css">
<script>
if (document.documentElement.clientWidth > 800) {
 document.write('<link rel="stylesheet" href="/css/wide.css">');
}
</script>
<![endif]-->

It works …I guess. It’s not optimal but at least users of IE8 and IE7 are no longer just getting the small-screen styles. It’s a hack, and not a particularly clever one.

Was it worth it? Is it an improvement?

I think this is something to remember when we’re coming up solutions to “dealing with” older versions of Internet Explorer: whether it’s a dumb solution like mine or a clever solution like Jake’s, we shouldn’t have to do this. We shouldn’t have to worry about IE7 just like we don’t have to worry about Netscape 4 or Mosaic or Lynx; we should be free to build according to the principles of progressive enhancement safe in the knowledge that older, less capable browsers won’t get all the bells and whistles, but they will be able to access our content. Instead we’re spending time coming up with hacks and polyfills to deal with one particular family of older, less capable browsers simply because of their disproportionate market share.

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

Update: I’ve written a follow-up post to clarify what I’m talking about here.

The Session

When I was travelling back from Webstock in New Zealand at the start of this year, I had a brief stopover in Sydney. It coincided with one of John and Maxine’s What Do I Know? events so I did a little stint on five things I learned from the internet.

It was a fun evening and I had a chance to chat with many lovely Aussie web geeks. There was this one guy, Christian, that I was chatting with for quite a bit about all sorts of web-related stuff. But I could tell he wasn’t Australian. The Northern Ireland accent was a bit of a giveaway.

“You’re not from ‘round these parts, then?” I asked.

“Actually,” he said, “we’ve met before.”

I started racking my brains. Which geeky gathering could it have been?

“In Freiburg,” he said.

Freiburg? But that was where I lived in the ’90s, before I was even making websites. I was drawing a complete blank. Then he said his name.

“Christian!” I cried, “Kerry and Christian!”

With a sudden shift of context, it all fit into place. We had met on the streets of Freiburg when I was a busker. Christian and his companion Kerry were travelling through Europe and they found themselves in Freiburg, also busking. Christian played guitar. Kerry played fiddle.

I listened to them playing some great Irish tunes and then got chatting with them. They didn’t have a place to stay so I offered to put them up. We had a good few days of hanging out and playing music together.

And now, all these years later, here was Christian …in Sydney, Australia …at a web event! Worlds were colliding. But it was a really great feeling to have that connection between my past and my present; between my life in Germany and my life now; between the world of Irish traditional music and the world of the web.

One of the other things that connects those two worlds is The Session. I’ve been running that website for about twelve or thirteen years now. It’s the thing I’m simultaneously most proud of and most ashamed of.

I’m proud of it because it has genuinely managed to contribute something back to the tradition: it’s a handy resource for trad players around the world.

I’m ashamed of it because it has been languishing for so long. It has so much potential and I haven’t been devoting enough time or energy to meeting that potential.

At the end of 2009, I wrote:

I’m not going to make a new year’s resolution—that would just give me another deadline to stress out about—but I’m making a personal commitment to do whatever I can for The Session in 2010.

Well, it only took me another two years but I’ve finally done it.

I’ve spent a considerable portion of my spare time this year overhauling the site from the ground up, completely refactoring the code, putting together a new mobile-first design, adding much more location-based functionality and generally tilting at my own personal windmills. Trying to rewrite a site that’s been up and running for over a decade is considerably more challenging than creating a new site from scratch.

Luckily I had some help. Christian, for example, helped geocode all the sessions and events that had been added to the site over the years.

That’s one thing that the worlds of Irish music and the web have in common: people getting together to share and collaborate.

The future of the tradition

Drew and Brian did a superb job with this year’s 24 Ways, the advent calendar for geeks. There were some recurring themes: HTML5 from Yaili, Bruce and myself; CSS3 from Drew, Natalie and Rachel; and workflow from Andy and Meagan.

The matter of personal projects was also surprisingly prevalent. Elliot wrote A Pet Project is For Life, Not Just for Christmas and Jina specifically mentioned Huffduffer in her piece, Make Out Like a Bandit. December was the month for praising personal projects: that’s exactly what I was talking about at Refresh Belfast at the start of the month.

If you don’t have a personal project on the go, I highly recommend it. It’s a great way of learning new skills and experimenting with new technology. It’s also a good safety valve that can keep you sane when work is getting you down.

Working on Huffduffer is a lot of fun and I plan to keep iterating on the site whenever I can. But the project that I’ve really invested my soul into is The Session. Over the past decade, the site has built up a large international community with a comprehensive store of tunes and sessions.

Running any community site requires a lot of time and I haven’t always been as hands-on as I could have been with The Session. As a result, the discourse can occasionally spiral downwards into nastiness, prompting me to ask myself, Why do I bother? But then when someone contributes something wonderful to the site, I’m reminded of why I started it in the first place.

My dedication to the site was crystallised recently by a sad event. A long-time contributor to the site passed away. Looking back over the generosity of his contributions made me realise that The Session isn’t a personal project at all: it’s a community project, and I have a duty to enable the people in the community to connect. I also have a duty to maintain the URLs created by the community (are you listening, Yahoo?).

I feel like I’ve been neglecting the site. I could be doing so much more with the collective data, especially around location. The underlying code definitely needs refactoring, and the visual design could certainly do with a refresh (although I think it’s held up pretty well for such a long-running site).

I’m not going to make a new year’s resolution—that would just give me another deadline to stress out about—but I’m making a personal commitment to do whatever I can for The Session in 2010.