Archive: January, 2013


Thursday, January 31st, 2013

The main issue

The inclusion of a main element in HTML has been the subject of debate for quite a while now. After Steve outlined the use cases, the element has now landed in the draft of the W3C spec. It isn’t in the WHATWG spec yet, but seeing as it is being shipped in browsers, it’s only a matter of time.

But I have some qualms about the implementation details, so I fired off an email to the HTML working group with my concerns about the main element as it is currently specced.

I’m curious as to why its use is specifically scoped to the body element rather than any kind of sectioning content:

Authors must not include more than one main element in a document.

I get the rationale behind the main element: it plugs a gap in the overlap between native semantics and ARIA landmark roles (namely role="main"). But if you look at the other elements that map to ARIA roles (header, footer, nav), those elements can be used multiple times in a single document by being scoped to their sectioning content ancestor.

Let’s say, for example, that I have a document like this, containing two header elements:

<body>
 <header>Page header</header>
 Page main content starts here
 <section>
  <header>Section header</header>
  Section main content
 </section>
 Page main content finishes here
</body>

…only the first header element (the one that’s scoped to the body element) will correspond to role="banner", right?

Similarly, in this example, only the final footer element will correspond to role="contentinfo":

<body>
 <header>Page header</header>
 Page main content starts here
 <section>
  <header>Section header</header>
  Section main content
  <footer>Section footer</footer>
 </section>
 Page main content finishes here
 <footer>Page footer</footer>
</body>

So what I don’t understand is why we can’t have the main element work the same way, i.e. scope it to its sectioning content ancestor so that only the main element that is scoped to the body element will correspond to role="main":

<body>
 <header>Page header</header>
 <main>
  Page main content starts here
  <section>
   <header>Section header</header>
   <main>Section main content</main>
   <footer>Section footer</footer>
  </section>
  Page main content finishes here
 </main>
 <footer>Page footer</footer>
</body>

Here are the corresponding ARIA roles:

<body>
 <header>Page header</header> <!-- role="banner" -->
 <main>Page main content</main> <!-- role="main" -->
 <footer>Page footer</footer> <!-- role="contentinfo" -->
</body>

Why not allow the main element to exist within sectioning content (section, article, nav, aside) the same as header and footer?

<section>
 <header>Section header</header> <!-- no corresponding role -->
 <main>Section main content</main> <!-- no corresponding role -->
 <footer>Section footer</footer> <!-- no corresponding role -->
</section>

Deciding how to treat the main element would then be the same as header and footer. Here’s what the spec says about the scope of footers:

When the nearest ancestor sectioning content or sectioning root element is the body element, then it applies to the whole page.

I could easily imagine the same principle being applied to the main element. From an implementation viewpoint, this is already what browsers have to do with header and footer. Wouldn’t it be just as simple to apply the same parsing to the main element?
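
The scoping rule I’m describing can be sketched in a few lines of JavaScript. This is a toy illustration over plain element-like objects rather than a real DOM, and the name impliedRole and the object shape are mine, not anything from the spec:

```javascript
// Toy sketch: compute the implied ARIA landmark role for a
// header/footer/main element by checking whether its nearest
// sectioning content ancestor is the body element.
const SECTIONING = new Set(['SECTION', 'ARTICLE', 'NAV', 'ASIDE']);
const LANDMARKS = { HEADER: 'banner', FOOTER: 'contentinfo', MAIN: 'main' };

function impliedRole(el) {
  const role = LANDMARKS[el.tagName];
  if (!role) return null;
  // Walk up the ancestors: if any sectioning content element sits
  // between this element and <body>, the element is scoped to that
  // section and gets no landmark role.
  for (let node = el.parentNode; node; node = node.parentNode) {
    if (SECTIONING.has(node.tagName)) return null;
    if (node.tagName === 'BODY') return role;
  }
  return null;
}
```

With that logic, a header that’s a direct child of body yields "banner", while a header inside a section yields nothing, and the same rule would fall out naturally for main.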

It seems a shame to introduce a new element that can only be used zero or one time in a document …I think the head and body elements are the only other elements with this restriction (and I believe the title element is the only element that is used precisely once per document).

It would be handy to be able to explicitly mark up the main content of an article or an aside—for exactly the same reasons that it would be useful to mark up the main content of a document.

So why not?

Mercator Puzzle!

This is fun. Drag the red country outlines around and slot them into place on the map. Sounds easy, right? But the distorting effect of the Mercator projection makes it a lot tougher than it looks.

What’s the deal with copyright and 3D printing? by Michael Weinberg

Michael Weinberg’s follow-up whitepaper to “It will be awesome if they don’t screw it up.”

Stick-n-find

Spimify your household with these bluetooth location stickers. Now you can google your shoes.

Wednesday, January 30th, 2013

Figuring out

A recent simplequiz over on HTML5 Doctor threw up some interesting semantic issues. Although the figure element wasn’t the main focus of the article, a lot of the comments were concerned with how it could and should be used.

This is a subject that has been covered before on HTML5 Doctor. It turns out that we may have been too restrictive in our use of the element, mistaking some descriptive text in the spec for prescriptive instruction:

The element can thus be used to annotate illustrations, diagrams, photos, code listings, etc, that are referred to from the main content of the document, but that could, without affecting the flow of the document, be moved away from that primary content, e.g. to the side of the page, to dedicated pages, or to an appendix.

Steve and Bruce have been campaigning on the HTML mailing list to get the wording updated and clarified.

Meanwhile, in an unrelated semantic question, there was another HTML5 Doctor article a while back about quoting and citing with blockquote and its ilk.

The two questions come together beautifully in a blog post on the newly-redesigned A List Apart documenting this pattern for associating quotations and authorship:

<figure>
 <blockquote>It is the unofficial force—the Baker Street irregulars.</blockquote>
 <figcaption>Sherlock Holmes, <cite>Sign of Four</cite></figcaption>
</figure>

Although, unsurprisingly, I still take issue with the decision in HTML5 not to allow the cite element to apply to people. As I’ve said before, we don’t have to accept this restriction:

Join me in a campaign of civil disobedience against the unnecessarily restrictive, backwards-incompatible change to the cite element.

In which case, we get this nice little pattern combining figure, blockquote, cite, and the hCard microformat, like this:

<figure>
 <blockquote>It is the unofficial force—the Baker Street irregulars.</blockquote>
 <figcaption class="vcard"><cite class="fn">Sherlock Holmes</cite>, <cite>Sign of Four</cite></figcaption>
</figure>

Or like this:

<figure>
 <blockquote>Join me in a campaign of civil disobedience against the unnecessarily restrictive, backwards-incompatible change to the cite element.</blockquote>
 <figcaption class="vcard"><cite class="fn">Jeremy Keith</cite>, <a href="http://24ways.org/2009/incite-a-riot/"><cite>Incite A Riot</cite></a></figcaption>
</figure>

The test

There was once a time when the first thing you would do when you went to visit a newly-launched website was to run its markup through a validator.

Later on that was replaced by the action of bumping up the font size by a few notches—what Dan called the Dig Dug test.

Thanks to Ethan, we all started to make our browser windows smaller and bigger as soon as we visited a newly-launched site.

Now when I go to a brand new site I find myself opening up the “Network” tab in my browser’s developer tools to count the HTTP requests and measure the page weight.
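
If you want those numbers without squinting at the panel, the Resource Timing API can approximate them from the console. A sketch (pageWeight is a made-up helper name; transferSize is a later addition to the API and reads as zero for cached or opaque cross-origin responses, so treat the result as a lower bound):

```javascript
// Approximate what the Network tab shows: request count and total
// transfer size, from an array of PerformanceResourceTiming entries.
function pageWeight(entries) {
  const bytes = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0);
  return { requests: entries.length, kilobytes: Math.round(bytes / 1024) };
}

// In a browser console you would call:
// pageWeight(performance.getEntriesByType('resource'));
```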

Just like old times.

Tuesday, January 29th, 2013

HTML5 in six steps by Andy Hume

You’re probably doing each of these already but just in case you’re not, Andy has listed six quick wins you can get from HTML5.

Monday, January 28th, 2013

Setting a performance budget by Tim Kadlec

Tim talks about the very useful technique of setting a page-weight limit; something that Mark wrote about on the Clearleft blog recently.

Metadesign at Hack Farm by James Box

James’s notes from the most recent Hack Farm show that, even without a finished product, there were a lot of benefits.

On layout and web performance by Kelly Norton

This is handy: a look at which DOM properties and methods cause layout thrashing (reflows).

Performance as design by Brad Frost

Amen, Brad, Amen.

It’s time for us to treat performance as an essential design feature, not just as a technical best practice.

Life as German POW by Carl Lehman

My friend Dan’s stepfather Carl passed away recently, aged 90. His experiences during World War II were quite something.

Saturday, January 26th, 2013

Offscreen 4: I got what I paid for by Jeff Porter

A really nice write-up of issue four of Offscreen magazine, wherein I was featured.

Flickr: ugordan’s Photostream

Gorgeous colour-processed images from NASA probes. I could stare at the fountains of Enceladus all day.

Wednesday, January 23rd, 2013

Actual Facebook Graph Searches

Another Tom Scott project:

I had to take one more quick, cheap shot — and I think a Tumblr blog is the quickest, cheapest shot it’s possible to take.

UX Job Title Generator

Like @jeremysjob, but specifically for UX roles.

Carousel interaction stats by Eric Runyon

I’ve never been a fan of carousels on websites, to put it mildly. It seems I am not alone. And if you doubt the data, ask yourself this: when was the last time you, as a user, interacted with a carousel on any website?

Tuesday, January 22nd, 2013

Owning your own words – is it important?

A fascinating discussion on sharecropping vs. homesteading. Josh Miller from Branch freely admits that he’s only ever known a web where your content is held by someone else. Gina Trapani’s response is spot-on:

For me, publishing on a platform I have some ownership and control over is a matter of future-proofing my work. If I’m going to spend time making something I really care about on the web—even if it’s a tweet, brevity doesn’t mean it’s not meaningful—I don’t want to do it somewhere that will make it inaccessible after a certain amount of time, or somewhere that might go away, get acquired, or change unrecognizably.

When you get old and your memory is long and you lose parents and start having kids, you value your own and others’ personal archive much more.

On the styling of forms by Bruce Lawson

Bruce takes a look at the tricky issue of styling native form controls. Help us, Shadow DOM, you’re our only hope!

Twitter permissions

Twitter has come in for a lot of (justifiable) criticism for changes to its API that make it somewhat developer-hostile. But it has to be said that developers don’t always behave responsibly when they’re using the API.

The classic example of this is the granting of permissions. James summed it up nicely: it’s just plain rude to ask for write-access to my Twitter account before I’ve even started to use your service. I could understand it if the service needed to post to my timeline, but most of the time these services claim that they want me to sign up via Twitter so that I can find my friends who are also using the service — that doesn’t require write access. Quite often, these requests to authenticate are accompanied by reassurances like “we’ll never tweet without your permission” …in which case, why ask for write-access in the first place?

To be fair, it used to be a lot harder to separate out read and write permissions for Twitter authentication. But now it’s actually not that bad, although it’s still not as granular as it could be.

One of the services that used to require write-access to my Twitter account was Lanyrd. I gave it permission, but only because I knew the people behind the service (a decision-making process that doesn’t scale very well). I always felt uneasy that Lanyrd had write-access to my timeline. Eventually I decided that I couldn’t in good conscience allow the lovely Lanyrd people to be an exception just because I knew where they lived. Fortunately, they concurred with my unease. They changed their log-in system so that it only requires read-access. If and when they need write-access, that’s the point at which they ask for it:

We now ask for read-only permission the first time you sign in, and only ask to upgrade to write access later on when you do something that needs it; for example following someone on Twitter from our attendee directory.

Far too many services ask for write-access up front, without providing a justification. When asked for an explanation, I’m sure most of them would say “well, that’s how everyone else does it”, and they would, alas, be correct.

What’s worse is that users grant write-access so freely. I was somewhat shocked by the amount of tech-savvy friends who unwittingly spammed my timeline with automated tweets from a service called Twitter Counter. Their reactions ranged from sheepish to embarrassed to angry.

I urge you to go through your Twitter settings and prune any services that currently have write-access that don’t actually need it. You may be surprised by the sheer volume of apps that can post to Twitter on your behalf. Do you trust them all? Are you certain that they won’t be bought up by a different, less trustworthy company?

If a service asks me to sign up but insists on having write-access to my Twitter account, it feels like being asked out on a date while insisting I sign a pre-nuptial agreement. Not only is it somewhat premature, it shows a certain lack of respect.

Not every service behaves so ungallantly. Done Not Done, 1001 Beers, and Mapalong all use Twitter for log-in, but none of them require write-access up-front.

Branch and Medium are typical examples of bad actors in this regard. The core functionality of these sites has nothing to do with posting to Twitter, but both sites want write-access so that they can potentially post to Twitter on my behalf later on. I know that I won’t ever want either service to do that. I can either trust them, or not use the service at all. Signing up without granting write-access to my Twitter account isn’t an option.

I sent some feedback to Branch and part of their response was to say the problem was with the way Twitter lumps permissions together. That used to be true, but Lanyrd’s exemplary use of Twitter for log-in makes that argument somewhat hollow.

In the case of Branch, Medium, and many other services, Twitter authentication is the only way to sign up and start using the service. Using a username and password isn’t an option. On the face of it, requiring Twitter for authentication doesn’t sound all that different to requiring an email address for authentication. But demanding write-access to Twitter is the equivalent of demanding the ability to send emails from your email address.

The way that so many services unnecessarily ask for write-access to Twitter—and the way that so many users unquestioningly grant it—reminds me of the password anti-pattern all over again. Because this rude behaviour is so prevalent, it has now become the norm. If we want this situation to change, we need to demand more respect.

The next time that a service demands unwarranted write-access to your Twitter account, refuse to grant it. Then tell the people behind that service why you’re refusing to sign up.

And please take a moment to go through the services you’ve already authorised.

Soul Compare: Get the most for your soul.

The latest project from Tom Scott is like many Facebook-authenticated apps that ask you to sell your soul, but this one is literal. I think I might offer my soul (worth 56gigaMorgans) to Cthulhu.

Monday, January 21st, 2013

Long time

A few years back, I was on a road trip in the States with my friend Dan. We drove through Maryland and Virginia to the sites of American Civil War battles—Gettysburg, Antietam. I was reading Tom Standage’s magnificent book The Victorian Internet at the time. When I was done with the book, I passed it on to Dan. He loved it. A few years later, he sent me a gift: a glass telegraph insulator.

Glass telegraph insulator from New York

Last week I received another gift from Dan: a telegraph key.

Telegraph key

It’s lovely. If my knowledge of basic electronics were better, I’d hook it up to an Arduino and tweet with it.

Dan came over to the UK for a visit last month. We had a lovely time wandering around Brighton and London together. At one point, we popped into the National Portrait Gallery. There was one painting he really wanted to see: the portrait of Samuel Pepys.

Pepys

“Were you reading the online Pepys diary?”, I asked.

“Oh, yes!”, he said.

“I know the guy who did that!”

The “guy who did that” is, of course, the brilliant Phil Gyford.

Phil came down to Brighton and gave a Skillswap talk all about the ten-year long project.

The diary of Samuel Pepys: Telling a complex story online on Huffduffer

Now Phil has restarted the diary. He wrote a really great piece about what it’s like overhauling a site that has been online for a decade. Given that I spent a lot of my time last year overhauling The Session (which has been online in some form or another since the late nineties), I can relate to his perspective on trying to choose long-term technologies:

Looking ahead, how will I feel about this Django backend in ten years’ time? I’ve no idea what the state of the platform will be in a decade.

I was thinking about switching The Session over to Django, but I decided against it in the end. I figured that the pain involved in trying to retrofit an existing site (as opposed to starting a brand new project) would be too much. So the site is still written in the very uncool LAMP stack: Linux, Apache, MySQL, and PHP.

Mind you, Marco Arment makes the point in his Webstock talk that there’s a real value to using tried and tested “boring” technologies.

One area where I’ve found myself becoming increasingly wary over time is the use of third-party APIs. I say that with a heavy heart—back at dConstruct 2006 I was talking all about The Joy of API. But Yahoo, Google, Twitter …they’ve all deprecated or backtracked on their offerings to developers.

Anyway, this is something that has been on my mind a lot lately: evaluating technologies and services in terms of their long-term benefit instead of just their short-term hit. It’s something that we need to think about more as developers, and it’s certainly something that we need to think about more as users.

Compared with genuinely long-term projects like the 10,000-year Clock of the Long Now, making something long-lasting on the web shouldn’t be all that challenging. The real challenge is acknowledging that this is even an issue. As Phil puts it:

I don’t know how much individuals and companies habitually think about this. Is it possible to plan for how your online service will work over the next ten years, never mind longer?

As my Long Bet illustrates, I can be somewhat pessimistic about the longevity of our web creations:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

But I really hope I lose that bet. Maybe I’ll suggest to Matt (my challenger on the bet) that we meet up on February 22nd, 2022 at the Long Now Salon. It doesn’t exist yet. But give it time.

Speed up your site using prefetching by Jon Fox

More details on DNS prefetching, page prefetching and, controversially, page pre-rendering.
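
For reference, the hints discussed in the article are just link elements in the head of the page; something like this, with placeholder hostnames and paths:

```html
<!-- Resolve a third-party hostname early -->
<link rel="dns-prefetch" href="//cdn.example.com">
<!-- Fetch a resource the user is likely to need next -->
<link rel="prefetch" href="/next-page.html">
<!-- Speculatively render a whole page in the background -->
<link rel="prerender" href="http://example.com/next-page.html">
```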

Front-end performance for web designers and front-end developers by Harry Roberts

A really good introduction to front-end performance techniques. Most of this was already on my radar, but I still picked up a handy tip or two (particularly about DNS prefetching).

At this stage it should go without saying that you should be keeping up with this kind of thing: performance is really, really, really important.

The impending crisis that is Windows XP and IE 8 by Troy Hunt

A good explanation of the litany of woes that comes from Internet Explorer 8 being the highest that users of Windows XP can upgrade to. It’s a particularly woeful situation if you are a web developer attempting to provide parity. But there is hope on the horizon:

2013 will see the culmination of all these issues; support for IE 8 will drop off rapidly, users of XP will find an increasingly broken web, the cost of building software in XP organisations will increase.

Foreword to DOM Enlightenment by Cody Lindley

The foreword to the O’Reilly book.

Sunday, January 20th, 2013

The Last Pictures: Contemporary pessimism and hope for the future by Paul Gilster

From the cave paintings at Lascaux to the Pioneer plaques and Voyager golden records to Trevor Paglen’s “The Last Pictures” project, Paul Gilster examines the passage and preservation of art and information through time. Fascinating.

Or perhaps, as Paglen envisions, those who find a Pioneer Plaque, a Voyager Record, or one of our electromagnetic transmissions will be interested enough to search us out, coming upon a future Earth where all that is left of humanity are our terrestrial ruins and that artificial ring of geosynchronous satellites, with one of them having a particular golden artifact bolted to its pitted hull. In that scenario, about all that would be left for the visiting ETI to do in terms of learning about us would be grand-scale dumpster diving.

Come at me!

A wonderful collection of misconceptions, often the result of being myzelled when young.

Friday, January 18th, 2013

The importance of HTML5 sectioning elements by Heydon Pickering

A good explanation of HTML5’s sectioning content and outline algorithm.

Interview with Lauren Beukes about Shining Girls

Lauren talks about The Shining Girls and the tools she uses to write with.

The paradoxes of time travel by David Lewis

A well-written white paper on time travel. Alas, it relies a bit too much on semantic nitpickery to offer any real insight.

DNA molecule plush on Think Geek

Brilliant little magnetic cuddly nucleobases from Jun. You get all four bases to combine to your heart’s content: cytosine, guanine, adenine, thymine — take that, Pokémon.

2012 Project Google Birdhouse - a set on Flickr

I’ve been thinking about getting a birdhouse.


To Be Today

A beautiful project from Brendan and the Royal Shakespeare Company: the headlines of today preceded by quotes from The Bard.

Bargain Bin Blasphemy

I think there might be some subliminal messages hidden in these album covers.

Thursday, January 17th, 2013

A question of time

Some of the guys at work occasionally provide answers to .net magazine’s “big question” feature. When they told me about the latest question that landed in their inboxes, I felt I just had to stick my oar in and provide my answer.

I’m publishing my response here, so that if they decide not to publish it in the magazine or on the website (or if they edit it down), I’ve got a public record of my stance on this very important topic.

The question is:

If you could send a message back to younger designer or developer self, what would it say? What professional advice would you give a younger you?

This is my answer:

Rather than send a message back to my younger self, I would destroy the message-sending technology immediately. The potential for universe-ending paradoxes is too great.

I know that it would be tempting to give some sort of knowledge of the future to my younger self, but it would be the equivalent of attempting to kill Hitler—that never ends well.

Any knowledge I supplied to my past self would cause my past self to behave differently, thereby either:

  1. destroying the timeline that my present self inhabits (assuming a branching many-worlds multiverse) or
  2. altering my present self, possibly to the extent that the message-sending technology never gets invented. Instant paradox.

But to answer your question, if I could send a message back to a younger designer or developer self, the professional advice I would give would be:

Jeremy,

When, at some point in the future, you come across the technology capable of sending a message like this back to your past self, destroy it immediately!

But I know that you will not heed this advice. If you did, you wouldn’t be reading this.

On the other hand, I have no memory of ever receiving this message, so perhaps you did the right thing after all.

Jeremy

Clearleft.com past and present

We finally launched the long-overdue redesign of the Clearleft website last week. We launched it late on Friday afternoon, because, hey! that’s not a stupid time to push something live or anything.

The actual moment of launch was initiated by Josh who had hacked together a physical launch button containing a teensy USB development board.

The launch button
Preparing to launch

So nerdy.

Mind you, just because the site is now live doesn’t mean the work is done. Far from it, as Paul pointed out:

But it’s nice to finally have something new up. We were all getting quite embarrassed by the old site.

Still, rather than throw the old design away and never speak of it again, we’ve archived it. We’ve archived every iteration of the site:

  1. Version 1 launched in 2005. I wrote about it back then. It looked very much of its time. This was before responsive design, but it was, of course, nice and liquid.
  2. Version 2 came a few years later. There were some little bits I liked about it, but it always felt a bit “off”.
  3. Version 3 was more of a re-alignment than a full-blown redesign: an attempt to fix some of the things that felt “off” about the previous version.
  4. Version 4 is where we are now. We don’t love it, but we don’t hate it either. Considering how long it took to finally get this one done, we should probably start planning the next iteration now.

I’m glad that we’ve kept the archived versions online. I quite enjoy seeing the progression of the visual design and the technologies used under the hood.

Song blogging: Files That Last

I hereby declare that this song is my official anthem.

I want some files that last, data that will not stray.

Files just as fresh tomorrow as they were yesterday.

Wednesday, January 16th, 2013

Implementing off-canvas navigation for a responsive website by David Bushell

This off-canvas demo is a great practical example of progressive enhancement from David. It’s also a lesson in why over-reliance on jQuery can sometimes be problematic.

Tuesday, January 15th, 2013

The Panasonic Toughpad Press Conference - LOOK, ROBOT

Now this is what I call tech reporting.

The women leave the stage, wet computer in hand, and a new man takes the stage. He plays a schmaltzy video where Portuguese children teach adults to use Windows 8 accompanied by a hyperloud xylophone soundtrack that slices through my hangover like cheesewire through lukewarm gouda.

Monday, January 14th, 2013

The responsive web will be 99.9% typography

An intriguing extrapolation of current design trends: perhaps typographically-strong single-column layouts will become popular out of sheer necessity.

Sunday, January 13th, 2013

Out of Eden — A Walk Through Time

I like this idea of slow journalism: taking seven years to tell a story.

Snapshot Serengeti

The latest project from Zooniverse is, as you would expect, an extremely enjoyable and useful way to spend your time: classifying animals that have been captured in camera trap images.

The opening tutorial is a lesson in how to do “on-boarding” right.

Friday, January 11th, 2013

The changing face of computers on screen

A look at the depiction of computer hardware and peripherals in sci-fi movies over time.

Flickr, codeswarming

A beautiful timelapse visualisation of code commits to Flickr from 2004 to 2011.

Form Follows Function

A gorgeous collection of experiments that showcase just how much is possible in browsers today.

Dealing with IE again

People have been linking to—and saying nice things about—my musings on dealing with Internet Explorer. Thank you. You’re very kind. But I think I should clarify a few things.

If you’re describing the techniques I showed (using Sass and conditional comments) as practical or useful, I appreciate the sentiment but personally I wouldn’t describe them as either. Jake’s technique is genuinely useful and practical.

I wasn’t really trying to provide any practical “take-aways”. I was just thinking out loud. The only real point to my ramblings was at the end:

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

My point is that we really shouldn’t have to do this. And, in fact, we don’t have to do this. We choose to do this.

Take the particular situation I was describing with a user of The Session who was using IE8 on Windows XP with a monitor set to 800x600 pixels. A lot of people picked up on this observation:

As a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

But here’s the thing: that person only started to experience problems when I chose to try to “help” IE8 users. If I had correctly treated IE8 as the legacy browser that it is, those users would have received the baseline experience …which was absolutely fine. Not great, but fine. But instead, I decided to jump in with my hacks, my preprocessor, my conditional comments, and worst of all, my assumptions about the viewport size.

In this case, I only have myself to blame. This is a personal project so I’m the client. I decided that I wanted to give IE8 and IE7 users the same kind of desktop navigation that more modern browsers were getting. All the subsequent pain for me as the developer, and for the particular user who had problems, is entirely my fault. If you’re working at a company where your boss or your client insists on parity for IE8 or IE7, I guess you can point the finger at them.

My point is: all the problems and workarounds that I talked about in that post were the result of me trying to crowbar modern features into a legacy browser. Now, don’t get me wrong—I’m not suggesting that IE8 or IE7 should be shut out or get a crap experience: “baseline” doesn’t mean “crap”. There’s absolutely nothing wrong with serving up a baseline experience to a legacy browser as long as your baseline experience is pretty good …and it should be.

So, please, don’t think that my post was a hands-on, practical example of how to give IE8 and IE7 users a similar experience to modern browsers. If anything, it was a cautionary tale about why trying to do that is probably a mistake.
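
To be clear about the mechanics under discussion: a common shape for serving legacy IE a separate, media-query-free stylesheet uses conditional comments like these (filenames are placeholders, and this isn’t necessarily the exact code from my earlier post):

```html
<!-- Old IE gets a fixed-width stylesheet with the media queries flattened out -->
<!--[if lte IE 8]>
  <link rel="stylesheet" href="old-ie.css">
<![endif]-->
<!-- Everyone else gets the responsive styles; old IE ignores this block -->
<!--[if gt IE 8]><!-->
  <link rel="stylesheet" href="styles.css">
<!--<![endif]-->
```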

Thursday, January 10th, 2013

Responsive web design interview series: Trent Walton & Jeremy Keith

Trent and I answered a few questions for the Responsive Design Weekly newsletter.

Sparkicons and the humble hyperlink by Mark Boulton

I really like Mark’s idea of standardised “sparkicons” …for a while there, reading this, I was worried he was going to propose something like Snap Preview. shudder

Responsive Day Out updates

The Responsive Day Out is just seven weeks away. That’s only 49 sleeps!

Alas, Malarkey has had to drop out. Sad face. But, I’ve managed to find another talented web geek from Corby: Tom Maslen of Responsive News fame. Happy face!

Here’s another bit of news: there will be an after-party. Rejoice! Thanks to the generosity of the fine people behind Gridset—that would be Mark Boulton Design then—we’ll be heading to The Loft on Ship Street in the evening to have a few drinks and a good ol’ chinwag. That’s where Remy had the Full Frontal after-party and I thought it worked really, really well. Instead of shouting over blaring music, you could actually have a natter with people.

Many, many thanks to Gridset for making this possible.

In other news of generosity, Drew has offered to work his podcasting magic. Thank you, Drew! And Craig has very kindly offered to record and host video of the event, made possible with sponsorship from Mailchimp. Thank you, Craig! Thank you, Mailchimp!

And get this: A Book Apart are also getting behind the day and they’ll be sending on some books that I plan to give away to attendees who ask questions during the discussiony bits. Thank you, my friends apart!

I could still do with just one more sponsor: I’d really like to hire out the Small Batch coffee cart for the day, like we did at dConstruct. There will be coffee and tea available from the Dome bar anyway, but the coffee is pretty awful. Even though the Responsive Day Out is going to be a very shoestring, grassroots affair, I’d still like to maintain some standards. So if you know of a company that might be interested in earning the gratitude of a few hundred web geeks, let them know about this (it’s around a grand to caffeinate a conference’s worth of geeks for a day).

If you’ve got your ticket, great; I look forward to seeing you on March 1st. If you don’t, sorry. And before you ask, no, I’m afraid there is no waiting list. We’re not doing any refunds or transfers—if someone with a ticket can’t make it, they can simply give (or sell) their ticket to someone else. We won’t be making any lanyards so we don’t need to match up people to name badges.

So keep an eye on Twitter (especially closer to the day) in case anyone with a ticket is planning to bail.

Tapeworm logic by Charles Stross

The Fermi paradox as applied to tapeworms.

‘Sfunny; just yesterday I was revisiting this classic tapeworm tale on Fray.

Wednesday, January 9th, 2013

Dealing with IE

Laura asked a question on Twitter the other day about dealing with older versions of Internet Explorer when you’ve got your layout styles nested within media queries (that older versions of IE don’t understand):

It’s a fair question. It also raises another question: how do you define “dealing with” Internet Explorer 8 or 7?

You could justifiably argue that IE7 users should upgrade their damn browser. But that same argument doesn’t really hold for IE8 if the user is on Windows XP: IE8 is as high as they can go. Asking users to upgrade their browser is one thing. Asking them to upgrade their operating system feels different.

But this is the web and websites do not need to look the same in every browser. Is it acceptable to simply give Internet Explorer 8 the same baseline experience that any other old out-of-date browser would get? In other words, is it even a problem that older versions of Internet Explorer won’t parse media queries? If you’re building in a mobile-first way, they’ll get linearised content with baseline styles applied.
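That mobile-first pattern is worth sketching: the baseline styles sit outside any media query, so every browser applies them, while the layout enhancements are gated behind min-width queries that old versions of IE simply ignore (the selectors below are purely illustrative, not taken from any actual stylesheet):

```css
/* Baseline: every browser gets this, including IE8 and older */
body {
    font: 100%/1.5 sans-serif;
    padding: 1em;
}

/* Enhancement: only browsers that understand media queries get this */
@media all and (min-width: 30em) {
    .content {
        float: left;
        width: 65%;
    }
    .sidebar {
        float: right;
        width: 30%;
    }
}
```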

That’s the approach that Alex advocates in the Q&A after his excellent closing keynote at Fronteers. That’s what I’m doing here on adactio.com. Users of IE8 get the linearised layout and that’s just fine. One of the advantages of this approach is that you are then freed up to use all sorts of fancy CSS within your media query blocks without having to worry about older versions of IE crapping themselves.

On other sites, like Huffduffer, I make an assumption (always a dangerous thing to do) that IE7 and IE8 users are using a desktop or laptop computer and so they could get some layout styles. I outlined that technique in a post about Windows mobile media queries. Using that technique, I end up splitting my CSS into two files:

<link rel="stylesheet" href="/css/global.css" media="all">
<link rel="stylesheet" href="/css/layout.css" media="all and (min-width: 30em)">
<!--[if (lt IE 9) & (!IEMobile)]>
<link rel="stylesheet" href="/css/layout.css" media="all">
<![endif]-->

The downside to this technique is that now there are two HTTP requests for the CSS …even for users of modern browsers. The alternative is to maintain one stylesheet for modern browsers and a separate stylesheet for older versions of Internet Explorer. That sounds like a maintenance nightmare.

Pre-processors to the rescue. Using Sass or LESS you can write your CSS in separate files (e.g. one file for basic styles and another for layout styles) and then use the preprocessor to combine those files in two different ways: one with media queries (for modern browsers) and another without media queries (for older versions of Internet Explorer). Or, if you don’t want to have your media query styles all grouped together, you can use Jake’s excellent method.
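Broadly speaking, that mixin-based approach works by wrapping each media query in a Sass mixin and using a flag to flatten the output for the legacy stylesheet. Here’s a rough sketch—the variable and mixin names are my own, and this is a simplification rather than Jake’s exact code:

```scss
$old-ie: false !default;

@mixin at-least($width) {
    @if $old-ie {
        // Legacy build: output the rules flat, with no media query wrapper
        @content;
    } @else {
        @media all and (min-width: $width) {
            @content;
        }
    }
}

// The enhanced rules live right alongside the selectors they enhance:
.nav {
    @include at-least(30em) {
        float: left;
    }
}
```

The legacy stylesheet is then just two lines—`$old-ie: true;` followed by an `@import` of the main file—and its compiled output contains the same rules without any media query blocks. (This relies on `@content`, which arrived in Sass 3.2.)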

When I relaunched The Session last month, I initially just gave Internet Explorer 8 and lower the linearised content—the same layout that small-screen browsers would get. For example, the navigation is situated at the bottom of each page and you get to it by clicking an internal link at the top of each page. It all worked fine and nobody complained.

But I thought that it was a bit of a shame that users of IE8 and IE7 weren’t getting the same navigation that users of other desktop browsers were getting. So I decided to use a preprocessor (Sass in this case) to spit out an extra stylesheet for IE8 and IE7.

So let’s say I’ve got .scss files like this:

  • base.scss
  • medium.scss
  • wide.scss

Then in my standard .scss file that’s going to generate the CSS for all browsers (called global.css), I can write:

@import "base";
@media all and (min-width: 30em) {
 @import "medium";
}
@media all and (min-width: 50em) {
 @import "wide";
}

But I can also generate a stylesheet for IE8 and IE7 (called legacy.css) that calls in those layout styles without the media query blocks:

@import "medium";
@import "wide";

IE8 and IE7 will be downloading some styles twice (all the styles within media queries) but in this particular case, that doesn’t amount to too much. Oh, and you’ll notice that I’m not even going to try to let IE6 parse those styles: it would do more harm than good.
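To make that concrete, suppose medium.scss contained a single rule (the selector here is just for illustration); the two compiled stylesheets would differ only in the wrapping:

```css
/* In global.css — wrapped in the media query */
@media all and (min-width: 30em) {
    .navigation {
        float: left;
    }
}

/* In legacy.css — the same rule, unwrapped */
.navigation {
    float: left;
}
```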

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/legacy.css">
<![endif]-->

So I did that (although I don’t really have .scss files named “medium” or “wide”—they’re actually given names like “navigation” or “columns” that more accurately describe what they do). I thought I was doing a good deed for any users of The Session who were still using Internet Explorer 8.

But then I read this. It turned out that someone was not only using IE8 on Windows XP, but they had their desktop’s resolution set to 800x600. That’s an entirely reasonable thing to do if your eyesight isn’t great. And, like I said, I can’t really ask him to upgrade his browser because that would mean upgrading the whole operating system.

Now there’s a temptation here to dismiss this particular combination of old browser + old OS + narrow resolution as an edge case. It’s probably just one person. But that one person is a prolific contributor to the site. This situation nicely highlights the problem of playing the numbers game: as a percentage, this demographic is tiny. But this isn’t a number. It’s a person. That person matters.

The root of the problem lay in my assumption that IE8 or IE7 users would be using desktop or laptop computers with a screen size of at least 1024 pixels. Serves me right for making assumptions.

So what could I do? I could remove the conditional comments and the IE-specific stylesheet and go back to just serving the linearised content. Or I could serve up just the medium-width styles to IE8 and IE7.

That’s what I ended up doing but I also introduced a little bit of JavaScript in the conditional comments to serve up the widescreen styles if the browser width is above a certain size:

<link rel="stylesheet" href="/css/global.css">
<!--[if (lt IE 9) & (!IEMobile) & (gt IE 6)]>
<link rel="stylesheet" href="/css/medium.css">
<script>
if (document.documentElement.clientWidth > 800) {
 document.write('<link rel="stylesheet" href="/css/wide.css">');
}
</script>
<![endif]-->

It works …I guess. It’s not optimal but at least users of IE8 and IE7 are no longer just getting the small-screen styles. It’s a hack, and not a particularly clever one.

Was it worth it? Is it an improvement?

I think this is something to remember when we’re coming up with solutions to “dealing with” older versions of Internet Explorer: whether it’s a dumb solution like mine or a clever solution like Jake’s, we shouldn’t have to do this. We shouldn’t have to worry about IE7 just like we don’t have to worry about Netscape 4 or Mosaic or Lynx; we should be free to build according to the principles of progressive enhancement safe in the knowledge that older, less capable browsers won’t get all the bells and whistles, but they will be able to access our content. Instead we’re spending time coming up with hacks and polyfills to deal with one particular family of older, less capable browsers simply because of their disproportionate market share.

When we come up with clever hacks and polyfills for dealing with older versions of Internet Explorer, we shouldn’t feel pleased about it. We should feel angry.

Update: I’ve written a follow-up post to clarify what I’m talking about here.

Tuesday, January 8th, 2013

Interview with Ian Hickson, HTML editor on HTML5 Doctor

Bruce sits down for a chat with Hixie. This is a good insight into the past and present process behind HTML.

Issue 4 — Offscreen Magazine

There’s an interview with me in the new issue of Offscreen Magazine. Some sort of clerical error, I’m guessing.

Monday, January 7th, 2013

New year, old year

2013 is one week old. This time of transition from one calendar year to another is the traditional time to cast an eye back over the year that has just come to a close. Who am I to stand in the way of tradition?

2012 was quite a jolly year for me. Nothing particularly earth-shattering happened, and that’s just fine with me. I made some websites. I did some travelling. It was grand.

I really enjoyed working on Matter by day and hacking away at relaunching The Session by night.

The trip to New Zealand at the start of 2012 was great. Not only was Webstock a great conference (and I’m very happy with the talk I gave, Of Time And The Network), but the subsequent road trip with Jessica, Ethan, Liz, Erin and Peter was a joyous affair.

Thinking about it, I went to lovely new places in 2012 like Newfoundland and Oslo as well as revisiting New York, Austin, Chicago, and Freiburg. And I went to CERN, which was a great experience.

But the highlight of my year was undoubtedly the first week of September right here in Brighton. The combination of Brighton SF followed by dConstruct was simply amazing. I feel very privileged to have been involved in both events. I’m still pinching myself.

Now it’s 2013, and I’m already starting to plan this year’s dConstruct: be sure to put Friday, September 6th, 2013 in your calendar. Before that, I’ve got the Responsive Day Out—more on that soon. I’ve got some speaking engagements lined up, mostly in the States in the latter half of the year at An Event Apart. Interestingly, apart from compering dConstruct and BrightonSF, I didn’t speak at all in the UK in 2012—the last talk I gave in the UK was All Our Yesterdays at Build 2011.

I’m going to continue hacking away on Huffduffer and The Session whenever I can in 2013. I find those personal projects immensely rewarding. I’m also hoping to have time to do some more writing.

Let’s see what this year brings.

Olaf Stapledon

The out-of-copyright books of Olaf Stapledon are available to download from the University of Adelaide. Be sure to grab Star Maker and Last And First Men.

Sunday, January 6th, 2013

phuu/sparksvg · GitHub

Remember when I made that canvas sparkline script? Remember when Stuart granted my wish for an SVG version? Well, now Tom has gone one further and created a hosted version at sparksvg.me.

Not a fan of sparklines? Bars and circles are also available.

Mark Lynas » Lecture to Oxford Farming Conference, 3 January 2013

This is a superb talk by Mark Lynas who once spearheaded the anti-GM movement, and who has now completely changed his stance on genetically-modified crops. Why? Science.

You are more likely to get hit by an asteroid than to get hurt by GM food. More to the point, people have died from choosing organic, but no-one has died from eating GM.

Saturday, January 5th, 2013

Science Hackday Dublin

Dublin is going to play host to its second Science Hack Day at the start of March. It looks like it’s going to be a fantastic event (again!) but they need sponsors. Do you know of any?

Friday, January 4th, 2013

Squaremans | Gaming Culture Blog

Ostensibly about gaming (and written by Matt Colville who works in the games industry), this blog actually has a lot of interesting observations on sci-fi cinema. I like it.

30 Great Reads from 2012 - Readlists

Some of the past year’s best long-form non-fiction, gathered together into a handy readlist for your portable epub pleasure.

Front-end London - A New London Based Meetup for Front-end Developers, hosted by Made by Many

This looks like being an excellent (free) event in London featuring three talks related to front-end web development.

The inaugural event this month features a talk on responsive design, a talk on data visualisation, and a talk on accessibility.

All you need to know about CSS Transitions | Alex MacCaw

An in-depth look at CSS transitions with some handy tips for improving performance.

» Responsive Design for Apps — Part 1 Cloud Four Blog

A great piece by Jason analysing the ever-blurring lines between device classes.

Mind you, there is one question he doesn’t answer which would help clear up his framing of the situation. That question is:

What’s a web app?

Thursday, January 3rd, 2013

Schedule | Breaking Development Dallas 2012: Web design and development for beyond the desktop

All the videos from last year’s Breaking Development conference in Dallas are up on the site. They’re all excellent.

Medaler.com

I now have a visualisation of my public data in the form of a 3D-printed snowflake, thanks to Medaler.

CABINET // Trap Streets

A fascinating piece by James on trap streets, those fictitious places on maps that have no corresponding territory.

Wednesday, January 2nd, 2013

The Pastry Box Project | 2 January 2013, baked by Chris Coyier

I heartily concur with Chris’s sentiment:

I wish everyone in the world would blog.

PlaceIt by Breezi - Generate Product Screenshots in Realistic Environments

A cute little service for mocking up pictures of your site being used on different devices. Just drag and drop a screenshot on to an image.

Jackdaws love my big sphinx of quartz - Stuff

The biggest plot holes of World War Two.

Warning: contains spoilers.