Tuesday, November 5th, 2019
This isn’t a “the web is doomed, DOOMED, I tells ya” kind of blog post. It’s more in the “the web in its current form isn’t sustainable and will collapse into a simpler, more sustainable form, possibly several” genre.
Baldur points to the multiple causes of the web’s current quagmire.
I honestly have no idea on how to mitigate this harm or even how long the decline is going to take. My hope is that if we can make the less complex, more distributed aspects of the web safer and more robust, they will be more likely to thrive when the situation has forced the web as a whole to break up and simplify.
Monday, March 18th, 2019
A good talk from Chris Ferdinandi, who says:
Saturday, December 29th, 2018
And they all have.
And they are all different.
Read this talk transcript, and even if you don’t agree with everything in it today, you may end up coming back to it in the future. He’s playing the long game:
The web is now the way we distribute information. We will need the web pages we create now to be readable in 100 years’ time, just as we can still read 100-year-old books.
Monday, September 10th, 2018
Robustness and least power
There’s a great article by Steven Garrity over on A List Apart called Design with Difficult Data. It runs through the advantages of using unusual content to stress-test interfaces, referencing Postel’s Law, AKA the robustness principle:
Be conservative in what you send, be liberal in what you accept.
Even though the robustness principle was formulated for packet-switching, I see it at work in all sorts of disciplines, including design. A good example is in best practices for designing forms:
Every field you ask users to fill out requires some effort. The more effort is needed to fill out a form, the less likely users will complete the form. That’s why the foundational rule of form design is shorter is better — get rid of all inessential fields.
In other words, be conservative in the number of form fields you send to users. But then, when it comes to users filling in those fields:
It’s very common for a few variations of an answer to a question to be possible; for example, when a form asks users to provide information about their state, and a user responds by typing their state’s abbreviation instead of the full name (for example, CA instead of California). The form should accept both formats, and it’s the developer’s job to convert the data into a consistent format.
In other words, be liberal in what you accept from users.
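As a rough sketch of what that liberal acceptance could look like in JavaScript (the function name is invented for illustration, and the state table is deliberately incomplete):

```javascript
// Accept either a state's abbreviation or its full name, in any casing,
// and normalise to one consistent format for storage.
const STATES = {
  CA: 'California',
  MA: 'Massachusetts',
  NY: 'New York'
};

function normaliseState(input) {
  const value = input.trim();
  // Accept an abbreviation: "CA", "ca"…
  const fullName = STATES[value.toUpperCase()];
  if (fullName) {
    return fullName;
  }
  // …or a full name: "california", "New York".
  const match = Object.values(STATES).find(
    (name) => name.toLowerCase() === value.toLowerCase()
  );
  // Fall back to what the user typed rather than rejecting it outright.
  return match || value;
}

normaliseState('ca');       // "California"
normaliseState('New York'); // "New York"
```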
I find the robustness principle to be an immensely powerful way of figuring out how to approach many design problems. When it comes to figuring out what specific tools or technologies to use, there’s an equally useful principle: the rule of least power:
Choose the least powerful language suitable for a given purpose.
On the face of it, this sounds counter-intuitive; why forego a powerful technology in favour of something less powerful?
Well, power comes with a price. Powerful technologies tend to be more complex, which means they can be trickier to use and trickier to swap out later.
In the web front-end stack — HTML, CSS, JS, and ARIA — if you can solve a problem with a simpler solution lower in the stack, you should. It’s less fragile, more foolproof, and just works.
- Instead of using ARIA to give a certain role value to a span, try to use a more suitable HTML element instead.
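For example, a hypothetical before-and-after of that suggestion:

```html
<!-- More power than the job needs: a span that has to be taught to act
     like a button with ARIA, tabindex, and (not shown) keyboard scripting. -->
<span role="button" tabindex="0">Save</span>

<!-- Least power: a real button gets focus, keyboard activation,
     and the correct role for free. -->
<button type="button">Save</button>
```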
It sounds a lot like the KISS principle: Keep It Simple, Stupid. But whereas the KISS principle can be applied within a specific technology—like keeping your CSS manageable—the rule of least power is all about evaluating technology; choosing the most appropriate technology for the task at hand.
There are some associated principles, like YAGNI: You Ain’t Gonna Need It. That helps you avoid picking a technology that’s too powerful for your current needs, but which might be suitable in the future: premature optimisation. Or, as Rachel put it, stop solving problems you don’t yet have:
So make sure every bit of code added to your project is there for a reason you can explain, not just because it is part of some standard toolkit or boilerplate.
There’s no shortage of principles, laws, and rules out there, and I find many of them very useful, but if I had to pick just two that are particularly applicable to my work, they would be the robustness principle and the rule of least power.
After all, if they’re good enough for Tim Berners-Lee…
Thursday, April 5th, 2018
This is absolutely brilliant!
Forgive my excitement, but this transcript of Charlie’s talk is so, so good—an equal mix of history and practical advice. Once you’ve read it, share it. I want everyone to have the pleasure of reading this inspiring piece!
It is this flirty declarative nature that makes HTML so incredibly robust. Just look at this video. It shows me pulling chunks out of the Amazon homepage as I browse it, while the page continues to run.
Let’s just stop and think about that, because we take it for granted. I’m pulling chunks of code out of a running computer application, AND IT IS STILL WORKING.
Just how… INCREDIBLE is that? Can you imagine pulling random chunks of code out of the memory of your iPhone or Windows laptop, and still expecting it to work? Of course not! But with HTML, it’s a given.
Wednesday, February 7th, 2018
This is the rarely-seen hour-long version of my Resilience talk. It’s the director’s cut, if you will, featuring an Arthur C. Clarke sub-plot that goes from the telegraph to the World Wide Web to the space elevator.
Tuesday, May 17th, 2016
Marvellous insights from Mark on how the robustness principle can and should be applied to styleguides and pattern libraries (‘sfunny—I was talking about Postel’s Law just this morning at An Event Apart in Boston).
Being liberal in accepting things into the system, and being liberal about how you go about that, ensures you don’t police the system. You collaborate on it.
So, what about the output? Remember: be ‘conservative in what you do’. For a design system, this means your output of the system – guidelines, principles, design patterns, code, etc etc. – needs to be clear, unambiguous, and understandable.
Thursday, July 9th, 2015
The full text of Jason’s talk at this year’s CSS Summit. It’s a great read, clearing up many of the misunderstandings around progressive enhancement and showing some practical examples of it working at each level of the web’s technology stack.
Friday, April 24th, 2015
And that’s why you always use progressive enhancement!
Thursday, April 23rd, 2015
A profile of a legend.
Sunday, April 12th, 2015
I like this nice straightforward approach. Instead of jumping into the complexities of the final interactive component, Chris starts with the basics and layers on the complexity one step at a time, thereby creating a more robust solution.
If I had one small change to suggest, maybe aria-label might work better than offscreen text for the controls …as documented by Heydon.
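Roughly, the difference looks like this (the label text is invented, and the visually-hidden class assumes the usual accessible-hiding CSS elsewhere):

```html
<!-- Offscreen text: hidden visually, announced by screen readers. -->
<button type="button">
  <span class="visually-hidden">Show transcript</span>
</button>

<!-- aria-label: the same accessible name without the extra element. -->
<button type="button" aria-label="Show transcript"></button>
```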
Monday, May 30th, 2011
Hashbangs. Yes, again. This is important, dammit!
There seems to be a general consensus that hashbang URLs are bad. Even those defending the practice portray them as a necessary evil. That is, once a better solution is viable—like the HTML5 History API—then there will no longer be any need for #! in URLs. I’m certain that it’s a matter of when, not if, Twitter switches over.
But even then, that won’t be the end of the story.
There’s no such thing as a temporary fix when it comes to URLs. If you introduce a change to your URL scheme, you are stuck with it for the foreseeable future. You may internally change your links to fit your new URL scheme but you have no control over the rest of the web that links to your content.
Therein lies the rub. Even if—nay when—Twitter switch over to proper URLs, there will still be many, many blog posts and other documents linking to individual tweets …and each of those links will contain #!. That means that Twitter must make sure that their home page maintains a client-side routing mechanism for inbound hashbang links (remember, the server sees nothing after the # in a URL, so all of that routing has to happen in the browser).
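That kind of shim would have to stay on the home page indefinitely. As a hypothetical sketch (not Twitter’s actual code), it might look something like this:

```javascript
// Hypothetical sketch of a permanent client-side shim for inbound
// hashbang links. The server never sees the fragment, so only script
// running on the page can turn
// https://twitter.com/#!/jack/status/20 into /jack/status/20.
if (location.hash.slice(0, 2) === '#!') {
  location.replace(location.hash.slice(2));
}
```

Every visitor arriving via an old link pays the price of loading the page before that redirect can even happen: exactly the fragility that real URLs avoid.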
Sunday, May 29th, 2011
A superb post by Dan on the bigger picture of what’s wrong with hashbang URLs. Well written and well reasoned.
Thursday, February 10th, 2011
Tim Bray calmly explains why hash-bang URLs are a very bad idea.
This is what we call “tight coupling” and I thought that anyone with a Computer Science degree ought to have been taught to avoid it.
Wednesday, February 9th, 2011
I wrote a little while back about my feelings on hash-bang URLs:
I feel so disappointed and sad when I see previously-robust URLs swapped out for the fragile #! fragment identifiers. I find it hard to articulate my sadness…
It would appear that hash-bang usage is on the rise, despite the fact that it was never intended as a long-term solution. Instead, the pattern (or anti-pattern) was intended as a last resort for crawling Ajax-obfuscated content:
So the #! URL syntax was especially geared for sites that got the fundamental web development best practices horribly wrong, and gave them a lifeline to getting their content seen by Googlebot.
I’m always surprised when I come across a site that deliberately chooses to make its content harder to access.
This is no accident. The web stack is rooted in Postel’s law. If you serve an HTML document to a browser, and that document contains some tags or attributes that the browser doesn’t understand, the browser will simply ignore them and render the document as best it can. If you supply a style sheet that contains a selector or rule that the browser doesn’t recognise, it will simply pass it over and continue rendering.
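You can see this for yourself with a document full of things no browser understands (the element, attribute, and pseudo-class below are all made up):

```html
<style>
  /* The unrecognised pseudo-class makes this one rule invalid,
     so the browser drops it… */
  p:imaginary-pseudo-class { color: red; }
  /* …but carries on with the rest of the style sheet. */
  p { color: green; }
</style>

<!-- An unknown element and attribute are ignored, not fatal. -->
<made-up-element data-imaginary="yes">
  <p>This paragraph still renders, and it's green.</p>
</made-up-element>
```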
That’s why I’m so surprised that any front-end engineer would knowingly choose to swap out a solid declarative foundation like HTML for a more brittle scripting language. Or, as Simon put it:
Gizmodo launches redesign, is no longer a website (try visiting with JS disabled): http://gizmodo.com/