Design Principles For The Web
The opening presentation from An Event Apart Online Together: Front-End Focus held online in August 2020.
The opening keynote from Fronteers 2010 in Amsterdam.
There are observational principles, and there are imperative principles. Let’s put them together.
A tale of two principles.
The closing keynote from the border:none event held in Nuremberg in October 2013.
Where the 80/20 principle breaks down.
A presentation on progressive enhancement from the Beyond Tellerrand conference held in Düsseldorf in May 2015.
Mashing up George Orwell with axioms of web architecture.
Using design principles to embody your priorities.
A presentation from the Beyond Tellerrand conference held in Düsseldorf in May 2016. I also presented a version of this talk at An Event Apart, Smashing Conference, Render, and From The Front.
Two principles I’ve found to be universally useful (to design, development, writing) are the robustness principle and the principle of least power:
JavaScript should only do what only JavaScript can do.
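That maxim is easier to follow now that HTML does so much out of the box. Here’s a tiny, hypothetical illustration of reaching for the least powerful technology first: a disclosure widget that needs no script at all.

```html
<!-- Least power in action: a disclosure widget with no JavaScript required -->
<details>
  <summary>Shipping information</summary>
  <p>Orders are dispatched within two working days.</p>
</details>
```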
What makes ‘em good?
I don’t agree with Steven Pemberton on a lot of things—I’m not a fan of many of the Semantic Web technologies he likes, and I think that the Robustness Principle is well-suited to the web—but I always pay attention to what he has to say. I certainly share his concern that migrating everything to JavaScript is not good for interoperability:
This is why there are so few new elements in HTML5: they haven’t done any design, and instead said “if you need anything, you can always do it in Javascript”.
And they all have.
And they are all different.
Read this talk transcript, and even if you don’t agree with everything in it today, you may end up coming back to it in the future. He’s playing the long game:
The web is the way now that we distribute information. We will need the web pages we create now to be readable in 100 years time, just as we can still read 100-year-old books.
Requiring a webpage to depend on a particular 100-year-old implementation of Javascript is not exactly evidence of future-thinking.
Reframing the principle of least power.
This presentation on web standards was delivered at the State Of The Browser conference in London in September 2018.
Hyperlinks to accompany a talk.
The closing keynote from dConstruct 2008 in Brighton.
The opening presentation from the New Adventures conference held in Nottingham in January 2019.
How hash-bang URLs violate the robustness principle.
Marvellous insights from Mark on how the robustness principle can and should be applied to styleguides and pattern libraries (’sfunny—I was talking about Postel’s Law just this morning at An Event Apart in Boston).
Being liberal in accepting things into the system, and being liberal about how you go about that, ensures you don’t police the system. You collaborate on it.
So, what about the output? Remember: be ‘conservative in what you do’. For a design system, this means your output of the system – guidelines, principles, design patterns, code, etc etc. – needs to be clear, unambiguous, and understandable.
A presentation from XTech 2008 in Dublin.
It’s not because it’s declarative—it’s because it’s robust.
I use https://www.ssllabs.com/ssltest/ for testing the robustness of the HTTPS connection, and I use https://securityheaders.com/ for testing how well HTTP headers are set up.
Blogging the year away.
It really takes a lot for spam to be memorable, but this is one of the weirdest I’ve seen in a long time.
+1
It violates a fundamental principle of the web: GET requests shouldn’t have side effects …and it should be safe to visit a web page:
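To make that concrete, here’s a minimal hypothetical sketch (the URLs are made up): reads stay as safe GET requests, while anything that changes state goes through a POST form.

```html
<!-- Hypothetical sketch: keeping GET requests free of side effects -->

<!-- Safe to visit: following this link changes nothing on the server -->
<a href="/articles?tag=design-principles">Browse articles</a>

<!-- A state-changing action shouldn't be a GET link like <a href="/articles/42/delete">;
     a POST form means crawlers and prefetchers can't trigger it by accident -->
<form action="/articles/42/delete" method="post">
  <button type="submit">Delete article</button>
</form>
```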
A presentation with Derek Featherstone at South by Southwest 2007.
This is the keynote presentation I gave at the Accessibility 2.0 conference held in London in April 2008.
This is an oldie from Julie Zhou, but it’s a timeless message about the value of good (i.e. actually useful) design principles.
See also what she said on this podcast episode:
When push comes to shove and you have to make a trade off, how are you, in those moments, as a team or a company going to prioritize? What are you going to care about the most? Good values will be controversial in that respect because it’s something that another company might have made a different decision than you.
A responsive refresh of adactio.com that takes progressive enhancement to the next level.
Let’s get together and feel alright.
A lovely bit of experimentation with prime numbers and multiple background images.
The opening presentation from the Beyond Tellerrand conference held in Berlin in November 2019.
Yes! Prioritisation is what makes a good design principle useful (instead of fuzzy and toothless).
A presentation from the DIBI conference held in Gateshead in June 2011.
A presentation about history, networks, and digital preservation, from the Webstock conference held in Wellington, New Zealand in February 2012.
This piece first appeared in issue 3 of The Manual, a thrice-yearly print publication.
A meet’n’greet with the W3C’s Technical Architecture Group.
Yes.
I refuse on principle to use an ISP that does any traffic shaping.
This is neat—Vasilis has built a one-pager that grabs a random example from my collection of design principles.
I really like that he was able to use the predictable structure of my HTML as an API.
Incrementally improving the perceived performance of Ajax interactions.
Two sides of a debate on progressive enhancement…
Andrey “Rarst” Savchenko wrote Progressive enhancement — JS sites that work:
If your content website breaks down from JavaScript issue — it is broken.
Joe Hoyle disagrees:
Unlike Rarst, I don’t value progressive enhancement very highly and don’t agree it’s a fundamental principle of the web that should be universally employed. Quite frankly, I don’t care about not supporting JavaScript, and neither does virtually anyone else. It’s not that it doesn’t have any value, or utility - but in a world where we don’t have unlimited resources and time, one has to prioritise what we’ll support and not support.
Caspar acknowledges this:
I don’t have any problem buying into pragmatism as the main and often pressing reason for not investing into a no-JS fallback. The idealistic nature of a design directive like progressive enhancement is very clear to me, and so are typical restrictions in client projects (budgets, deadlines, processes of decision making).
But concludes that by itself that’s not enough reason to ditch such a fundamental technique for building a universal, accessible web:
Ain’t nobody got time for progressive enhancement always, maybe. But entirely ditching principle as a compass for resilient decision making won’t do.
See also: Mike Little’s thoughts on progressive enhancement and accessibility.
Remy looks at the closing gap between native and web. Things are looking pretty damn good for the web, with certain caveats:
The web is the long game. It will always make progress. Free access to both consumers and producers is a core principle. Security is also a core principle, and sometimes at the costs of ease to the developer (but if it were easy it wouldn’t be fun, right?).
That’s why there’ll always be some other technology that’s ahead of the web in terms of features, but those features give the web something to aim for:
Flash was the plugin that was ahead of the web for a long time, it was the only way to play video for heavens sake!
Whereas before we needed polyfills like PhoneGap (whose very reason for existing is to make itself obsolete), now with progressive web apps, we’re proving the philosophy behind PhoneGap:
If the web doesn’t do something today it’s not because it can’t, or won’t, but rather it is because we haven’t gotten around to implementing that capability yet.
This was originally published on CSS Tricks in December 2020 as part of a year-end round-up of responses to the question “What is one thing you learned about building websites this year?”
Here’s the really clever technique that Charlotte used on the speakers page for this year’s UX London site.
I remember that Jon was really impressed that she managed to implement his crazy design.
Matt takes a look at the history of scheduled broadcast media—which all began in Hungary in 1887 via telephone—and compares it to the emerging media context of the 21st century: the stream.
If the organizing principle of the broadcast schedule was synchronization — millions seeing the same thing at the same time — then the organizing principle of the stream is de-contextualization — stories stripped of their original context, and organized into millions of individual, highly personalized streams.
A presentation from the Beyond Tellerrand conference held in Düsseldorf in May 2017. I also presented a version of this talk at An Event Apart, Smashing Conference, Render, Frontend United, and From The Front.
This presentation on the indie web was delivered as the opening keynote at Webstock in Wellington, New Zealand in February 2018.
Applying the principle of least power to tools and technologies.
Can my favourite design principle be applied to the process of design?
A counterpart to the piece by Baldur that I linked to yesterday:
There are many challenges to face as the web grows.
Most of them are people problems. Habits. Inertia. A misalignment of priorities with user needs. Those can be overcome.
Don’t build prototypes with a production mindset. Don’t release prototype code into production.
This isn’t a “the web is doomed, DOOMED, I tells ya” kind of blog post. It’s more in the “the web in its current form isn’t sustainable and will collapse into a simpler, more sustainable form, possibly several” genre.
Baldur points to the multiple causes of the web’s current quagmire.
I honestly have no idea on how to mitigate this harm or even how long the decline is going to take. My hope is that if we can make the less complex, more distributed aspects of the web safer and more robust, they will be more likely to thrive when the situation has forced the web as a whole to break up and simplify.
If the JavaScript API requires a user gesture, maybe it’s time for a new button type.
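For context, here’s a hypothetical sketch of the kind of status quo that suggestion is reacting to: the Web Share API will only run in response to a user gesture, so even a simple share action has to be wired up in script.

```html
<!-- Hypothetical sketch: sharing today requires scripting a user gesture -->
<button class="share" hidden>Share this page</button>
<script>
  var button = document.querySelector('.share');
  if (navigator.share) {
    // Only reveal the button if the API is available
    button.hidden = false;
    button.addEventListener('click', function () {
      // navigator.share must be called from a user gesture like this click
      navigator.share({ title: document.title, url: location.href });
    });
  }
</script>
```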
If it’s Tuesday, it must be Paris.
I love the story that Terence relates here. It reminds me of all the fantastic work that Anna did documenting game console browsers.
Are you developing public services? Or a system that people might access when they’re in desperate need of help? Plain HTML works. A small bit of simple CSS will make it look decent. JavaScript is probably unnecessary – but can be used to progressively enhance stuff.
What web development can learn from the Nintendo Game and Watch.
The Web now consists of an ever-growing number of different frameworks, methodologies, screen sizes, devices, browsers, and connection speeds. “Lateral thinking with withered technology” – progressively enhanced – might actually be an ideal philosophy for building accessible, performant, resilient, and original experiences for a wide audience of users on the Web.
Having only the content I want to see only be shown when I want to see it with the freedom to jump between readers as I please, all with no ads? For me, no other service comes close to the flexibility, robustness, and overall ease-of-use that RSS offers.
Tim Bray calmly explains why hash-bang URLs are a very bad idea.
This is what we call “tight coupling” and I thought that anyone with a Computer Science degree ought to have been taught to avoid it.
Be liberal in what you accept:
Basically, if your form can’t register Beyoncé – it has failed.
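As a hedged illustration of what that liberalism might look like in a form (the field is just a placeholder): don’t lock a name field down to ASCII.

```html
<!-- Illustrative sketch: being liberal in what a registration form accepts -->

<!-- Too conservative: this pattern rejects "Beyoncé", "Björk", "O'Brien"… -->
<input type="text" name="name" pattern="[A-Za-z ]+" required>

<!-- More liberal: accept whatever the person says their name is,
     and handle any normalisation gently on the server -->
<input type="text" name="name" autocomplete="name" required>
```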
I understand less than half of this great talk by Meredith L. Patterson, but it ticks all my boxes: Leibniz, Turing, Borges, and Postel’s Law.
(via Tim Berners-Lee)
A superb post by Dan on the bigger picture of what’s wrong with hashbang URLs. Well written and well reasoned.
The web is agreement.
Applying Postel’s Law to relationships:
I aspire to be conservative in what and how I share (i.e., avoid drama) while understanding that other people will say all sorts of unmindful things.
Where should you be focusing your efforts when it comes to improving your site’s performance? Here’s a reusable framework for figuring it out.
A good talk from Chris Ferdinandi, who says:
One of the central themes of my talk on The Lean Web is that we as developers repeatedly take all of the great things the web and browsers give us out-of-the-box, break them, and then re-implement them poorly with JavaScript.
A day of front-end fun in Brighton.
Hijax, Youjax, we all jax for Pjax.
Here’s to the next twenty years.
I like this nice straightforward approach. Instead of jumping into the complexities of the final interactive component, Chris starts with the basics and layers on the complexity one step at a time, thereby creating a more robust solution.
If I had one small change to suggest, maybe aria-label might work better than offscreen text for the controls …as documented by Heydon.
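For anyone curious what that trade-off looks like in markup, here’s a minimal hypothetical sketch of the two approaches (an icon would normally sit inside each button):

```html
<!-- Offscreen text: the accessible name comes from visually hidden text;
     the .visually-hidden class is assumed to exist in the stylesheet -->
<button class="next">
  <span class="visually-hidden">Next slide</span>
</button>

<!-- aria-label: the accessible name is supplied directly on the control -->
<button class="next" aria-label="Next slide"></button>
```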
A profile of a legend.
The full text of Jason’s great talk at this year’s CSS Summit. It’s a great read, clearing up many of the misunderstandings around progressive enhancement and showing some practical examples of progressive enhancement working at each level of the web’s technology stack.
HTML. JavaScript. Why not both?
Counting down the charts—what will be in the number one spot?
Top. Men.
This is the rarely-seen hour-long version of my Resilience talk. It’s the director’s cut, if you will, featuring an Arthur C. Clarke sub-plot that goes from the telegraph to the World Wide Web to the space elevator.
This is absolutely brilliant!
Forgive my excitement, but this transcript of Charlie’s talk is so, so good—an equal mix of history and practical advice. Once you’ve read it, share it. I want everyone to have the pleasure of reading this inspiring piece!
It is this flirty declarative nature that makes HTML so incredibly robust. Just look at this video. It shows me pulling chunks out of the Amazon homepage as I browse it, while the page continues to run.
Let’s just stop and think about that, because we take it for granted. I’m pulling chunks of code out of a running computer application, AND IT IS STILL WORKING.
Just how… INCREDIBLE is that? Can you imagine pulling random chunks of code out of the memory of your iPhone or Windows laptop, and still expecting it to work? Of course not! But with HTML, it’s a given.
Progressive enhancement is not yet another technology or passing fad. It is a lasting strategy, a principle, to deal with complexity because it lets you build inclusive, resilient experiences that work across different contexts and that will continue to work, once the next fancy JavaScript framework enters the scene – and vanishes again.
But why don’t more people practice progressive enhancement? Is it only because they don’t know better? This might, in fact, be the primary reason. On top of that, especially many JavaScript developers seem to believe that it is not possible or necessary to build modern websites and applications that way.
A heartfelt look at progressive enhancement:
Some look at progressive enhancement like a thing from the past of which the old guard just can’t let go. But to me, progressive enhancement is the future of the Web. It is the basis for building resilient, performant, interoperable, secure, usable, accessible, and thus inclusive experiences. Not only for the Web of today but for the ever-growing complexity of an ever-changing and ever-evolving Web.
Why do I like fluid responsive typography? Let me count the ways…
Docs, books, people and technologies.
Embedding one little thing inside another little thing.
Liveblogging Joshua Porter at An Event Apart Boston 2009.
Fiction and non-fiction, in more-or-less equal measure.
A presentation from the Update conference held in Brighton in September 2011.
There’s a really interesting discussion here, kicked off by Lea, about balancing long-term standards with short-term pragmatism. Specifically, it’s about naming things.
Naming things is hard. Naming things in standards, doubly so.
JavaScript doesn’t get executed on very old browsers when native syntax for new language features is encountered. However, thanks to GitHub being built following the principle of progressive enhancement, users of older browsers still get to interact with basic features of GitHub, while users with more capable browsers get a faster experience.
That’s the way to do it!
Concepts like progressive enhancement allow us to deliver the best experience possible to the majority of customers, while delivering a useful experience to those using older browsers.
Read on for the nitty-gritty details…
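As a rough sketch of that approach (not GitHub’s actual code, and the class name is made up): the link works on its own, and because the enhancing script uses newer syntax, a very old browser throws a parse error, skips the whole script, and leaves the link fully functional.

```html
<!-- Rough sketch of the pattern, not GitHub's actual code -->

<!-- Works everywhere: a plain link to the full page -->
<a href="/notifications" class="js-notifications">Notifications</a>

<script>
  // const and arrow functions are newer syntax: an old browser fails to
  // parse this script and ignores it entirely, so the link above still works.
  const link = document.querySelector('.js-notifications');
  link.addEventListener('click', (event) => {
    event.preventDefault();
    // Enhanced path: fetch the content and show it in place
    fetch(link.href)
      .then((response) => response.text())
      .then((html) => link.insertAdjacentHTML('afterend', html));
  });
</script>
```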
A half-day workshop I did at this year’s UX London.
A nice look at responsive design, progressive enhancement, and the principle of One Web.
Progressive enhancement, developer convenience, and isomorphic JavaScript.
Extending the wheel, instead of reinventing it.
Are you writing instructions in CSS …or are you writing suggestions?
Judicious hope.
Jake’s got an idea for improving the security of displaying URLs in browsers.
In web development, we have this concept of progressive enhancement, which means that you start by building websites with the very most basic blocks - HTML elements. Then you enhance those basic elements with CSS to make them look better, then you add JavaScript to make them whizzy - the benefit being that if the JS or the CSS fail to load, you’ve still got the basic usable blocks underneath. I’m following this same principle in the house.
Related: this great chat between Jen Simmons and Stephanie Rieger.
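Spelled out in code, the layering described above (HTML, then CSS, then JavaScript) might look something like this minimal sketch; all the names are made up:

```html
<!-- Layer 1: HTML, the basic usable block: a form that submits to /search -->
<form action="/search" method="get">
  <label for="q">Search</label>
  <input type="search" id="q" name="q">
  <button type="submit">Go</button>
</form>

<!-- Layer 2: CSS, to make it look better; if this fails, the form still works -->
<style>
  form { display: flex; gap: 0.5em; align-items: center; }
</style>

<!-- Layer 3: JavaScript, to make it whizzy; if this fails, layers 1 and 2 remain -->
<script>
  document.querySelector('form').addEventListener('submit', function (event) {
    event.preventDefault();
    var query = encodeURIComponent(document.querySelector('#q').value);
    // Fetch the results and show them without a full page reload
    fetch('/search?q=' + query)
      .then(function (response) { return response.text(); })
      .then(function (html) { document.body.insertAdjacentHTML('beforeend', html); });
  });
</script>
```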
Progressive enhancement. I do not think it means what you think it means.
The lows are low, but the highs are high.
These are really good ideas for evaluating design principles. In fact, I would go so far as to say they are design principles for design principles.
- Good design principles are memorable.
- Good design principles help you say no.
- Good design principles aren’t truisms.
- Good design principles are applicable.
The tension between developer convenience and user needs.