Just what is it that you want to do?

The supersmart Scott Jenson just gave a talk at The Web Is in Cardiff, which was, by all accounts, excellent. I wish I could have seen it, but I’m currently chilling out in Florida and I haven’t mastered the art of bilocation.

Last week, Scott wrote a blog post called My Issue with Progressive Enhancement (he wrote it on Google+, which is why you might not have seen it).

In it, he takes to task the idea that—through progressive enhancement—you should be able to offer all functionality to all browsers, thereby foregoing the use of newer technologies that aren’t universally supported.

If that were what progressive enhancement meant, I’d be with him all the way. But progressive enhancement is not about offering all functionality; progressive enhancement is about making sure that your core functionality is available to everyone. Everything after that is, well, an enhancement (the clue is in the name).

The trick to doing this well is figuring out what is core functionality, and what is an enhancement. There are no hard and fast rules.

Sometimes it’s really obvious. Web fonts? They’re an enhancement. Rounded corners? An enhancement. Gradients? An enhancement. Actually, come to think of it, all of your CSS is an enhancement. Your content, on the other hand, is not. That should be available to everyone. And in the case of task-based web thangs, that means the fundamental tasks should be available to everyone …but you can still layer more tasks on top.

If you’re building an e-commerce site, then being able to add items to a shopping cart and being able to check out are your core tasks. Once you’ve got that working with good ol’ HTML form elements, then you can go crazy with your enhancements: animating, transitioning, swiping, dragging, dropping …the sky’s the limit.

This is exactly what Orde Saunders describes:

I’m not suggesting that you try and replicate all your JavaScript functionality when it’s disabled, above all that’s just not practical. What you should be aiming for is being able to complete the basics - for example adding a product to a shopping cart and then checking out. This is necessarily going to be clunky as judged by current standards and I suggest you don’t spend much time on optimising this process.

Scott asked about building a camera app with progressive enhancement.

Here again, the real question to ask is “what is the core functionality?” Building a camera app is a means to an end, not the end itself. You need to ask what the end goal is. Perhaps it’s “enable people to share photos with their friends.” Going back to good ol’ HTML, you can accomplish that task with:

<input type="file" accept="image/*">

Now that you’ve got that out of the way, you can spend the majority of your time making the best damn camera app you can, using all the latest browser technologies. (Perhaps WebRTC? Maybe use a canvas element to display the captured image data and apply CSS filters on top?)
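That split between core and enhancement can be expressed as a simple capability check. A sketch only: `cameraSupport` is a made-up helper name, and it takes a navigator-like object so the decision logic is explicit and testable, but the idea is exactly the layering described above.

```javascript
// Core: the file input works wherever HTML forms work.
// Enhancement: if a camera API is available, layer the custom camera UI on top.
// `cameraSupport` is a hypothetical helper; pass it a navigator-like object.
function cameraSupport(nav) {
  var hasCamera = nav &&
    nav.mediaDevices &&
    typeof nav.mediaDevices.getUserMedia === 'function';
  return hasCamera ? 'enhanced' : 'fallback';
}

// In a browser you would call cameraSupport(navigator):
// 'fallback' keeps the plain <input type="file">;
// 'enhanced' hides it and shows the camera app on top.
```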

Scott says:

My point is that not everything devolves to content. Sometimes the functionality is the point.

I agree wholeheartedly. In fact, I would say that even in the case of “content” sites, functionality is still the point—the functionality would be reading/hearing/accessing content. But I think that Scott is misunderstanding progressive enhancement if he thinks it means providing all the functionality that one can possibly provide.

Mat recently pointed out that there are plenty of enhancements on the Boston Globe site that require JavaScript, but the core functionality is available to everyone.

Scott again:

What I’m chafing at is the belief that when a page is offering specific functionality—let’s say a camera app or a chat app—what does it mean to progressively enhance it?

Again, a realtime chat app is a means to an end. What is it enabling? The ability for people to talk to each other over the web? Okay, we can do that using good ol’ HTML—text and form elements—with full page refreshes. That won’t be realtime. That’s okay. The realtime part is an enhancement. Use Web Sockets and WebRTC (in the browsers that support them) to provide the realtime experience. But everyone gets the core functionality.
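That transport decision can be sketched in a few lines. This is illustrative, not code from the post: `chooseTransport` is a hypothetical helper, passed a window-like object so the check is explicit.

```javascript
// Core: messages go over a plain HTML form POST with a full page refresh.
// Enhancement: realtime delivery via WebSocket where the browser supports it.
// `chooseTransport` is a hypothetical helper taking a window-like object.
function chooseTransport(win) {
  if (win && typeof win.WebSocket === 'function') {
    return 'websocket'; // the realtime enhancement
  }
  return 'form-post';   // everyone still gets the core functionality
}
```

In a browser you would call `chooseTransport(window)` and wire up the chat UI accordingly; either way, the form submission path keeps working.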

Like I said, the trick is figuring out what’s core functionality and what’s an enhancement.

Ethan provides another example. Let’s say you’re building a browser-based rich text editor that uses JavaScript to do all sorts of formatting on the fly. The core functionality is not the formatting on the fly; the core functionality is being able to edit text.
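The baseline for that editor is a plain textarea in a form, which works everywhere. Here is a sketch of the detection step only; `editorMode` is a made-up helper that takes a document-like object (the rest of the editor is app-specific).

```javascript
// Core: a plain <textarea> in a form - editing and submitting text works
// everywhere. Enhancement: swap in a format-on-the-fly editor where
// contenteditable is supported.
function editorMode(doc) {
  var probe = doc.createElement('div');
  return 'contentEditable' in probe ? 'rich' : 'textarea';
}

// In a browser: editorMode(document) === 'rich' means it's safe to replace
// the textarea with the enhanced editing surface.
```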

If progressive enhancement truly meant making all functionality available to everyone, then it would be unworkable. I think that’s a common misconception around progressive enhancement; there’s this idea that using progressive enhancement means that you’re going to spend all your time making stuff work in older browsers. In fact, it’s the exact opposite. As long as you spend a little bit of time at the start making sure that the core functionality works with good ol’ fashioned HTML, then you can spend most of your time trying out the latest and greatest browser technologies.

As Orde put it:

What you are going to be spending the majority of your time and effort on is the enhanced JavaScript version as that is how the majority of your customers will be experiencing your site.

The other Scott—Scott Jehl—wrote a while back:

For us, building with Progressive Enhancement moves almost all of our development time and costs to newer browsers, not older ones.

Progressive Enhancement frees us to focus on the costs of building features for modern browsers, without worrying much about leaving anyone out. With a strongly qualified codebase, older browser support comes nearly for free.

Approaching browser support this way requires a different way of thinking. For everything you’re building, you need to ask “is this core functionality, or is it an enhancement?” and build accordingly. It takes a bit of getting used to, but it gets easier the more you do it (until, after a while, it becomes second nature).

But if you’re thinking about progressive enhancement as “devolving” down—as Scott Jenson describes in his post—then I think you’re on the wrong track. Instead it’s about taking care of the core functionality quickly and then spending your time “enhancing” up.

Scott asks:

Shouldn’t we be allowed to experiment? Isn’t it reasonable to build things that push the envelope?

Absolutely! And the best and safest way to do that is to make sure that you’re providing your core functionality for everyone. Once you do that, you can go nuts with the latest and greatest experimental envelope-pushing technologies, secure in the knowledge that you don’t even need to worry about the fact that they don’t work in older browsers. Geolocation! Offline storage! Device APIs! Anything you can think of, you can use as a powerful enhancement on top of your core tasks.

Once you realise this, it’s immensely liberating to use progressive enhancement. You can have the best of both worlds: universal access to core functionality, combined with all the latest cutting-edge technology too.

Jeremy Keith:

“progressive enhancement is not about offering all functionality; progressive enhancement is about making sure that your core functionality is available to everyone. Everything after that is, well, an enhancement (the clue is in the name).”

It is more often than not that I find myself in discussions about this exact point: I want our websites, their core tasks, to function, and if this is cared for, then we can talk about the fun stuff. To many people this, in our new and modern browser times, seems to be a waste of time. “Why care for systems or users without JavaScript? That’s only those web fundamentalists like you,” I was told on one occasion.

There are plenty of good reasons to do so, but first of all, the layered approach from function to representation to behaviour to me seems only logical when working with the web’s building blocks. Working with, not against, them. But this needs planning, strategy, and sometimes there seems to be no time and/or budget for this: “It’s only a small webpage, let’s throw some JavaScript-driven UI widgets together and be done with it, why don’t you?”

So it’s good to have this article by Jeremy Keith to refer to, since it explains very clearly and takes good care of some ‘counter’ arguments. Thank you, Jeremy.

# Posted by Webrocker on Monday, November 3rd, 2014 at 9:40am

Tyler Gaw

@adactio @crtr0 thanks Jeremy. That latest one,7774–I think–is the best piece you (or anyone else) has written on the topic

# Posted by Tyler Gaw on Friday, November 7th, 2014 at 9:07pm


Progressive enhancement for everyone

23 November 2014. Tagged: enhancement, featured, opinion, progressive

One of the key benefits ascribed to progressive enhancement is that your site works for everyone. That is almost true. But we need to be clear what we mean by “everyone”. A recent polite disagreement between Scott Jenson and Jeremy Keith, both of whom I admire immensely, made me finally put into writing something that’s been bothering me about progressive enhancement idealism for a while.

The principle has it that if you start with ‘good ol’ HTML’, it will work everywhere, and then you can add CSS and JavaScript to enhance it. Literally everything except the markup (and even much of the markup) is an enhancement. This is fine up to (or rather down to) a point. Jeremy and Scott use the example of a camera app, where you could consider it to be a progressively enhanced file input. But does that really work for everyone? Taking IE as an example:

  • You’re probably not testing in anything lower than v6
  • File inputs were first supported in v4
  • JavaScript support has been available since v2

Reality: the ‘core functionality’ doesn’t work for everyone

So how do we define “everyone”? In this case it evidently doesn’t include anyone using IE 1, 2 or 3, since the core solution isn’t supported by those browsers. And virtually nobody does use them, I know. But does it even include users of 4 and 5? You’re serving JS code to these browsers, and not testing it. Not tested means probably not working, because no browser can be relied upon to interpret and render standards-compliant code correctly whilst ignoring stuff they don’t understand (especially old ones). I don’t think a single legacy browser can claim to do this. Added to all this, there are literally no versions of IE that support file inputs but not JavaScript, so what exactly are we achieving here?

Beware of the leopard

Even if you devise some way in which your user can achieve their supposed aim with the most basic browser functionality, and it works ‘everywhere’, you will probably be providing such a laughably awful experience that absolutely no-one is going to use it. In The Hitchhiker’s Guide to the Galaxy, Arthur Dent is surprised that the council want to knock down his house, because he hasn’t seen the planning application that has been clearly on display in a pitch-dark basement with no stairs, on the back of the door of a toilet cubicle with a ‘beware of the leopard’ sign on it.

The reality is that most modern web products are designed not for web gurus who are willing to tolerate endless steps, but for normal humans. And in many cases they’re designed to be easier ways of doing something that is already possible. As a result it’s totally pointless to also support a way of doing it that’s harder than what’s already possible. A user that signs up for a camera app is not going to be very happy if it doesn’t actually take the pictures.

There is nothing wrong with “Browser not supported”

Progressive enhancement is a valuable mechanism, and it helps us bring new features to users faster. All the things Jeremy cites as no-brainers are exactly that. Rounded corners, web fonts, gradients: all indisputably enhancements. But when it comes to complex script-powered elements, we should not feel the need to support some catastrophically awful user experience for a tiny number of users. A much better user experience for the camera app is to simply show a message like this:

Your browser is too old to use the camera app. The app works with any browser that includes the getUserMedia feature. Learn about free upgrade options

This isn’t always the case, and of course you can often be more granular. For example, many sites now have very rich commenting interfaces. You could argue that ancient browsers should still be able to comment, but for that tiny audience, what’s wrong with printing a selection of comments and the message:

To add your own comment, you need to upgrade your browser

At the FT we use the seamless progressive enhancement of ‘optional’ features, but we also define a moveable baseline using a “cuts the mustard” test, below which we are happy to replace some of what you might consider core functionality with messages like the one shown above. Everything is, in the end, a judgment call: login and sign-up might work below the baseline, but commenting, ‘save for later’, ‘email a friend’ and so on don’t. Some of these features might show upgrade messages; others might vanish completely. And of course even an ultra-simplified site is expected to break if you go back far enough in the browser versions.

No matter how you cut it, nothing is going to work for everyone. For sure, taking a PE approach allows you to be more inclusive without a lot more effort. But being inclusive doesn’t mean being universal. The way I look at it, I make reasonable attempts to serve the largest practically addressable audience with the most important features. As long as that’s what you mean by ‘everyone’, we’re on the same page. I don’t have comments enabled on this blog, but if you want to respond feel free to send me a tweet!
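The “cuts the mustard” test mentioned above is a single coarse capability check. A sketch of the classic formulation (popularised by the BBC), with the document and window objects passed in explicitly so the logic stands alone; the exact features you test for are a per-project judgment call.

```javascript
// One coarse check divides browsers into "core" (plain HTML experience)
// and "enhanced" (load the JavaScript on top).
function cutsTheMustard(doc, win) {
  return 'querySelector' in doc &&
         'localStorage' in win &&
         'addEventListener' in win;
}

// In a browser: if (cutsTheMustard(document, window)) { /* load the JS */ }
```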

# Sunday, November 23rd, 2014 at 5:08pm

Remy Sharp

@adactio do you have insights on what’s driving that decision? Mine has been that they don’t have the time & budget to “not support JS”.

# Posted by Remy Sharp on Monday, June 22nd, 2015 at 1:16pm

Jeremy Keith

@rem Developer convenience. If you’re not used to using progressive enhancement, it appears as though it’s going to take longer…

Jeremy Keith

@rem …and so developers basically say “inclusion is hard, let’s go shopping.” Also: tools. Tools come with philosophies…

Jeremy Keith

@rem …and often developers are taking on those (non progressive) philosophies without even realising it.

Remy Sharp

@adactio yes, yes and yes. I think there’s a workflow problem that needs to be solved.

# Posted by Remy Sharp on Monday, June 22nd, 2015 at 1:20pm

Peter-Paul Koch

@rem Then we shouldn’t talk about JavaScript, either. Really, the problem is in the messaging and terminology.

Remy Sharp

@jaffathecake plus, it sounds nice. Like Chimichanga. So, it’s either Isomorphic JavaScript or Chimichanga Script. Up to you.

# Posted by Remy Sharp on Monday, June 22nd, 2015 at 1:26pm

David Peach

# Posted by David Peach on Thursday, August 18th, 2016 at 10:04am


This post was originally written in 2015, but upon re-reading it today, it still (just about) holds up, so I finally hit publish.

I had thought that an EdgeConf panel would be about developers not using JavaScript because they were more interested in building high end web apps, full of WebRTC, Web Audio and the like. But it’s not.

I had the pleasure of introducing the Progressive Enhancement panel and contributing to the panel in 2015. For my introduction, I ran some “research” and did some pondering about what exactly is progressive enhancement.

Here’s the thing: after getting responses from 800+ developers (on a Twitter poll), I’ve come to realise that most developers, or certainly everyone following me, everyone watching (the EdgeConf stream), everyone reading, see progressive enhancement as a good thing. The “right thing” to do. They understand that it can deliver the web site’s content to a wider audience. There’s no doubt.

There are accessibility benefits and SEO benefits. SEO, I’ve heard directly from developers, is one way to get business buy-in for taking a PE approach to development.

But the truth is: progressive enhancement is not part of the workflow.

What is Progressive Enhancement?

Well…it’s a term made up by Steve Champeon, who used it to describe the techniques he (or he and his team) were using to build web sites instead of taking a graceful degradation approach.

As such, there’s no one single line that defines progressive enhancement. However, Wikipedia defines it as:

[progressive enhancement] allows everyone to access the basic content and functionality of a web page, using any browser or Internet connection

Graceful degradation works the other way around: the complete functionality is delivered to the browser, and edge cases and “older browsers” (those not meeting the technical requirements) degrade to (potentially) less functionality.

The problem is that, based on a survey of my own followers (that’s to say, people likely to have similar interests and values when it comes to web dev), 25% of 800 developers still believe that progressive enhancement is simply making the site work without JavaScript enabled.

How do you make it work without JavaScript?

I can imagine that anyone starting out in web development might find this question pretty daunting. They’re first pressed with solving some complicated problem, and they’ve finally worked out how to make it work using a marriage of StackOverflow copy-and-pasting and newly gained advice from books and stuff… but now, all of a sudden: make it work without the code 😱

Which explains the silver bullet response that I’ve heard time after time: “how would a WebRTC chat site work?” …it wouldn’t.

In fact, here is The Very Clever Jake Archibald’s excellent SVGOMG web site…with JavaScript turned off, watch as frustration boils over and I’m left to throw my computer out of the window…

Putting aside silly jokes, how does a web site work without JavaScript isn’t really a good question. In fact, it’s entirely out of context.

A better question to ask could be how do we deliver a baseline web site that’s usable by the most minimal of requirements.

Very much what Jeremy Keith has said recently in response to criticism that it’s impossible to progressively enhance everything with today’s expectations. Progressive enhancement is:

…figuring out what is core functionality and what is an enhancement.

So how does the web community re-frame its thinking and look at progressive enhancement as the baseline that you build upon?

Why does it matter?

Today many developers are writing “thick clients”, that is, JavaScript driving a lot, if not all, of the functionality and presentation in the browser.

They do it by delivering and rendering views in the browser. The big upside of this is that the site responds extremely fast to the user’s input. The other big benefit is that there are a good number of frameworks (React, Vue, Angular, Polymer to name the “biggies” of today) that lend themselves greatly to client-side MVC, i.e. full application logic in the client-side code.

The problem is that the frameworks will often (try to) reinvent fundamental building blocks of a web experience. A few simple/classic examples:

  • The link isn’t a link at all, which means you can’t open it in a new tab, or copy it, or share it…or even click it the way you’d expect to
  • The button isn’t a button
  • You can’t share a link to the page you’re looking at (because it’s all client side rendered and doesn’t have a link)
  • Screen readers can’t navigate the content properly

I recently wrote about how I had failed the anchor. It pretty much touched on all the points above.

This doesn’t mean this isn’t possible, just that it’s often forgotten. In the same way that Flash was often labelled as inaccessible. This wasn’t true, it was possible to make Flash accessible, it’s just that the default development path didn’t include it.

A more extreme example of this was seen in Flipboard’s mobile site. Importantly: mobile site. Flipboard renders the entire page using a canvas element. I can’t speak for the accessibility of the site, but on mobile it performs beautifully. It feels…”native”. And with that, it’s also broken. I can’t copy links, and I can’t copy text - akin to the Flash apps and even Java applet days. It looks great, but it doesn’t feel “of the web”†.

† caveat: this was true in 2015, it’s possible…likely it’s been thrown away and fixed…I hope.

The problem is: browsers are pretty poor when compared to the proprietary and closed platforms they’re constantly compared to.

There’s pressure (from SF/Apple/who knows) to deliver web sites that feel “native” (no, I won’t define this) and browsers are always playing catch up with native, proprietary platforms: this is a fact.

Native media elements, native sockets, native audio, native push notifications, native control over the network - this all took its merry time to get to the browser. So when a company decides that the tried and tested approach to styling a list of articles won’t give them the unique UX they want and the 60fps interaction, then of course they’re going to bake up their own technology (in Flipboard’s case, re-inventing wheels with canvas…the exact same way Bespin did back in its day).

But…how would a thick-client work without JavaScript?

Angular, for instance, did not have a developer story for how to develop a site with progressive enhancement as a baseline.

Does this mean it’s not possible? I don’t think so. Without the stories though, developers will gravitate towards solved problems (understandably).

What does this story look like when a framework is a prerequisite of the project?

Web Components

Web Components are a hot debate topic. They could cause all kinds of mess on the web. On the other hand, they’re also a perfect fit for progressive enhancement.

Take the following HTML:

<input type="text" name="creditcard" required autocomplete="cc-number">

Notice that I’m not using the pattern attribute because it’s hard to match correctly to credit cards (they might have spaces between groups of numbers, or dashes, or not at all).

There’s also no validation, and the first number also tells us what kind of card is being used (4 indicates a Visa card for instance).

A web component could progressively enhance the element similarly to the way the browser would natively enhance type="date" for instance.

<stripe-cc-card>
  <input type="text" name="creditcard" required autocomplete="cc-number">
</stripe-cc-card>
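To make the enhancement concrete, here is a sketch of the logic such a component might run as the user types: detecting the card type from the leading digit (4 indicates Visa, as noted above) and running the standard Luhn checksum. The helper names are made up for illustration - this is not a real Stripe API - and the server would still re-validate.

```javascript
// Card type from the leading digit; tolerate spaces and dashes in the input,
// since we deliberately skipped the pattern attribute for that reason.
function cardType(number) {
  var digits = number.replace(/[\s-]/g, '');
  return digits.charAt(0) === '4' ? 'visa' : 'unknown';
}

// Standard Luhn checksum for client-side validation.
function luhnValid(number) {
  var digits = number.replace(/[\s-]/g, '');
  var sum = 0;
  for (var i = 0; i < digits.length; i++) {
    var d = +digits.charAt(digits.length - 1 - i);
    if (i % 2 === 1) {   // double every second digit from the right
      d *= 2;
      if (d > 9) d -= 9;
    }
    sum += d;
  }
  return digits.length > 0 && sum % 10 === 0;
}
```

If JavaScript never runs, none of this executes and the plain text input still submits - which is the whole point.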

I wonder, are web components the future of progressive enhancement?

Potential problems on the horizon

Developers are inherently lazy. It’s what makes them/us optimise our workflows and become very good at solving problems. We re-use what’s known to work and tend to leave out the complex parts that we can “live without”. Sadly, this can be at the cost of accessibility and progressive enhancement.

I think there’s some bigger potential problems on the horizon: ES6 - esnext (i.e. the future of JavaScript).

“But progressive enhancement has nothing to do with ES-whatever…”

Taking a step back for a moment. Say we’re writing an HTML only web site (no CSS or JS). But we want to use the latest most amazing native email validation:

<input type="email" required>

Simple. But…what happens if type="email" isn’t supported? Well, nothing bad. The element falls back to a plain text input (and we can validate on the server). The HTML doesn’t break.

JavaScript isn’t quite the same, but we can code defensively, using feature detection and polyfills where appropriate.
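That defensive pattern - detect, then polyfill - looks like this for a non-syntax feature. A minimal sketch; the real `Array.prototype.includes` polyfill handles more edge cases (NaN, a fromIndex argument, and so on).

```javascript
// Unlike an unknown HTML attribute (which browsers simply ignore), calling a
// missing JavaScript API throws. So we feature-detect before relying on it,
// and fill the gap only where needed.
if (!Array.prototype.includes) {
  // Minimal stand-in: enough for the common "is this item present?" case.
  Array.prototype.includes = function (item) {
    return this.indexOf(item) !== -1;
  };
}
```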

ES6 has features that break this design: syntax-breaking features that cannot exist alongside ES5 and cannot be polyfilled. They must be transpiled.

There’s currently talk of smart pipelines that can deliver polyfilled code to “old” browsers and light native ES-x features to newer browsers. Though I would imagine the older browsers would be running on older machines, and therefore wouldn’t perform well with more code in their JavaScript bundles. New browsers running on new machines, by comparison, are probably faster and more capable than their elderly peers at running lots of code. IDK, just a thought.

Syntax breaking

There’s a small number of ES6 features that are syntax breaking, the “arrow function” in particular.

This means that if an arrow function is encountered by a browser that doesn’t support ES6 arrows, it causes a syntax error. If the site is following best practice and combining all its JavaScript into a single file, that means all of its JavaScript just broke (I’ve personally seen this on JS Bin when we used JSHint, which uses ES5 setters and broke IE8).

I’ve asked people on the TC39 group and JavaScript experts as to what the right approach here is (bear in mind this is still early days).

The answer was a mix of:

  • Use feature detection (including for syntax breaking features) and conditionally load the right script, either the ES5 or ES6
  • Transpile your ES6 to ES5 and make both available
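The first option needs a way to detect a syntax-breaking feature without crashing the script doing the detecting. You can’t feature-detect an arrow function by reference, but you can try to compile one at runtime. A sketch:

```javascript
// Compiling the source at runtime confines a potential syntax error to a
// catchable exception, so the detecting script itself stays valid ES5.
function supportsArrowFunctions() {
  try {
    new Function('return () => 42;'); // throws a SyntaxError in ES5-only engines
    return true;  // safe to load the ES6 bundle
  } catch (e) {
    return false; // load the transpiled ES5 bundle instead
  }
}
```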

This seems brittle, and the more complexity there is, the more likely it is that, as time goes by, new projects will leave out the transpile part and forget about supporting older browsers - or even newer browsers that don’t ship with ES6 support (perhaps because the VM footprint is smaller and has to work in a super-low-powered environment).

JavaScript doesn’t exhibit the same resilience that HTML & CSS do, so the fear is that it’ll leave users who can’t upgrade facing a broken or blank page.

Is there a workflow that solves this? Or are we forced to support two incompatible languages on the web?

Thanks for reading. As usual, it depends. In fact, that it does depend, applies to every single project I work on.

# Wednesday, July 24th, 2019 at 12:00pm


