Progressive enhancement with Ajax

The excitement over Ajax shows no signs of dying down anytime soon. The Man In Blue has weighed in with his thoughts on the matter:

“It would be nice if Google Maps were accessible by non-JavaScript enabled user agents, but in practice this must be weighed up against market forces - is an acceptable proportion of your target market likely to have JavaScript enabled, or is your service so useful that people will go out of their way to acquire JavaScript capabilities to use it?”

This is where I feel that Google and others are approaching the whole issue of Ajax (and JavaScript in general) in a back-asswards manner.

To me, it makes sense to first build your application using old-fashioned server-side technology to do all the work with old-fashioned page refreshes to display updated information. Once you’ve got that built, you can then apply JavaScript to make a better, richer, more usable application. By intercepting the default actions with JavaScript and replacing them with XMLHttpRequest calls, you can add a lovely layer of instant interaction. But, and this is the kicker, you know that your smooth, slick application will degrade gracefully in browsers that don’t support JavaScript.
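Sketched in JavaScript, that interception step might look something like this. Everything here is illustrative rather than taken from any real application: the `serializeForm` helper, the `enhanceForm` name, and the callbacks are all hypothetical.

```javascript
// A minimal sketch of the pattern described above: let the form work the
// old-fashioned way by default, then layer an XMLHttpRequest call on top.

// Pure helper: turn a map of field names/values into a query string.
function serializeForm(fields) {
  return Object.keys(fields).map(function (name) {
    return encodeURIComponent(name) + '=' + encodeURIComponent(fields[name]);
  }).join('&');
}

// Intercept the default submit action and replace it with an Ajax call.
// If JavaScript or XMLHttpRequest is unavailable, none of this runs and
// the browser falls back to a normal full-page refresh.
function enhanceForm(form, getFields, showResults) {
  if (typeof XMLHttpRequest === 'undefined') return; // degrade gracefully
  form.addEventListener('submit', function (event) {
    event.preventDefault(); // cancel the old-fashioned page refresh
    var xhr = new XMLHttpRequest();
    xhr.open('GET', form.action + '?' + serializeForm(getFields()), true);
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        showResults(xhr.responseText); // instant, in-page update
      }
    };
    xhr.send(null);
  });
}

// Usage (in a browser):
//   enhanceForm(document.getElementById('search'), readFields, paintResults);
```

The key design point is that the enhancement is purely additive: the form's `action` attribute still points at the same server-side URL, so with scripting switched off nothing is lost.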

It’s called progressive enhancement and it’s how we should all be building our web applications. The concept forms the basis of applying CSS correctly. Ideally, we should be following through on this idea whenever we decide to wield the scalpel of JavaScript.

For example, I love the way the JavaScript on the Panic shop works, but if the principle of progressive enhancement had been applied in the planning of the shopping cart, it would also degrade gracefully for shoppers without JavaScript.

I’ve heard all the arguments about market forces and browser statistics informing a “JavaScript only, please” decision on building web apps but they make little sense to me. With a little bit of forethought, you can build an application that can be used by everybody whilst giving the majority an enhanced experience.

Think about it: you have to build all the server-side logic anyway, so why not build it in such a way that it can be used equally well by a refreshing web page as by a call via XMLHttpRequest?

That was the way I went about adding the recent enhancements to The Session. I made sure that the server-side functions responsible for executing searches and returning results were abstracted enough so that they could be re-used easily by an Ajax script. If you wanted to follow this process of abstraction to its logical conclusion, then I guess you could have all data returned as XML. Then you could build your regular website, add a nice layer of Ajax enhancements and provide an API for web services as a nice little bonus.
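As a hypothetical sketch of that abstraction (written in JavaScript for brevity; the tune data and every function name here are mine, not The Session's actual code), one shared search function can feed a full-page render, an Ajax fragment, and an XML web-service response alike:

```javascript
// One abstracted data function; multiple presentations built on top of it.
// All names and data are illustrative.

// The single source of truth: search logic lives here and nowhere else.
function searchTunes(query) {
  // In a real application this would query the database.
  var all = [
    { title: 'The Maid Behind the Bar', type: 'reel' },
    { title: 'Out on the Ocean', type: 'jig' }
  ];
  return all.filter(function (tune) {
    return tune.title.toLowerCase().indexOf(query.toLowerCase()) !== -1;
  });
}

// Presentation 1: an HTML list, usable by a full page refresh
// or returned as a fragment to an XMLHttpRequest call.
function renderHtmlList(results) {
  return '<ul>' + results.map(function (tune) {
    return '<li>' + tune.title + ' (' + tune.type + ')</li>';
  }).join('') + '</ul>';
}

// Presentation 2: the same results as XML, for a web-services API.
function renderXml(results) {
  return '<tunes>' + results.map(function (tune) {
    return '<tune type="' + tune.type + '">' + tune.title + '</tune>';
  }).join('') + '</tunes>';
}
```

Because `searchTunes` knows nothing about how its results will be displayed, adding a new output format later (a feed, say) means writing one more render function, not touching the search logic.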

Developing directly for Ajax might seem to save time and money, but it will prove far more time-consuming in the long run if you ever decide to build a non-JavaScript version.

Ajax is an interesting technology. It straddles the worlds of client-side and server-side scripting, a combination reminiscent of Kipling’s poem:

“East is East, and West is West, and never the twain shall meet”

To use this technology correctly, developers need to understand both worlds. The concept of progressive enhancement is probably a new one to server-side programmers while the idea of data abstraction may be new to client-side developers.

Which reminds me…

If PHP is your server-side language of choice, the future is looking quite rosy thanks to a helping hand from IBM:

“IBM is putting its corporate heft behind a popular open-source Web development technology called PHP, in a move meant to reach out to a broader set of developers.”

