Tags: progressive_enhancement

Friday, May 29th, 2020

CUBE CSS - Piccalilli

I really, really like Andy’s approach here:

The focus of the methodology is utilising the power of CSS and the web platform as a whole, with some added controls and structures that help to keep things a bit more maintainable and predictable. The end-goal is shipping as little CSS as possible—leaning heavily into progressive enhancement and modern techniques.

If you use the cascade for everything, you’re going to run into trouble. But equally, micro-managing styles on every element will also get you into trouble. I think Andy’s found a really great sweet spot here that gets the balance just right.

CUBE CSS in essence, is a progressive enhancement approach, vs a fight against the grain of CSS or a pixel-pushing your project to within an inch of its life approach.

Yes! It feels very “webby” to me.

Friday, May 1st, 2020

The beauty of progressive enhancement - Manuel Matuzović

Progressive Enhancement allows us to use the latest and greatest features HTML, CSS and JavaScript offer us, by providing a basic, but robust foundation for all.

Some great practical examples of progressive enhancement on one website:

  • using grid layout in CSS,
  • using type="module" to enhance a form with JavaScript,
  • using the picture element to provide webp images in HTML.

All of those enhancements work great in modern browsers, but the underlying functionality is still available to a browser like Opera Mini on a feature phone.
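
A minimal sketch of that kind of layering (the file names here are illustrative, not taken from Manuel's site): the picture element and the module script are both ignored by browsers that don't understand them, so the baseline keeps working.

```html
<!-- Browsers that understand WebP get it; everything else
     falls back to the JPEG in the img element. -->
<picture>
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="A description of the photo">
</picture>

<!-- The form works with plain HTML; browsers that support ES modules
     load the JavaScript enhancement, older ones simply skip it. -->
<form action="/search" method="get">
  <label for="q">Search</label>
  <input type="search" id="q" name="q">
  <button type="submit">Search</button>
</form>
<script type="module" src="/js/search-enhancement.js"></script>
```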

Thursday, April 30th, 2020

‘The stakes feel higher but, with good practice, it need not be scary’ – NHS.UK design lead on responding to coronavirus | PublicTechnology.net

This isn’t the time to get precious about your favourite design and development tools. Use progressive enhancement as your philosophy. Your service might have to be accessed on old devices in hospitals with outdated tech or unsupported operating systems. HTML+CSS is your best bet to ensure that the service can be accessed in unlikely scenarios you’ve never even considered. Do you want to take that risk at a time like this? Nope, me neither.

Save the React squabbles for another time. Make it accessible and robust from day one.

Tuesday, April 28th, 2020

Web Typography News #43: Typesetting Moby-Dick, part 2

Great typography on the web should be designed in layers. The web is an imperfect medium, consumed by countless different devices over untold numbers of network connections—each with their own capabilities, limitations, and peculiarities. To think that you can create one solution that will look and work the same everywhere is a fantasy. To make this more than just one nice book website, the whole project and process needs to embrace this reality.

What is a resilient website? (with Jeremy Keith) | A Question of Code

I really enjoyed having a chat with Ed and Tom on their podcast. It’s aimed at people making a career shift into web development, but that didn’t stop me banging on about my usual hobby horses: progressive enhancement, resilient web design, and all that jazz.

Available for your huffduffing pleasure.

Friday, April 17th, 2020

Future Sync 2020

I was supposed to be in Plymouth yesterday, giving the opening talk at this year’s Future Sync conference. Obviously, that train journey never happened, but the conference did.

The organisers gave us speakers the option of pre-recording our talks, which I jumped on. It meant that I wouldn’t be reliant on a good internet connection at the crucial moment. It also meant that I was available to provide additional context—mostly in the form of a deluge of hyperlinks—in the chat window that accompanied the livestream.

The whole thing went very smoothly indeed. Here’s the video of my talk. It was The Layers Of The Web, which I’ve only given once before, at Beyond Tellerrand Berlin last November (in the Before Times).

As well as answering questions in the chat room, people were also asking questions in Sli.do. But rather than answering those questions there, I was supposed to respond in a social medium of my choosing. I chose my own website, with copies syndicated to Twitter.

Here are those questions and answers…

The first few questions were about last year’s CERN project, which opens the talk:

Based on what you now know from the CERN 2019 WorldWideWeb Rebuild project—what would you have done differently if you had been part of the original 1989 Team?

I responded:

Actually, I think the original WWW project got things mostly right. If anything, I’d correct what came later: cookies and JavaScript—those two technologies (which didn’t exist on the web originally) are the source of tracking & surveillance.

The one thing I wish had been done differently is I wish that JavaScript were a same-origin technology from day one:

https://adactio.com/journal/16099

Next question:

How excited were you when you initially got the call for such an amazing project?

My predictable response:

It was an unbelievable privilege! I was so excited the whole time—I still can hardly believe it really happened!

https://adactio.com/journal/14803

https://adactio.com/journal/14821

Later in the presentation, I talked about service workers and progressive web apps. I got a technical question about that:

Is there a limit to the amount of local storage a PWA can use?

I answered:

Great question! Yes, there are limits, but we’re generally talking megabytes here. It varies from browser to browser and depends on the available space on the device.

But files stored using the Cache API are less likely to be deleted than files stored in the browser cache.

More worrying is the announcement from Apple to only store files for a week of browser use:

https://adactio.com/journal/16619
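
For anyone curious, here’s a rough sketch of how a site can ask the browser about its storage quota and put files into the Cache API (the cache name and file list are placeholders):

```javascript
// Ask the browser how much storage is available. The figures vary
// from browser to browser and depend on free space on the device.
if (navigator.storage && navigator.storage.estimate) {
  navigator.storage.estimate().then(({usage, quota}) => {
    console.log(`Using ${usage} of roughly ${quota} bytes`);
  });
}

// Files stored with the Cache API (typically from a service worker)
// are less likely to be evicted than files in the regular browser cache.
caches.open('my-cache-v1').then((cache) => {
  return cache.addAll(['/', '/styles.css', '/script.js']);
});
```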

Finally, there was a question about the over-arching theme of the talk…

Great talk, Jeremy. Do you encounter push-back when using the term “Progressive Enhancement”?

My response:

Yes! …And that’s why I never once used the phrase “progressive enhancement” in my talk. 🙂

There’s a lot of misunderstanding of the term. Rather than correct it, I now avoid it:

https://adactio.com/journal/9195

Instead of using the phrase “progressive enhancement”, I now talk about the benefits and effects of the technique: resilience, universality, etc.

Future Sync Distributed 2020

Tuesday, March 24th, 2020

BBC - Future Media Standards & Guidelines - Accessibility Guidelines v2.0

A timely reminder:

The minimum dependency for a web site should be an internet connection and the ability to parse HTML.

Saturday, February 15th, 2020

Same HTML, Different CSS

Like a little mini CSS Zen Garden, here’s one component styled five very different ways.

Crucially, the order of the markup doesn’t consider the appearance—it’s concerned purely with what makes sense semantically. And now with CSS grid, elements can be rearranged regardless of source order.

CSS is powerful and capable of doing amazingly beautiful things. Let’s embrace that and keep the HTML semantical instead of adapting it to the need of the next design change.

Thursday, February 6th, 2020

Local First, Undo Redo, JS-Optional, Create Edit Publish - Tantek

Tantek documents the features he wants his posting interface to have.

Hydration

As you may have noticed, I’m a fan of progressive enhancement.

It’s not cool. It’s often at odds with “modern” web development, so I end up looking like an old man yelling at a cloud to get off my lawn. Or something.

At its heart though, progressive enhancement seems fairly uncontroversial and inoffensive to me. It’s an approach. A mindset. Here’s how I describe it in Resilient Web Design:

  1. Identify core functionality.
  2. Make that functionality available using the simplest possible technology.
  3. Enhance!

Progressive enhancement makes use of the principle of least power:

Choose the least powerful language suitable for a given purpose.

That’s step two of the three-step process. But the third step is vital.
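
As a minimal sketch of those three steps (the URL and field names are made up): the core functionality is sending a message, the simplest possible technology is an HTML form, and the enhancement is JavaScript taking over the submission when it’s available.

```html
<!-- Steps 1 & 2: core functionality in the simplest possible technology. -->
<form action="/messages" method="post">
  <label for="msg">Message</label>
  <textarea id="msg" name="msg"></textarea>
  <button type="submit">Send</button>
</form>

<!-- Step 3: enhance! This only runs in browsers that support it;
     everyone else still has a working form. -->
<script type="module">
  const form = document.querySelector('form');
  form.addEventListener('submit', async (event) => {
    event.preventDefault();
    await fetch(form.action, {
      method: form.method,
      body: new FormData(form)
    });
    form.reset();
  });
</script>
```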

I think a lot of the hostility towards progressive enhancement comes from a misunderstanding of that three-step process, perhaps thinking that it stops at step two. I’m sure that some have interpreted progressive enhancement as preventing developers from using the latest and greatest technology. Nothing could be further from the truth!

Taking a layered approach to building on the web gives you permission to try cutting‐edge JavaScript APIs, regardless of how many or how few browsers currently implement them.

The most common misunderstanding of progressive enhancement is that it’s inherently about JavaScript. That’s not true. You can apply progressive enhancement at every step of front-end development: HTML, CSS, and JavaScript.

But because of JavaScript’s strict error-handling model (at least compared to HTML and CSS), it’s in the JavaScript layer that the lack of a progressive enhancement mindset is most often felt.

That’s why I was saddened by the rise of frameworks and mindsets that assume the availability of JavaScript. Single page apps generally follow this assumption. Everything is delivered via JavaScript: content, markup, styles, and behaviour.

This leads to a terrible situation for performance. The user is left staring at a blank screen, waiting for something—anything!—to appear. Browsers are optimised to stream HTML as soon as they can. Delivering your content via JavaScript rather than HTML means you’re not taking advantage of that optimisation. Your users suffer.

But I was very heartened when I saw the pendulum start to swing back the other way a bit…

Let’s say you’re using a JavaScript framework like React. But the reason you’re using it isn’t because you’re doing anything particularly complex in the browser involving state management. You might be using React because you really like the way it encourages modularity and componentisation.

A few years ago, making a single page app was pretty much the only way you could use React. For you as a developer to experience the benefits of modularity and componentisation, users had to pay the price in the payload (and fragility) of client-side JavaScript.

That’s no longer the case. Now that we can run JavaScript on the server, it’s possible to build in a modular, componentised way and still use progressive enhancement.

When I first heard about Gatsby and Next.js, I thought that was the selling point. Run React on the server; send pre-generated HTML down the wire to the user; then enhance with client-side JavaScript.

But that’s not exactly how it works. The pre-generated HTML isn’t functional. It still needs a bucketload of JavaScript before it can do anything. The actual process is: Run React on the server; send pre-generated HTML down the wire to the user; then send everything again but this time in JavaScript, bundled with the entire React library.

This leads to a situation for users that’s almost worse than before. Instead of staring at a blank screen, now they get HTML lickety-split—excellent! But if they try to interact with what’s on screen, they’ll find that nothing is working yet. Even worse, once the JavaScript is delivered, and is being parsed, they probably can’t even scroll—their device is too busy interpreting all that JavaScript. Your users suffer.

All your content is sent twice. First HTML is sent from the server. These days this is called “server-side rendering”, even though for decades the technical term was “serving a web page” (I’m pretty sure the rendering part happens in a browser). Then a JavaScript library—plus all your bespoke JavaScript—is loaded. Then all your content is loaded again as JSON.

So you’ve got a facade of an interface that you can’t actually interact with until a deluge of JavaScript has been loaded, parsed and executed. The term used for this stage of the process is “hydration”, which makes it sound more like a relaxing treatment from Gwyneth Paltrow than the horrible user experience it is.

The idea is that subsequent navigations—which will happen with Ajax—should be snappy. But the price has already been paid by then. The initial loading experience is jagged and frustrating.

Don’t get me wrong: server-side rendering is great …if what you’re sending from the server is functional. It’s the combination of hollow HTML sent from the server, followed by a huge browser-freezing dump of JavaScript that is an anti-pattern.

This use of server-side rendering followed by hydration feels like progressive enhancement, because it separates out the delivery of markup and scripts. But it’s missing the mindset.

The layered approach of progressive enhancement echoes the separation of concerns in the front-end stack: HTML, CSS, and JavaScript—each layer expressing more power. But while these concepts are related, they’re not interchangeable. Separating out the layers of your tech stack isn’t necessarily progressive enhancement. If you have some HTML that relies on JavaScript to be useful, then there’s no benefit in separating that HTML into a separate payload. The HTML that you initially send down the wire needs to be functional (at least at a basic level) before the JavaScript arrives.

I was a little disappointed to see Kyle Simpson—who I admire greatly—conflate separation of concerns with progressive enhancement in his talk from JSCamp 2019:

This content is here. I can see it, and it’s even styled. But I can’t click on the damn button because nothing has loaded in the JavaScript layer yet.

Anybody experienced that where you’ve been on a web page and it’s not really fully functional yet? I can see something but I can’t actually make any usage of it yet.

These are all things that cropped out of our thought process that said: “Let’s build the web in layers. Let’s deliver it progressively in layers. Because that’s morally right. We call this progressive enhancement. And let’s not worry too much about all these potential user experience flaws that may happen.”

That’s a spot-on description of server-side rendering and hydration, but it’s a gross mischaracterisation of progressive enhancement.

That button that requires JavaScript to work? That should’ve been generated with JavaScript. (For example, if you’re building a complex web app, consider sending a read-only view down the wire in HTML—then add any interactive interface elements with JavaScript in the browser.)
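
A rough sketch of that pattern, with made-up markup: the server sends a read-only list as plain HTML, and the editing controls—which can only work with JavaScript anyway—are generated by JavaScript, so nobody is ever shown a control that does nothing.

```html
<!-- Sent from the server: useful on its own, even if no JavaScript runs. -->
<ul id="tasks">
  <li>Buy milk</li>
  <li>Write talk</li>
</ul>

<script type="module">
  // Added in the browser: buttons that require JavaScript
  // are created by JavaScript.
  document.querySelectorAll('#tasks li').forEach((item) => {
    const button = document.createElement('button');
    button.type = 'button';
    button.textContent = 'Edit';
    button.addEventListener('click', () => {
      // ...swap the list item for an editing interface here.
    });
    item.append(button);
  });
</script>
```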

If people are equating progressive enhancement with thoughtless server-side rendering and hydration, then I can see why they’d be hostile towards it.

Users would be better served with unprogressive non-enhancement:

You take some structured content, which follows the vertical flow of the document in a way that everyone understands.

Which people traverse easily by either dragging their scroll bar with their mouse, or operating the keyboard using the up and down keys, or using the spacebar.

Or if they’re using a touch device, simply flicking backwards and forwards in that easy way that we’ve all become used to. What you do is you take that, and you fucking well leave it alone.

Alas, that’s not what tools like Gatsby offer. The latest post on their blog is called Why Gatsby is better with JavaScript:

But what about sites or pages where there is no client-side interactivity? Even for those pages, Gatsby offers performance benefits by including JavaScript.

I beg to differ.

(By the way, that same blog post also initially tried to equate the performance hit of client-side JavaScript with the performance hit of images. Andy explains why that’s disingenuous.)

Hope is on the horizon for React in the form of partial hydration. I sincerely hope that it will become the default way of balancing server-side rendering with just-in-time client-side interaction.

The situation we have now is the worst of both worlds: server-side rendering followed by a tsunami of hydration. It has a whiff of progressive enhancement to it (because there’s a cosmetic separation of concerns) but it has none of the user benefits.

Monday, February 3rd, 2020

Progressive enhancement doesn’t have to be hard - Levi McGranahan

It’s wild because in engineering terms this question, how does it fail?, should be the first one we ask, but oftentimes it is never even considered in front-end development. A good example is most client-side JS frameworks that render the entire UI in the browser, how would your app or site fail in that situation?

N26 and lack of JavaScript | Hugo Giraudel

JavaScript is fickle. It can fail to load. It can be disabled. It can be blocked. It can fail to run. It probably is fine most of the time, but when it fails, everything tends to go bad. And having such a hard point of failure is not ideal.

This is a very important point:

It’s important not to try making the no-JS experience work like the full one. The interface has to be revisited. Some features might even have to be removed, or dramatically reduced in scope. That’s also okay. As long as the main features are there and things work nicely, it should be fine that the experience is not as polished.

Monday, January 13th, 2020

Smaller HTML Payloads with Service Workers — Philip Walton

This is a great progressive enhancement for performance that uses a service worker to combine reusable bits of a page with fresh content. The numbers are very convincing!

Alas, the code is using the Workbox library, but figuring out the vanilla code to write shouldn’t be too tricky seeing as Philip talks through his logic step by step.
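
Very roughly, and without Workbox, the idea looks something like this (a much-simplified, non-streaming sketch; the partial URLs and cache name are placeholders, not Philip’s actual code): the header and footer come from the cache, and only the main content crosses the network.

```javascript
// In the service worker: assemble navigations from a cached shell
// plus a freshly-fetched content partial.
addEventListener('fetch', (event) => {
  if (event.request.mode !== 'navigate') return;
  event.respondWith((async () => {
    const cache = await caches.open('shell-v1'); // assumed to be pre-populated
    const headerResponse = await cache.match('/partials/header.html');
    const footerResponse = await cache.match('/partials/footer.html');
    const contentResponse = await fetch('/partials' + new URL(event.request.url).pathname);

    const html = (await headerResponse.text()) +
                 (await contentResponse.text()) +
                 (await footerResponse.text());

    return new Response(html, {headers: {'Content-Type': 'text/html'}});
  })());
});
```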

Sunday, December 29th, 2019

Move Fast & Don’t Break Things | Filament Group, Inc.

This is the transcript of a brilliant presentation by Scott—read the whole thing! It starts with a much-needed history lesson that gets to where we are now with the dismal state of performance on the web, and then gives a whole truckload of handy tips and tricks for improving performance when it comes to styles, scripts, images, fonts, and just about everything on the front end.

Essential!

Thursday, November 21st, 2019

A Non-Business Case for Supporting Old Browsers « Texte | ovl – code & design

Supporting Internet Explorer 11 doesn’t mean you need to give it the same experience as a modern browser:

Making sure (some of) your code works in older browsers, does not mean all functionality has to work everywhere. But, mind you, ninety percent of web development means putting text and images in boxes.

And to be honest, there is no reason to not enable this everywhere. Same for form submissions. Make it boring. Make it solid. And sprinkle delight on it.

Saturday, November 16th, 2019

The Layers Of The Web - Jeremy Keith on Vimeo

Thanks to the quick work of Marc and his team, the talk I gave at Beyond Tellerrand on Thursday was online within hours!

I’m really pleased with how this turned out. I wasn’t sure if anybody was going to be interested in the deep dive into history that I took for the first 15 or 20 minutes, but lots of people told me that they really enjoyed that part, so that makes me happy.

Tuesday, November 5th, 2019

JavaScript isn’t always available and it’s not the user’s fault by Adam Silver

It’s not a matter of if your users don’t have JavaScript—it’s a matter of when and how often.

The answer to that is around 1% of the time.

If you had an application bug which occurred 1% of the time, you’d fix it. No team I’ve come across would put up with that level of reliability.

The same goes for JavaScript. It’s not about people who turn it off. It’s about the nature of the web itself.

Tuesday, October 29th, 2019

Using ES6 modules for progressive enhancement | Blog | Decade City

It looks like modules could be a great way to serve modern JavaScript to modern browsers, and serve polyfills or older code to older browsers.
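
The gist (a sketch, with made-up file names): `type="module"` and `nomodule` split the audience, because module-supporting browsers ignore the `nomodule` script and older browsers ignore the module one.

```html
<!-- Modern browsers load this and skip the nomodule script below. -->
<script type="module" src="/js/app.modern.js"></script>

<!-- Older browsers don't understand type="module", so they skip it
     and load this bundle (with polyfills) instead. -->
<script nomodule src="/js/app.legacy.js"></script>
```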

Sunday, October 13th, 2019

The “P” in Progressive Enhancement stands for “Pragmatism” - Andy Bell

With a Progressive Enhancement mindset, support actually means support. We’re not trying to create an identical experience: we’re creating a viable experience instead.

Also with Progressive Enhancement, it’s incredibly likely that your IE11 user, or your user on a low-powered device, or even your user on a poor connection won’t notice that they’re experiencing a “minor” experience because it’ll just work for them. This is the magic, right there. Everyone’s a winner.

Tuesday, September 17th, 2019

NoJS Side-by-Side

Drag this to your browser’s bookmark bar now!

Such a useful quick check for resilience—this bookmarklet shows you a side-by-side comparison of a site with JavaScript enabled and disabled.