Link tags: resilience

BBC - Future Media Standards & Guidelines - Accessibility Guidelines v2.0

A timely reminder:

The minimum dependency for a web site should be an internet connection and the ability to parse HTML.

Get Static – Eric’s Archived Thoughts

Performance matters …especially when the chips are down:

If you are in charge of a web site that provides even slightly important information, or important services, it’s time to get static. I’m thinking here of sites for places like health departments (and pretty much all government services), hospitals and clinics, utility services, food delivery and ordering, and I’m sure there are more that haven’t occurred to me. As much as you possibly can, get it down to static HTML and CSS and maybe a tiny bit of enhancing JS, and pare away every byte you can.

Software and Home Renovation | Jim Nielsen’s Weblog

Lessons for web development from a home renovation project:

  • Greenfield Projects Are Everyone’s Favorite
  • The Last Person’s Work Is Always Bewildering
  • It’s All About the Trade-Offs
  • It ALWAYS Takes Longer Than You Think
  • Communication, Communication, Communication!

And there’s this:

You know those old homes people love because they’re unique, have lasted for decades, and have all that character? In contrast, you have these modern subdivision homes that, while shiny and new, are often bland and identical (and sometimes shoddily built). node_modules is like the suburbia/subdivision of modern web development: it seems nice and fancy today, and most everyone is doing it, but in 30 years everyone will hate the idea. They’ll all need to be renovated or torn down. Meanwhile, the classical stuff that’s still standing from 100 years ago lives on but nobody seems to be building houses that way anymore for some reason. Similarly, the first website ever is still viewable in all modern web browsers. But many websites built last year on last year’s bleeding edge tech already won’t work in a browser.

Progressive enhancement doesn’t have to be hard - Levi McGranahan

It’s wild because in engineering terms this question, “how does it fail?”, should be the first one we ask, but oftentimes it is never even considered in front-end development. A good example is most client-side JS frameworks that render the entire UI in the browser, how would your app or site fail in that situation?

NoJS Side-by-Side

Drag this to your browser’s bookmark bar now!

Such a useful quick check for resilience—this bookmarklet shows you a side-by-side comparison of a site with JavaScript enabled and disabled.
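
I don’t know how the bookmarklet is actually implemented, but as a rough sketch of my own, the core idea can be done with two iframes, one of them sandboxed (a sandboxed frame without the allow-scripts token runs no JavaScript):

    // Hypothetical sketch, not the real bookmarklet's code. Squash onto
    // one line and prefix with javascript: to use it from a bookmark.
    (function () {
      const url = location.href;
      const win = window.open('', '_blank');
      win.document.write(
        '<title>NoJS comparison</title>' +
        '<style>iframe { width: 49%; height: 90vh; }</style>' +
        // Left: the page as normal. Right: the sandbox attribute with no
        // allow-scripts token, so the same page renders without JavaScript.
        '<iframe src="' + url + '"></iframe>' +
        '<iframe sandbox src="' + url + '"></iframe>'
      );
      win.document.close();
    })();

(Sites that send X-Frame-Options or a frame-ancestors policy won’t load in the iframes, so a real tool has more work to do.)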

Ralph Lavelle: On resilience

Thoughts on frameworks, prompted by a re-reading of Resilient Web Design. I quite like the book being described as “a bird’s-eye view of the whole web design circus.”

Chaos Design: Before the robots take our jobs, can we please get them to help us do some good work?

This is a great piece! It starts with a look back at some of the great minds of the nineteenth century: Herschel, Darwin, Babbage and Lovelace. Then it brings us, via JCR Licklider, to the present state of the web before looking ahead to what the future might bring.

So what will the life of an interface designer be like in the year 2120? or 2121 even? A nice round 300 years after Babbage first had the idea of calculations being executed by steam.

I think there are some missteps along the way (I certainly don’t think that inline styles—AKA CSS in JS—are necessarily a move forwards) but I love the idea of applying chaos engineering to web design:

Think of every characteristic of an interface you depend on to not ‘fail’ for your design to ‘work.’ Now imagine if these services were randomly ‘failing’ constantly during your design process. How might we design differently? How would our workflows and priorities change?
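
As a toy version of that thought experiment (my own sketch, not from the article), you can paste something like this into the console and watch what your design does when its dependencies randomly vanish:

    // Chaos sketch: randomly "fail" some of the page's characteristics.
    (function chaos() {
      // Simulate stylesheets that never arrived
      for (const sheet of document.styleSheets) {
        if (Math.random() < 0.5) sheet.disabled = true;
      }
      // Simulate images that failed to load
      for (const img of document.images) {
        if (Math.random() < 0.5) img.removeAttribute('src');
      }
    })();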

In defence of graceful degradation and where progressive enhancement comes in by Adam Silver

This does a really good job of describing the difference between progressive enhancement and graceful degradation …but I don’t buy the conclusion: I don’t think that feature detection equates to graceful degradation. I do agree though that, when it comes to JavaScript, the result of progressive enhancement is that the language degrades gracefully.

This is progressive enhancement. An approach to making interfaces that ensures JavaScript degrades gracefully—something that HTML and CSS do automatically.

But there’s a difference between something degrading gracefully (the result) and graceful degradation (the approach).
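
To make the distinction concrete, here’s a small sketch of my own. The result is JavaScript degrading gracefully: the enhancement either happens or it doesn’t, and the baseline never depended on it.

    <!-- Without JavaScript this is just a readable snippet; people can
         still select and copy it by hand. -->
    <pre id="snippet">npm install example-package</pre>

    <script>
      // Enhancement: a copy button that only exists if this script runs.
      // If the script never arrives, nothing breaks and nothing is missing.
      const snippet = document.getElementById('snippet');
      const button = document.createElement('button');
      button.textContent = 'Copy';
      button.addEventListener('click', function () {
        navigator.clipboard.writeText(snippet.textContent);
      });
      snippet.after(button);
    </script>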

How I failed the <a>

I think the situation that Remy outlines here is quite common (in client-rehydrated server-rendered pages), but what’s less common is Remy’s questioning and iteration.

So I now have a simple rule of thumb: if there’s an onClick, there’s got to be an anchor around the component.
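
Remy’s examples are framework components, but here’s a hedged sketch of my own showing the same rule in plain HTML and JavaScript (openPlayer is a hypothetical client-side helper):

    <!-- The anchor is the baseline: before hydration, without JavaScript,
         or when the script fails, it still navigates to a real page. -->
    <a href="/videos/123" class="video-card">Watch the video</a>

    <script>
      // Enhancement: intercept the click for an in-page player, but only
      // once this script is actually running.
      document.querySelectorAll('a.video-card').forEach(function (link) {
        link.addEventListener('click', function (event) {
          event.preventDefault();
          openPlayer(link.href); // hypothetical in-page player
        });
      });
    </script>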

Stuffing the Front End

53% of mobile visits leave a page that takes longer than 3 seconds to load. That means that a large number of visitors probably abandoned these sites because they were staring at a blank screen for 3 seconds, said “fuck it,” and left approximately half way before the page showed up. The fact that the next page interaction would have been quicker—assuming all the JS files even downloaded correctly in the first attempt—doesn’t amount to much if they didn’t stick around for the first page to load. What was gained by putting the business logic in the front end in this scenario?

The Lean Web video from Boston CSS | Go Make Things

A good talk from Chris Ferdinandi, who says:

One of the central themes of my talk on The Lean Web is that we as developers repeatedly take all of the great things the web and browsers give us out-of-the-box, break them, and then re-implement them poorly with JavaScript.

A Simpler Web: I Concur

Tales of over-engineering, as experienced by Bridget. This resonates with me, and I think she’s right when she says that these things go in cycles. The pendulum always ends up swinging the other way eventually.

New Adventures 2019 | Part Two: Progressive Web | Abstrakt

Here’s a thorough blow-by-blow account of the workshop I ran in Nottingham last week:

Jeremy’s workshop was a fascinating insight into resilience and how to approach a web project with ubiquity and consistency in mind from both a design and development point of view.

Openness and Longevity

A really terrific piece from Garrett on the nature of the web:

Markup written almost 30 years ago runs exactly the same today as it did then without a single modification. At the same time, the platform has expanded to accommodate countless enhancements. And you don’t need a degree in computer science to understand or use the vast majority of it. Moreover, a well-constructed web page today would still be accessible on any browser ever made. Much of the newer functionality wouldn’t be supported, but the content would be accessible.

I share his concerns about the maintainability overhead introduced by new tools and frameworks:

I’d argue that for every hour these new technologies have saved me, they’ve cost me another in troubleshooting or upgrading the tool due to a web of invisible dependencies.

The Elements of UI Engineering - Overreacted

These are good challenges to think about. Almost all of them are user-focused, and there’s a refreshing focus away from reaching for a library:

It’s tempting to read about these problems with a particular view library or a data fetching library in mind as a solution. But I encourage you to pretend that these libraries don’t exist, and read again from that perspective. How would you approach solving these issues?

The Hurricane Web | Max Böck - Frontend Web Developer

When a storm comes, some of the big news sites like CNN and NPR strip down to a zippy performant text-only version that delivers the content without the bells and whistles.

I’d argue though that in some aspects, they are actually better than the original.

The numbers:

The “full” NPR site in comparison takes ~114 requests and weighs close to 3MB on average. Time to first paint is around 20 seconds on slow connections. It includes ads, analytics, tracking scripts and social media widgets.

Meanwhile, the actual news content is roughly the same.

I quite like the idea of storm-driven development.

…websites built for a storm do not rely on Javascript. The benefit simply does not outweigh the cost. They rely on resilient HTML, because that’s all that is really necessary here.

How do you mark up an accordion? — Sara Soueidan

I love this deep dive that Sara takes into the question of marking up content for progressive disclosure. It reminds me of Dan’s SimpleQuiz from back in the day.

Then there’s this gem, which I think is a terrifically succinct explanation of the importance of meaningful markup:

It’s always necessary, in my opinion, to consider what content would render and look like in foreign environments, or in environments that are not controlled by our own styles and scripts. Writing semantic HTML is the first step in achieving truly resilient Web sites and applications.
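
One of the candidates Sara weighs is the browser’s built-in disclosure widget. A minimal version (my sketch, not her conclusion) shows why it’s attractive from a resilience point of view: the content stays in the document regardless, and the toggling behaviour comes from the browser rather than from our scripts.

    <details>
      <summary>What is your refund policy?</summary>
      <p>Full refunds within 30 days of purchase.</p>
    </details>
    <details>
      <summary>Do you ship internationally?</summary>
      <p>Yes, to most countries.</p>
    </details>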

The Web is Made of Edge Cases by Taylor Hunt on CodePen

Oh, this is magnificent! A rallying call for everyone designing and developing on the web to avoid making any assumptions about the people we’re building for:

People will use your site how they want, and according to their means. That is wonderful, and why the Web was built.

I would even say that the % of people viewing your site the way you do rapidly approaches zilch.

Short note on progressive ARIA by The Paciello Group

Léonie makes a really good point here: if you’re adding aria attributes to indicate interactions you’re making available through JavaScript, then make sure you also use JavaScript to add those aria attributes.
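
In other words, the script that creates the interaction should be the same script that describes it. A sketch of my own, for a collapsible navigation menu:

    <nav id="menu">…</nav>

    <script>
      // The menu is fully visible without JavaScript. The toggle button,
      // its behaviour, and the aria-expanded state that describes that
      // behaviour are all added here, so the markup never promises an
      // interaction that isn't actually available.
      const menu = document.querySelector('#menu');
      const button = document.createElement('button');
      button.textContent = 'Menu';
      button.setAttribute('aria-expanded', 'false');
      button.addEventListener('click', function () {
        const expanded = button.getAttribute('aria-expanded') === 'true';
        button.setAttribute('aria-expanded', String(!expanded));
        menu.hidden = expanded;
      });
      menu.hidden = true;
      menu.before(button);
    </script>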

User agent as a runtime | Blog | Decade City

Ultimately you can’t control when and how things go wrong but you do have some control over what happens. This is why progressive enhancement exists.