Wednesday, October 21st, 2020

Van11y: Accessibility and Vanilla JavaScript - ES2015

Van11y (for Vanilla-Accessibility) is a collection of accessible scripts for rich interface elements, built using progressive enhancement and customisable.

Monday, October 19th, 2020

Continuous partial browser support

Vendor prefixes didn’t work. The theory was sound. It was a way of marking CSS and JavaScript features as being experimental. Developers could use the prefixed properties as long as they understood that those features weren’t to be relied upon.

That’s not what happened though. Developers used vendor-prefixed properties as though they were stable. Tutorials were published that basically said “Go ahead and use these vendor-prefixed properties and ship it!” There were even tools that would add the prefixes for you so you didn’t have to type them out for yourself.

Browsers weren’t completely blameless either. Long after features were standardised, they would only be supported in their prefixed form. Apple was and is the worst for this. To this day, if you want to use the clip-path property in your CSS, you’ll need to duplicate your declaration with -webkit-clip-path if you want to support Safari. It’s been like that for seven years and counting.

Like capitalism, vendor prefixes were one of those ideas that sounded great in theory but ended up being unworkable in practice.

Still, developers need some way to get their hands on experimental features. But we don’t want browsers to ship experimental features without some kind of safety mechanism.

The current thinking involves something called origin trials. Here’s the explainer from Microsoft Edge and here’s Google Chrome’s explainer:

  • Developers are able to register for an experimental feature to be enabled on their origin for a fixed period of time measured in months. In exchange, they provide us their email address and agree to give feedback once the experiment ends.
  • Usage of these experiments is constrained to remain below Chrome’s deprecation threshold (< 0.5% of all Chrome page loads) by a system which automatically disables the experiment on all origins if this threshold is exceeded.

I think it works pretty well. If you’re really interested in kicking the tyres on an experimental feature, you can opt in to the origin trial. But it’s very clear that you wouldn’t want to ship it to production.

That said…

You could ship something that’s behind an origin trial, but you’d have to make sure you’re putting safeguards in place. At the very least, you’d need to do feature detection. You certainly couldn’t use an experimental feature for anything mission critical …but you could use it as an enhancement.

And that is a pretty great way to think about all web features, experimental or otherwise. Don’t assume the feature will be supported. Use feature detection (or @supports in the case of CSS). Try to use the feature as an enhancement rather than a dependency.
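
Here’s a minimal sketch of that mindset in JavaScript. The specific features and class name are just placeholder examples, not recommendations:

// Detect support before relying on a CSS feature from a script
// (CSS.supports is the scripting counterpart of @supports)
if (window.CSS && CSS.supports('clip-path', 'circle(50%)')) {
  document.body.classList.add('has-clip-path');
}

// Detect a JavaScript API the same way
if ('IntersectionObserver' in window) {
  // set up the enhancement here
}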

If you treat all browser features as though they’re behind an origin trial, then suddenly the landscape of browser support becomes more navigable. Instead of looking at the support table for something on caniuse.com and thinking, “I wish more browsers supported this feature so that I could use it!”, you can instead think “I’m going to use this feature today, but treat it as an experimental feature.”

You can also do it for well-established features like querySelector, addEventListener, and geolocation. Instead of assuming that browser support is universal, it doesn’t hurt to take a more defensive approach. Assume nothing. Acknowledge and embrace unpredictability.
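
For example, here’s a sketch of that defensive approach with geolocation, treating it purely as an enhancement (the #location field is a hypothetical example):

// Only enhance if the API is there; the page works fine without it
if ('geolocation' in navigator) {
  navigator.geolocation.getCurrentPosition(function (position) {
    var field = document.querySelector('#location');
    if (field) {
      field.value = position.coords.latitude + ',' + position.coords.longitude;
    }
  });
}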

The debacle with vendor prefixes shows what happens if we treat experimental features as though they’re stable. So let’s flip that around. Let’s treat stable features as though they’re experimental. If you cultivate that mindset, your websites will be more robust and resilient.

Thursday, October 15th, 2020

Progressier | Make your website a PWA in 42 seconds

This is an intriguing promise (there’s no code yet):

A PWA typically requires writing a service worker, an app manifest and a ton of custom code. Progressier flattens the learning curve. Just add it to your html template — you’re done.

I worry that this one line of code will pull in many, many, many, many lines of JavaScript.

Wednesday, October 14th, 2020

Saving forms

I added a long-overdue enhancement to The Session recently. Here’s the scenario…

You’re on a web page with a comment form. You type your well-considered thoughts into a textarea field. But then something happens. Maybe you accidentally navigate away from the page or maybe your network connection goes down right when you try to submit the form.

This is a textbook case for storing data locally on the user’s device …at least until it has safely been transmitted to the server. So that’s what I set about doing.

My first decision was choosing how to store the data locally. There are multiple APIs available: sessionStorage, IndexedDB, localStorage. It was clear that sessionStorage wasn’t right for this particular use case: I needed the data to be saved across browser sessions. So it was down to IndexedDB or localStorage. IndexedDB is the more versatile and powerful—because it’s asynchronous—but localStorage is nice and straightforward so I decided on that. I’m not sure if that was the right decision though.

Alright, so I’m going to store the contents of a form in localStorage. It accepts key/value pairs. I’ll make the key the current URL. The value will be the contents of that textarea. I can store other form fields too. Even though localStorage technically only stores one value, that value can be a JSON object so in reality you can store multiple values with one key (just remember to parse the JSON when you retrieve it).
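
In code, that might look something like this (a rough sketch, assuming a single textarea on the page):

var key = window.location.href;
var textarea = document.querySelector('textarea');

// localStorage only stores strings, so stringify an object of values…
localStorage.setItem(key, JSON.stringify({
  comment: textarea.value
}));

// …and parse it again when retrieving
var stored = JSON.parse(localStorage.getItem(key) || '{}');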

Now I know what I’m going to store (the textarea contents) and how I’m going to store it (localStorage). The next question is when should I do it?

I could play it safe and store the comment whenever the user presses a key within the textarea. But that seems like overkill. It would be more efficient to only save when the user leaves the current page for any reason.

Alright then, I’ll use the unload event. No! Bad Jeremy! If I use that then the browser can’t reliably add the current page to the cache it uses for faster back-forwards navigations. The page life cycle is complicated.

So beforeunload then? Well, maybe. But modern browsers also support a pagehide event that looks like a better option.

In either case, just adding a listener for the event could screw up the caching of the page for back-forwards navigations. I should only listen for the event if I know that I need to store the contents of the textarea. And in order to know if the user has interacted with the textarea, I’m back to listening for key presses again.

But wait a minute! I don’t have to listen for every key press. If the user has typed anything, that’s enough for me. I only need to listen for the first key press in the textarea.

Handily, addEventListener accepts an object of options. One of those options is called “once”. If I set that to true, then the event listener is only fired once.
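
Something along these lines (a sketch, assuming a single textarea; the choice of keyup is arbitrary):

var textarea = document.querySelector('textarea');
// This listener fires once and is then automatically removed
textarea.addEventListener('keyup', function () {
  // now it's worth listening for the page being unloaded
}, { once: true });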

So I set up a cascade of event listeners. If the user types anything into the textarea, that fires an event listener (just once) that then adds the event listener for when the page is unloaded—and that’s when the textarea contents are put into localStorage.

I’ve abstracted my code into a gist. Here’s what it does (there’s a rough sketch of these steps after the list):

  1. Cut the mustard. If this browser doesn’t support localStorage, bail out.
  2. Set the localStorage key to be the current URL.
  3. If there’s already an entry for the current URL, update the textarea with the value in localStorage.
  4. Write a function to store the contents of the textarea in localStorage but don’t call the function yet.
  5. The first time that a key is pressed inside the textarea, start listening for the page being unloaded.
  6. When the page is being unloaded, invoke that function that stores the contents of the textarea in localStorage.
  7. When the form is submitted, remove the entry in localStorage for the current URL.
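
Here’s a rough sketch of those steps. It’s not the actual gist, just a minimal illustration assuming a single form with one textarea on the page:

(function () {
  // 1. Cut the mustard: bail out if localStorage isn't supported
  if (!('localStorage' in window)) return;
  var form = document.querySelector('form');
  if (!form) return;
  var textarea = form.querySelector('textarea');
  if (!textarea) return;
  // 2. The localStorage key is the current URL
  var key = window.location.href;
  // 3. If there's already an entry for this URL, restore it into the textarea
  var stored = localStorage.getItem(key);
  if (stored) {
    textarea.value = stored;
  }
  // 4. A function to store the contents of the textarea (not called yet)
  var storeContents = function () {
    localStorage.setItem(key, textarea.value);
  };
  // 5. The first time a key is pressed, start listening for the page being unloaded…
  textarea.addEventListener('keyup', function () {
    // 6. …and when the page is hidden, store the contents of the textarea
    window.addEventListener('pagehide', storeContents);
  }, { once: true });
  // 7. When the form is submitted, remove the entry for this URL
  form.addEventListener('submit', function () {
    localStorage.removeItem(key);
  });
})();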

That last step isn’t something I’m doing on The Session. Instead I’m relying on getting something back from the server to indicate that the form was successfully submitted. If you can do something like that, I’d recommend that instead of listening to the form submission event. After all, something could still go wrong between the form being submitted and the data being received by the server.

Still, this bit of code is better than nothing. Remember, it’s intended as an enhancement. You should be able to drop it into any project and improve the user experience a little bit. Ideally, no one will ever notice it’s there—it’s the kind of enhancement that only kicks in when something goes wrong. A little smidgen of resilient web design. A defensive enhancement.

Tuesday, October 6th, 2020

Nils Binder’s Website

The “Adjust CSS” slider on this delightful homepage is an effective (and cute) illustration of progressive enhancement in action.

Monday, September 28th, 2020

Chris Ferdinandi: The Lean Web | July 2020 - YouTube

A great presentation on taking a sensible approach to web development. Great advice, as always, from the blogging machine that is Chris Ferdinandi.

The web is a bloated, over-engineered mess. And, according to developer and educator Chris Ferdinandi, many of our modern “best practices” are actually making the web worse. In this talk, Chris explores The Lean Web, a set of principles for a simpler, faster world-wide web.

Chris Ferdinandi: The Lean Web | July 2020

Saturday, September 26th, 2020

On not choosing WordPress for the W3C redesign project - Working in the open with W3C and Studio 24

The use of React complicates front-end build. We have very talented front-end developers, however, they are not React experts - nor should they need to be. I believe front-end should be built as standards-compliant HTML/CSS with JavaScript used to enrich functionality where necessary and appropriate.

Wednesday, September 9th, 2020

AVIF has landed - JakeArchibald.com

There’s a new image format on the browser block and it’s very performant indeed. Jake has all the details you didn’t ask for.

Wednesday, August 26th, 2020

Submitting a form with datalist

I’m a big fan of HTML5’s datalist element and its elegant design. It’s a way to progressively enhance any input element into a combobox.

You use the list attribute on the input element to point to the ID of the associated datalist element.

<label for="homeworld">Your home planet</label>
<input type="text" name="homeworld" id="homeworld" list="planets">
<datalist id="planets">
 <option value="Mercury">
 <option value="Venus">
 <option value="Earth">
 <option value="Mars">
 <option value="Jupiter">
 <option value="Saturn">
 <option value="Uranus">
 <option value="Neptune">
</datalist>

It even works on input type="color", which is pretty cool!

The most common use case is as an autocomplete widget. That’s how I’m using it over on The Session, where the datalist is updated via Ajax every time the input is updated.

But let’s stick with a simple example, like the list of planets above. Suppose the user types “jup” …the datalist will show “Jupiter” as an option. The user can click on that option to automatically complete their input.

It would be handy if you could automatically submit the form when the user chooses a datalist option like this.

Well, tough luck.

The datalist element emits no events. There’s no way of telling if it has been clicked. This is something I’ve been trying to find a workaround for.

I got my hopes up when I read Amber’s excellent article about document.activeElement. But no, the focus stays on the input when the user clicks on an option in a datalist.

So if I can’t detect whether a datalist has been used, the best I can do is try to infer it. I know it’s not exactly the same thing, and it won’t be as reliable as true detection, but here’s my logic:

  • Keep track of the character count in the input element.
  • Every time the input is updated in any way, check the current character count against the last character count.
  • If the difference is greater than one, something interesting happened! Maybe the user pasted a value in …or maybe they used the datalist.
  • Loop through each of the options in the datalist.
  • If there’s an exact match with the current value of the input element, chances are the user chose that option from the datalist.
  • So submit the form!

Here’s how that translates into DOM scripting code:

document.querySelectorAll('input[list]').forEach( function (formfield) {
  // The datalist associated with this input via its list attribute
  var datalist = document.getElementById(formfield.getAttribute('list'));
  // Keep track of how many characters were in the input last time
  var lastlength = formfield.value.length;
  var checkInputValue = function (inputValue) {
    // If the value grew by more than one character, the user probably
    // pasted something in, or chose an option from the datalist
    if (inputValue.length - lastlength > 1) {
      datalist.querySelectorAll('option').forEach( function (item) {
        // An exact match with one of the options? Assume the datalist was used
        if (item.value === inputValue) {
          formfield.form.submit();
        }
      });
    }
    lastlength = inputValue.length;
  };
  // Check the value every time the input is updated in any way
  formfield.addEventListener('input', function () {
    checkInputValue(this.value);
  }, false);
});

I’ve made a gist with some added feature detection and mustard-cutting at the start. You should be able to drop it into just about any page that’s using datalist. It works even if the options in the datalist are dynamically updated, like the example on The Session.
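
The mustard-cutting in the gist might differ, but a typical check for a script like this could look something like:

// Cut the mustard: only run the enhancement in capable browsers
if ('querySelector' in document && 'addEventListener' in window && window.NodeList && 'forEach' in NodeList.prototype) {
  // …the datalist-detection code above goes here…
}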

It’s not foolproof. The inference relies on the difference between what was previously typed and what’s autocompleted being more than one character. So in the planets example, if someone has typed “Jupite” and then they choose “Jupiter” from the datalist, the form won’t automatically submit.

But still, I reckon it covers most common use cases. And like the datalist element itself, you can consider this functionality a progressive enhancement.

Sunday, August 9th, 2020

Dream speak

I had a double-whammy of a stress dream during the week.

I dreamt I was at a conference where I was supposed to be speaking, but I wasn’t prepared, and I wasn’t where I was supposed to be when I was supposed to be there. Worse, my band were supposed to be playing a gig on the other side of town at the same time. Not only was I panicking about getting myself and my musical equipment to the venue on time, I was also freaking out because I couldn’t remember any of the songs.

You don’t have to be Sigmund freaking Freud to figure out the meanings behind these kinds of dreams. But usually these kinds of stress dreams are triggered by some upcoming event like, say, oh, I don’t know, speaking at a conference or playing a gig.

I felt really resentful when I woke up from this dream in a panic in the middle of the night. Instead of being a topical nightmare, I basically had the equivalent of one of those dreams where you’re back at school and it’s the day of the exam and you haven’t prepared. But! When, as an adult, you awake from that dream, you have that glorious moment of remembering “Wait! I’m not in school anymore! Hallelujah!” Whereas with my double-booked stress dream, I got all the stress of the nightmare, plus the waking realisation that “Ah, shit. There are no more conferences. Or gigs.”

I miss them.

Mind you, there is talk of re-entering the practice room at some point in the near future. Playing gigs is still a long way off, but at least I could play music with other people.

Actually, I got to play music with other people this weekend. The music wasn’t Salter Cane, it was traditional Irish music. We gathered in a park, and played together while still keeping our distance. Jessica has written about it in her latest journal entry:

It wasn’t quite a session, but it was the next best thing, and it was certainly the best we’re going to get for some time. And next week, weather permitting, we’ll go back and do it again. The cautious return of something vaguely resembling “normality”, buoying us through the hot days of a very strange summer.

No chance of travelling to speak at a conference though. On the plus side, my carbon footprint has never been lighter.

Online conferences continue. They’re not the same, but they can still be really worthwhile in their own way.

I’ll be speaking at An Event Apart: Front-end Focus on Monday, August 17th (and I’m very excited to see Ire’s talk). I’ll be banging on about design principles for the web:

Designing and developing on the web can feel like a never-ending crusade against the unknown. Design principles are one way of unifying your team to better fight this battle. But as well as the design principles specific to your product or service, there are core principles underpinning the very fabric of the World Wide Web itself. Together, we’ll dive into applying these design principles to build websites that are resilient, performant, accessible, and beautiful.

Tickets are $350 but I can get you a discount. Use the code AEAJER to get $50 off.

I wonder if I’ll have online-appropriate stress dreams in the next week? “My internet is down!”, “I got the date and time wrong!”, “I’m not wearing any trousers!”

Actually, that’s pretty much just my waking life these days.

Saturday, August 1st, 2020

The amazing power of service workers | Go Make Things

So, why would you want to use a service worker? Here are some cool things you can do with it.

Chris lists some of the ways a service worker can enhance user experience.

Thursday, July 30th, 2020

Lateral Thinking With Withered Technology · Matthias Ott – User Experience Designer

What web development can learn from the Nintendo Game and Watch.

The Web now consists of an ever-growing number of different frameworks, methodologies, screen sizes, devices, browsers, and connection speeds. “Lateral thinking with withered technology” – progressively enhanced – might actually be an ideal philosophy for building accessible, performant, resilient, and original experiences for a wide audience of users on the Web.

Friday, July 24th, 2020

Progressive · Matthias Ott – User Experience Designer

Progressive enhancement is not yet another technology or passing fad. It is a lasting strategy, a principle, to deal with complexity because it lets you build inclusive, resilient experiences that work across different contexts and that will continue to work, once the next fancy JavaScript framework enters the scene – and vanishes again.

But why don’t more people practice progressive enhancement? Is it only because they don’t know better? This might, in fact, be the primary reason. On top of that, especially many JavaScript developers seem to believe that it is not possible or necessary to build modern websites and applications that way.

A heartfelt look at progressive enhancement:

Some look at progressive enhancement like a thing from the past of which the old guard just can’t let go. But to me, progressive enhancement is the future of the Web. It is the basis for building resilient, performant, interoperable, secure, usable, accessible, and thus inclusive experiences. Not only for the Web of today but for the ever-growing complexity of an ever-changing and ever-evolving Web.

MSEdgeExplainers/explainer.md at main · MicrosoftEdge/MSEdgeExplainers

This is great! Ideas for allowing more styling of form controls. I agree with the goals 100% and I like the look of the proposed solutions too.

The team behind this are looking for feedback so be sure to share your thoughts (I’ll probably formulate mine into a blog post).

Saturday, July 18th, 2020

Indexing your offline-capable pages with the Content Indexing API

A Chrome-only API for adding offline content to an index that can be exposed in Android’s “downloads” list. It just shipped in the latest version of Chrome.

I’m not a fan of browser-specific non-standards but you can treat this as an enhancement—implementing it doesn’t harm non-supporting browsers and you can use feature detection to test for it.
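
That feature detection might look something like this (a sketch, assuming you already have a registered service worker):

if ('serviceWorker' in navigator) {
  navigator.serviceWorker.ready.then(function (registration) {
    // Only use the Content Indexing API if this browser exposes it
    if ('index' in registration) {
      // registration.index is available here
    }
  });
}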

Works offline

How do we tell our visitors our sites work offline? How do we tell our visitors that they don’t need an app because it’s no more capable than the URL they’re on right now?

Remy expands on his call for ideas on branding websites that work offline with a universal symbol, along the lines of what we had with RSS.

What I’d personally like to see as an outcome: some simple iconography that I can use on my own site and other projects that can offer ambient badging to reassure my visitor that the URL they’re visiting will work offline.

An Introduction To Stimulus.js — Smashing Magazine

An intro to Stimulus, the lightweight JavaScript library from Basecamp that takes a progressive enhancement approach, as seen with HEY.

One aspect I really like about the approach Stimulus encourages, is I can focus on sending HTML down the wire to my users, which is then jazzed up a little with JavaScript.

I’ve always been a fan of using the first few milliseconds of a user’s attention getting what I have to share with them — in front of them. Then worrying about setting up the interaction layer while the user can start processing what they’re seeing.

Furthermore, if the JavaScript were to fail for whatever reason, the user can still see the content and interact with it without JavaScript.

Thursday, July 16th, 2020

Hey now

Progressive enhancement is at the heart of everything I do on the web. It’s the bedrock of my speaking and writing too. Whether I’m writing about JavaScript, Ajax, HTML, or service workers, it’s always through the lens of progressive enhancement. Sometimes I explicitly bang the drum, like with Resilient Web Design. Other times I don’t mention it by name at all, and instead talk only about its benefits.

I sometimes get asked to name some examples of sites that still offer their core functionality even when JavaScript fails. I usually mention Amazon.com, although that has other issues. But quite often I find that a lot of the examples I might mention are dismissed as not being “web apps” (whatever that means).

The pushback I get usually takes the form of “Well, that approach is fine for websites, but it wouldn’t work for something like Gmail.”

It’s always Gmail. Which is odd. Because if you really wanted to flummox me with a product or service that defies progressive enhancement, I’d have a hard time with something like, say, a game (although it would be pretty cool to build a text adventure that’s progressively enhanced into a first-person shooter). But an email client? That would work.

Identify core functionality.

Read emails. Write emails.

Make that functionality available using the simplest possible technology.

HTML for showing a list of emails, HTML for displaying the contents of each email, HTML for the form you write the response in.

Enhance!

Now add all the enhancements that improve the experience—keyboard shortcuts; Ajax instead of full-page refreshes; local storage, all that stuff.

Can you build something that works just like Gmail without using any JavaScript? No. But that’s not what progressive enhancement is about. It’s about providing the core functionality (reading and writing emails) with the simplest possible technology (HTML) and then enhancing using more powerful technologies (like JavaScript).

Progressive enhancement isn’t about making a choice between using simpler more robust technologies or using more advanced features; it’s about using simpler more robust technologies and then using more advanced features. Have your cake and eat it.

Fortunately I no longer need to run this thought experiment to imagine what it would be like if something like Gmail were built with a progressive enhancement approach. That’s what HEY is.

Sam Stephenson describes the approach they took:

HEY’s UI is 100% HTML over the wire. We render plain-old HTML pages on the server and send them to your browser encoded as text/html. No JSON APIs, no GraphQL, no React—just form submissions and links.

If you think that sounds like the web of 25 years ago, you’re right! Except the HEY front-end stack progressively enhances the “classic web” to work like the “2020 web,” with all the fidelity you’d expect from a well-built SPA.

See? It’s not either resilient or modern—it’s resilient and modern. Have your cake and eat it.

And yet this supremely sensible approach is not considered “modern” web development:

The architecture astronauts who, for the past decade, have been selling us on the necessity of React, Redux, and megabytes of JS, cannot comprehend the possibility of building an email app in 2020 with server-rendered HTML.

HEY isn’t perfect by any means—they’ve got a lot of work to do on their accessibility. But it’s good to have a nice short answer to the question “But what about something like Gmail?”

It reminds me of responsive web design:

When Ethan Marcotte demonstrated the power of responsive design, it was met with resistance. “Sure, a responsive design might work for a simple personal site but there’s no way it could scale to a large complex project.”

Then the Boston Globe launched its responsive site. Microsoft made their homepage responsive. The floodgates opened again.

It’s a similar story today. “Sure, progressive enhancement might work for a simple personal site, but there’s no way it could scale to a large complex project.”

The floodgates are ready to open. We just need you to create the poster child for resilient web design.

It looks like HEY might be that poster child.

I have to wonder whether it’s a coincidence or a connection that this is a service that’s also tackling ethical issues like tracking. Their focus is very much on people above technology. They’ve taken a human-centric approach to their product and a human-centric approach to web development …because ultimately, that’s what progressive enhancement is.

Wednesday, July 15th, 2020

Round 1: post your ideas / designs · Issue #1 · works-offline/logo

This is an interesting push by Remy to try to figure out a way we can collectively indicate to users that a site works offline.

Well, seeing as browsers have completely dropped the ball on any kind of ambient badging, it’s fair enough that we take matters into our own hands.

Wednesday, July 1st, 2020

Fussy Web, True Meaning. · Matthias Ott – User Experience Designer

Websites are primarily seen as functional software, built to fulfill a business objective and to reach quantifiable goals. The field of user experience is obsessed with KPIs, jobs-to-be-done, optimized user flows, and conversion rates. And in quest of ever more efficient processes – and in the spirit of true modernists –, design and development teams try to standardize solutions into reusable templates and components, streamlined pattern libraries, and scalable design systems.