Link tags: speed

Visitors, Developers, or Machines

Garrett’s observation is spot-on here:

I’ve been trying to understand the appeal of these frameworks by giving them an objective chance. I’ve expanded my knowledge of JavaScript and tried to give them the benefit of the doubt. They do have their places, but the only explanation I can come up with is that developers are taking a similar approach as Ruby and focusing on developer convenience and productivity. Only, instead of Ruby’s performance being tied to the CPU level, JavaScript frameworks push the performance burden to the client.

In both cases, the tradeoff happens in the name of developer happiness and productivity, but the strategies have entirely different consequences. With Ruby, the CPU is still (mostly) the responsibility of the development team, and it can be upgraded. With JavaScript, the page weight becomes an externality pushed onto visitors.

Why 543 KB keep me up at night - Manuel Matuzović

How and when did I get to the point where I would consider a page weight of 4 MB on a large page and 500 KB on a small page normal?

This isn’t just a well-earned rant from Manuel. I mean, it *is* that, but it’s also packed with practical performance advice.

Reflections on software performance - Made of Bugs

I’ve really come to appreciate that performance isn’t just some property of a tool independent from its functionality or its feature set. Performance — in particular, being notably fast — is a feature in and of its own right, which fundamentally alters how a tool is used and perceived.

This is a fascinating look into how performance has knock-on effects beyond the obvious:

It’s probably fairly intuitive that users prefer faster software, and will have a better experience performing a given task if the tools are faster rather than slower.

What is perhaps less apparent is that having faster tools changes how users use a tool or perform a task.

This observation is particularly salient for web developers:

We have become accustomed to casually giving up factors of two or ten or more with our choices of tools and libraries, without asking if the benefits are worth it.

Web bloat

Pages are often designed so that they’re hard or impossible to read if some dependency fails to load. On a slow connection, it’s quite common for at least one dependency to fail.

Fire up Reader Mode and read this excellent article informed by data from using a typically slow connection in rural USA today. Two findings are:

  1. A large fraction of the web is unusable on a bad connection. Even on a good (0% packetloss, no ping spike) dialup connection, some sites won’t load.
  2. Some sites will use a lot of data!

CrUX.RUN

This is so useful! Get instant results from Google’s Chrome User Experience Report without having to wait (or pay) for BigQuery.

Here’s an example of my site’s metrics over the last few months, complete with nice charts.
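
If you’d rather pull the same data programmatically, Google also exposes the Chrome UX Report through an API. Here’s a rough sketch in TypeScript; the endpoint, request shape, and metric names are my assumptions about that API rather than anything CrUX.RUN documents, so verify them (and bring your own API key) before leaning on it:

```typescript
// crux.ts: a speculative sketch of querying the Chrome UX Report API.
// The endpoint and response shape below are assumptions; check Google's
// CrUX API documentation and substitute a real API key.
const API_KEY = 'YOUR_API_KEY';
const ENDPOINT = `https://chromeuserexperience.googleapis.com/v1/records:queryRecord?key=${API_KEY}`;

async function queryCrux(origin: string): Promise<void> {
  const response = await fetch(ENDPOINT, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ origin }),
  });
  if (!response.ok) {
    throw new Error(`CrUX query failed: ${response.status}`);
  }

  const { record } = await response.json();
  // Each metric comes back as a histogram of good/needs-improvement/poor
  // buckets plus a 75th-percentile value.
  const lcp = record.metrics['largest_contentful_paint'];
  console.log(`${origin} p75 Largest Contentful Paint: ${lcp.percentiles.p75} ms`);
}

queryCrux('https://example.com').catch(console.error);
```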

Page Speed Benchmarks | SpeedCurve

This is going to be so useful for client work at Clearleft—instant snapshots of performance metrics across industries and regions!

For example, we’ve been working a lot with the travel sector, and now we can call up these benchmarks without having to generate a whole bunch of WebPageTest results ourselves.

See Tammy’s blog post for more details.

Innovation Can’t Keep the Web Fast | CSS-Tricks

I’ve come to accept that our current approach to remedy poor performance largely consists of engineering techniques that stem from the ill effects of our business, product management, and engineering practices. We’re good at applying tourniquets, but not so good at sewing up deep wounds.

It’s becoming increasingly clear that web performance isn’t solely an engineering problem, but a problem of people.

Software disenchantment @ tonsky.me

I want to deliver working, stable things. To do that, we need to understand what we are building, in and out, and that’s impossible to do in bloated, over-engineered systems.

This pairs nicely with Craig’s post on fast software.

Everyone is busy building stuff for right now, today, rarely for tomorrow. But it would be nice to also have stuff that lasts a little longer than that.

I just got a new laptop and I decided to go with fresh installs rather than a migration. This really resonates:

It just seems that nobody is interested in building quality, fast, efficient, lasting, foundational stuff anymore. Even when efficient solutions have been known for ages, we still struggle with the same problems: package management, build systems, compilers, language design, IDEs.

Smaller HTML Payloads with Service Workers — Philip Walton

This is a great progressive enhancement for performance that uses a service worker to combine reusable bits of a page with fresh content. The numbers are very convincing!

Alas, the code is using the Workbox library, but figuring out the vanilla code to write shouldn’t be too tricky seeing as Philip talks through his logic step by step.
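
For flavour, here’s roughly what a vanilla version might look like. To be clear, this is a minimal sketch of the idea rather than Philip’s actual code: it assumes the server can return just a page’s unique content when asked (the `fragment` query parameter is a made-up convention), and it buffers the pieces with `text()` instead of streaming them the way Philip does:

```typescript
// sw.ts: cache the reusable shell once, then assemble each navigation
// from the cached shell plus freshly fetched content.
declare const self: ServiceWorkerGlobalScope;

const SHELL_CACHE = 'shell-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(SHELL_CACHE).then((cache) =>
      cache.addAll(['/shell-start.html', '/shell-end.html'])
    )
  );
});

self.addEventListener('fetch', (event) => {
  const { request } = event;
  // Only handle top-level page navigations.
  if (request.mode !== 'navigate') {
    return;
  }

  event.respondWith((async () => {
    const cache = await caches.open(SHELL_CACHE);
    const [start, end] = await Promise.all([
      cache.match('/shell-start.html'),
      cache.match('/shell-end.html'),
    ]);

    // If the shell isn't cached yet, fall back to a normal page load.
    if (!start || !end) {
      return fetch(request);
    }

    // Ask the server for just the unique content of this page
    // (a hypothetical convention; your server would need to support it).
    const contentURL = new URL(request.url);
    contentURL.searchParams.set('fragment', 'body');

    // Stitch cached shell and fresh content into one HTML response.
    const [top, content, bottom] = await Promise.all([
      start.text(),
      fetch(contentURL.href).then((response) => response.text()),
      end.text(),
    ]);
    return new Response(top + content + bottom, {
      headers: { 'Content-Type': 'text/html' },
    });
  })());
});
```

Register it from the page with the usual `navigator.serviceWorker.register('/sw.js')` and subsequent navigations only need to fetch the fresh content over the network.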

Move Fast & Don’t Break Things | Filament Group, Inc.

This is the transcript of a brilliant presentation by Scott—read the whole thing! It starts with a much-needed history lesson that gets to where we are now with the dismal state of performance on the web, and then gives a whole truckload of handy tips and tricks for improving performance when it comes to styles, scripts, images, fonts, and just about everything on the front end.

Essential!

Six Web Performance Technologies to Watch in 2020 – Simon Hearne

The inexorable rise of frameworks such as Angular, React, Vue and their many cousins has been led by an assumption that managing state in the browser is quicker than a request to a server. This assumption, I can only assume, is made by developers who have flagship mobile devices or primarily work on desktop devices.

Chromium Blog: Moving towards a faster web

It’s nice to see that the Chrome browser will add interface enhancements to show whether you can expect a site to load fast or slowly.

Just a shame that the Google search team aren’t doing this kind of badging …unless you’ve given up on your website and decided to use Google AMP instead.

Maybe the Chrome team can figure out what the AMP team are doing to get such preferential treatment from the search team.

Location, Privilege and Performant Websites

Testing on a <$100 Android device on a 3G network should be an integral part of testing your website. Not everyone is on a brand-new device or upgrades often, especially with the price point of high-end phones these days.

When we design and build our websites with the outliers in mind, whether it’s for performance or even user experience, we build an experience that can be easy for all to access and use — and that’s what the web is about, access and information for all.
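
Nothing beats testing on an actual cheap phone over an actual patchy network, but you can get a rough approximation into an automated check by throttling the CPU and network through the Chrome DevTools Protocol. A sketch with Puppeteer; the latency and throughput numbers are my own stand-ins for a slow 3G connection, not an official profile:

```typescript
// throttle-check.ts: load a page with emulated slow hardware and network.
import puppeteer from 'puppeteer';

async function run(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  const client = await page.target().createCDPSession();

  // Roughly emulate a low-end device by slowing the CPU down 4x...
  await client.send('Emulation.setCPUThrottlingRate', { rate: 4 });

  // ...and constrain the network to 3G-ish latency and bandwidth.
  await client.send('Network.emulateNetworkConditions', {
    offline: false,
    latency: 400, // round-trip time in milliseconds
    downloadThroughput: (400 * 1024) / 8, // ~400 kbit/s, in bytes per second
    uploadThroughput: (400 * 1024) / 8,
  });

  const start = Date.now();
  await page.goto(url, { waitUntil: 'load' });
  console.log(`${url} loaded in ${Date.now() - start} ms under throttling`);

  await browser.close();
}

run('https://example.com').catch(console.error);
```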

Why Progressive Web Apps Are The Future of Mobile Web [2019 Research]

PWAs just work better than your typical mobile site. Period.

But bear in mind:

Maybe simply because the “A” in PWA stands for “app,” too much discussion around PWAs focuses on comparing and contrasting to native mobile applications. We believe this comparison (and the accompanying discussion) is misguided.

5G Will Definitely Make the Web Slower, Maybe | Filament Group, Inc.

The Jevons Paradox in action:

Faster networks should fix our performance problems, but so far, they have had an interesting if unintentional impact on the web. This is because historically, faster network speed has enabled developers to deliver more code to users—in particular, more JavaScript code.

And because it’s JavaScript we’re talking about:

Even if folks are on a new fast network, they’re very likely choking on the code we’re sending, rendering the potential speed improvements of 5G moot.

The longer I spend in this field, the more convinced I am that web performance is not a technical problem; it’s a people problem.

“Never-Slow Mode” (a.k.a. “Slightly-Fast Mode”) Explained

I would very much like this to become a reality.

Never-Slow Mode (“NSM”) is a mode that sites can opt-into via HTTP header. For these sites, the browser imposes per-interaction resource limits, giving users a better user experience, potentially at the cost of extra developer work. We believe users are happier and more engaged on fast sites, and NSM attempts to make it easier for sites to guarantee speed to users. In addition to user experience benefits, sites might want to opt in because browsers could provide UI to users to indicate they are in “fast mode” (a TLS lock icon but for speed).

Accessibility and web performance are not features, they’re the baseline | CSS-Tricks

Performance and accessibility aren’t features that can linger at the bottom of a Jira board to be considered later when it’s convenient.

Instead we must start to see inaccessible and slow websites for what they are: a form of cruelty. And if we want to build a web that is truly a World Wide Web, a place for all and everyone, a web that is accessible and fast for as many people as possible, and one that will outlive us all, then first we must make our websites something else altogether; we must make them kind.

Time to First Byte: What It Is and Why It Matters by Harry Roberts

Harry takes a deep dive into the performance metric of “time to first byte”, or TTFB if you’re using initialisms that take as long to say as the thing they’re abbreviating.

This makes a great companion piece to Drew’s article on server timing headers.
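
If you want to eyeball your own TTFB right now, the Navigation Timing API exposes it in the browser. A small TypeScript sketch, with the caveat that definitions vary: this takes the time from the start of navigation to the first byte of the response, and the Server-Timing breakdown only appears if the server actually sends that header, which is exactly what Drew’s article covers:

```typescript
// Log this page's time to first byte using the Navigation Timing API.
const [nav] = performance.getEntriesByType(
  'navigation'
) as PerformanceNavigationTiming[];

// Milliseconds from the start of navigation to the first response byte.
console.log(`TTFB: ${Math.round(nav.responseStart)} ms`);

// Subtracting requestStart isolates the server's share from
// redirect, DNS, TCP, and TLS time.
console.log(`Server share: ${Math.round(nav.responseStart - nav.requestStart)} ms`);

// Entries from a Server-Timing response header, if the server sent one.
for (const { name, duration, description } of nav.serverTiming) {
  console.log(`${name}: ${duration} ms ${description}`);
}
```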

Fast Software, the Best Software — by Craig Mod

Fast software is not always good software, but slow software is rarely able to rise to greatness. Fast software gives the user a chance to “meld” with its toolset. That is, not break flow.