Tags: speed


AddyOsmani.com - Start Performance Budgeting

Great ideas from Addy on where to start with creating a performance budget that can act as a red line you don’t want to cross.

If it’s worth getting fast, it’s worth staying fast.
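To make that red line concrete: one rough way to keep an eye on a byte budget in the browser is the Resource Timing API. This is just a sketch of the idea, not something from Addy’s article, and the 300 KB figure is an arbitrary placeholder.

```
// Sketch: warn when the bytes transferred for a page's subresources exceed a budget.
// The 300 KB budget is a placeholder, not a recommendation.
const BUDGET_BYTES = 300 * 1024;

window.addEventListener('load', () => {
  const resources = performance.getEntriesByType('resource');
  const totalBytes = resources.reduce(
    (sum, entry) => sum + (entry.transferSize || 0),
    0
  );

  if (totalBytes > BUDGET_BYTES) {
    console.warn(
      `Budget exceeded: ${Math.round(totalBytes / 1024)} KB transferred ` +
      `(budget: ${BUDGET_BYTES / 1024} KB)`
    );
  }
});
```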

The Hurricane Web | Max Böck - Frontend Web Developer

When a storm comes, some of the big news sites like CNN and NPR strip down to a zippy, performant, text-only version that delivers the content without the bells and whistles.

I’d argue though that in some aspects, they are actually better than the original.

The numbers:

The “full” NPR site in comparison takes ~114 requests and weighs close to 3MB on average. Time to first paint is around 20 seconds on slow connections. It includes ads, analytics, tracking scripts and social media widgets.

Meanwhile, the actual news content is roughly the same.

I quite like the idea of storm-driven development.

…websites built for a storm do not rely on Javascript. The benefit simply does not outweigh the cost. They rely on resilient HTML, because that’s all that is really necessary here.

DasSur.ma – The 9am rush hour

Why JavaScript is a real performance bottleneck:

Rush hour. The worst time of day to travel. For many it’s not possible to travel at any other time of day because they need to get to work by 9am.

This is exactly what a lot of web code looks like today: everything runs on a single thread, the main thread, and the traffic is bad. In fact, it’s even more extreme than that: there’s one lane all the way from the city center to the outskirts, and quite literally everyone is on the road, even if they don’t need to be at the office by 9am.
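The way out of that traffic jam is to move the heavy lifting off the main thread. Here’s a minimal sketch using a Web Worker, with the worker inlined via a Blob purely to keep the example self-contained; the computation is a stand-in, not anything from Surma’s post.

```
// Move a stand-in heavy computation off the main thread so the UI stays responsive.
const workerSource = `
  self.addEventListener('message', (event) => {
    // Pretend this is expensive: sum of squares of the numbers sent over.
    const result = event.data.reduce((sum, n) => sum + n * n, 0);
    self.postMessage(result);
  });
`;

const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: 'text/javascript' }))
);

worker.addEventListener('message', (event) => {
  console.log('Computed off the main thread:', event.data);
});

worker.postMessage([1, 2, 3, 4, 5]);
```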

Thoughts on Offline-first | Trys Mudford

Service Workers have such huge potential power, and I feel like we (developers on the web) have barely scratched the surface with what’s possible.

Needless to say, I couldn’t agree more!

Trys is thinking through some of the implications of service workers, like how we refresh stale content, and how we deal with slow networks—something that’s actually more of a challenge than dealing with no network connection at all.
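One common way to cope with a slow network in a service worker is to give the network a fixed amount of time and then fall back to whatever is in the cache. A rough sketch (not Trys’s code, and the three-second cut-off is an arbitrary example):

```
// Race the network against a timeout; if the network is too slow (or down),
// fall back to a cached copy. With no cached copy, the request fails as usual.
const NETWORK_TIMEOUT_MS = 3000;

self.addEventListener('fetch', (event) => {
  event.respondWith(
    Promise.race([
      fetch(event.request),
      new Promise((resolve, reject) =>
        setTimeout(() => reject(new Error('network timeout')), NETWORK_TIMEOUT_MS)
      ),
    ]).catch(() => caches.match(event.request))
  );
});
```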

There’s some good food for thought here.

I’m so excited to see how we can use Service Workers to improve the web.

How to Build a Low-tech Website?

This is fascinating! A website that’s fast and nimble, not for performance reasons, but to reduce energy consumption. It’s using static files, system fonts and dithered images. And no third-party scripts.

Thanks to a low-tech web design, we managed to decrease the average page size of the blog by a factor of five compared to the old design – all while making the website visually more attractive (and mobile-friendly). Secondly, our new website runs 100% on solar power, not just in words, but in reality: it has its own energy storage and will go off-line during longer periods of cloudy weather.

Ping! That’s the sound of my brain going “service worker!”

I’ve sent them an email offering my help.

Breaking the Deadlock Between User Experience and Developer Experience · An A List Apart Article

Yes! Yes! Yes!

Our efforts to measure and improve UX are packed with tragically ironic attempts to love our users: we try to find ways to improve our app experiences by bloating them with analytics, split testing, behavioral analysis, and Net Promoter Score popovers. We stack plugins on top of third-party libraries on top of frameworks in the name of making websites “better”—whether it’s something misguided, like adding a carousel to appease some executive’s burning desire to get everything “above the fold,” or something truly intended to help people, like a support chat overlay. Often the net result is a slower page load, a frustrating experience, and/or (usually “and”) a ton of extra code and assets transferred to the browser.

Even tools that are supposed to help measure performance in order to make improvements—like, say, Real User Monitoring—require you to add a script to your web pages …thereby increasing the file size and degrading performance! It’s ironic, in that Alanis Morissette sense of not understanding what irony is.

Stacking tools upon tools may solve our problems, but it’s creating a Jenga tower of problems for our users.

This is a great article about evaluating technology.

The “Developer Experience” Bait-and-Switch | Infrequently Noted

JavaScript is the web’s CO2. We need some of it, but too much puts the entire ecosystem at risk. Those who emit the most are furthest from suffering the consequences — until the ecosystem collapses. The web will not succeed in the markets and form-factors where computing is headed unless we get JS emissions under control.

Damn, that’s a fine opening! And the rest of this post by Alex is pretty darn great too. He’s absolutely right in calling out the fetishisation of developer experience at the expense of user needs:

The swap is executed by implying that by making things better for developers, users will eventually benefit equivalently. The unstated agreement is that developers share all of the same goals with the same intensity as end users and even managers. This is not true.

I have a feeling that this will be a very bitter pill for many developers to swallow:

If one views the web as a way to address a fixed market of existing, wealthy web users, then it’s reasonable to bias towards richness and lower production costs. If, on the other hand, our primary challenge is in growing the web along with the growth of computing overall, the ability to reasonably access content bumps up in priority. If you believe the web’s future to be at risk due to the unusability of most web experiences for most users, then discussion of developer comfort that isn’t tied to demonstrable gains for marginalized users is at best misguided.

Oh, captain, my captain!

Tools that cost the poorest users to pay wealthy developers are bunk.

Chrome’s NOSCRIPT Intervention - TimKadlec.com

Testing time with Tim.

Long story short, the NOSCRIPT intervention looks like a really great feature for users. More often than not it provides significant reduction in data usage, not to mention the reduction in CPU time—no small thing for the many, many people running affordable, low-powered devices.

Unit deconverter - make your units less useful!

Take a perfectly useful standardised measurement of length, weight, speed or time, and convert to something far less useful (but much more fun).

The Font Loading Checklist—zachleat.com

This checklist came in very handy during a performance-related workshop I was running today (I may have said the sentence “Always ask yourself What Would Zach Do?”).

  1. Start Important Font Downloads Earlier (Start a Web Font load)
  2. Prioritize Readable Text (Behavior while a Web Font is loading)
  3. Make Fonts Smaller (Reduce Web Font load time)
  4. Reduce Movement during Page Load (Behavior after a Web Font has loaded)

The first two are really straightforward to implement (with rel="preload" and font-display). The second two take more work (with subsetting and the font loading API).
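For that last item, the font loading API part might look something like this sketch (the font name, file path, and class name are placeholders, not Zach’s):

```
// Load a (subset) web font explicitly, and only switch to it once it's ready,
// so a late-arriving font doesn't make the text jump around.
const bodyFont = new FontFace(
  'Example Serif',
  'url(/fonts/example-serif-subset.woff2)',
  { display: 'swap' }
);

bodyFont
  .load()
  .then((loadedFont) => {
    document.fonts.add(loadedFont);
    // CSS can key off this class to apply the web font only once it has loaded.
    document.documentElement.classList.add('fonts-loaded');
  })
  .catch(() => {
    // If the font fails or is too slow, the fallback font simply stays in place.
  });
```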

Disable scripts for Data Saver users on slow connections - Chrome Platform Status

An excellent idea for people in low-bandwidth situations: automatically disable JavaScript. As long as the site is built with progressive enhancement, there’s no problem (and if not, the user is presented with the choice to enable scripts).

Power to the people!

On HTTPS and Hard Questions - TimKadlec.com

A great post by Tim following on from the post by Eric I linked to last week.

Is a secure site you can’t access better than an insecure one you can?

He rightly points out that security without performance is exclusionary.

…we’ve made a move to increase the security of the web by doing everything we can to get everything running over HTTPS. It’s undeniably a vital move to make. However this combination—poor performance but good security—now ends up making the web inaccessible to many.

Security. Performance. Accessibility. All three matter.

Stop building for San Francisco

Overwhelmingly, our software is built by well-paid teams with huge monitors and incredibly fast computers running on a high-bandwidth internet connection. We run MacBook Pros, we have cinema displays, we carry iPhones.

That’s not what the rest of the world looks like.

Securing Web Sites Made Them Less Accessible – Eric’s Archived Thoughts

This is a heartbreaking observation by Eric. He’s not anti-HTTPS by any stretch, but he is pointing out that caching servers become a thing of the past on a more secure web.

Can we do anything? For users of up-to-date browsers, yes: service workers create a “good” man in the middle that sidesteps the HTTPS problem, so far as I understand. So if you’re serving content over HTTPS, creating a service worker should be one of your top priorities right now, even if it’s just to do straightforward local caching and nothing fancier.
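If you’re starting from zero, that kind of straightforward local caching can be as small as this sketch (the cache name and asset paths are placeholders):

```
// Pre-cache a few core assets on install, then serve from the cache first
// and fall back to the network for everything else.
const CACHE_NAME = 'site-cache-v1';
const CORE_ASSETS = ['/', '/styles.css', '/script.js'];

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(CORE_ASSETS))
  );
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});
```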

The Web is Made of Edge Cases by Taylor Hunt on CodePen

Oh, this is magnificent! A rallying call for everyone designing and developing on the web to avoid making any assumptions about the people we’re building for:

People will use your site how they want, and according to their means. That is wonderful, and why the Web was built.

I would even say that the % of people viewing your site the way you do rapidly approaches zilch.

Dynamic resources using the Network Information API and service workers

Smart thinking—similar to this post from last year—about using the navigator.connection API from a service worker to serve up bandwidth-appropriate images.

This is giving me some ideas for my own site.
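Something along these lines, perhaps: a rough sketch, assuming the browser exposes navigator.connection inside the worker and that a low-resolution variant of each image exists under a made-up “-low” naming scheme.

```
// On slow connections (or with Save-Data on), swap image requests for a
// smaller variant; fall back to the original if the variant doesn't exist.
self.addEventListener('fetch', (event) => {
  const connection = self.navigator.connection;
  const isSlow = Boolean(
    connection &&
    (connection.saveData || /(^|-)2g$/.test(connection.effectiveType || ''))
  );

  if (isSlow && event.request.destination === 'image') {
    const lowResUrl = event.request.url.replace(/\.jpg$/, '-low.jpg');
    event.respondWith(fetch(lowResUrl).catch(() => fetch(event.request)));
  }
  // All other requests fall through to the network as normal.
});
```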

The Cost Of JavaScript In 2018 – Addy Osmani – Medium

Addy takes a deep, deep dive into JavaScript performance on mobile, and publishes his findings on Ev’s blog, including this cra-yay-zy suggestion:

Maybe server-side-rendered HTML would actually be faster. Consider limiting the use of client-side frameworks to pages that absolutely require them.

The Bullshit Web — Pixel Envy

There is a cumulative effect of bullshit; its depth and breadth is especially profound. In isolation, the few seconds that it takes to load some extra piece of surveillance JavaScript isn’t much. Neither is the time it takes for a user to hide an email subscription box, or pause an autoplaying video. But these actions compound on a single webpage, and then again across multiple websites, and those seemingly-small time increments become a swirling miasma of frustration and pain.

I agree completely. And AMP is not the answer:

Given the assumption that any additional bandwidth offered to web developers will immediately be consumed, there seems to be just one possible solution, which is to reduce the amount of bytes that are transmitted. For some bizarre reason, this hasn’t happened on the main web, because it somehow makes more sense to create an exact copy of every page on their site that is expressly designed for speed. Welcome back, WAP — except, for some reason, this mobile-centric copy is entirely dependent on yet more bytes. This is the dumbfoundingly dumb premise of AMP.

Prioritizing the Long-Tail of Performance - TimKadlec.com

Focusing on the median or average is the equivalent of walking around with a pair of blinders on. We’re limiting our perspective and, in the process, missing out on so much crucial detail. By definition, when we make improving the median or average our goal, we are focusing on optimizing for only about half of our sessions.

Tim does the numbers…

By honing in on the 90th—or 95th or similar—we ensure those weaknesses don’t get ignored. Our goal is to optimize the performance of our site for the majority of our users—not just a small subset of them.
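To see the difference in a toy example (the load times below are invented):

```
// Nearest-rank percentile over a set of page load times (in milliseconds).
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
}

const loadTimesMs = [900, 1100, 1200, 1300, 1500, 1800, 2400, 3900, 6200, 11000];
console.log(percentile(loadTimesMs, 50)); // 1500: the "typical" session looks fine
console.log(percentile(loadTimesMs, 95)); // 11000: the sessions the median hides
```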

Fixing these webs - daverupert.com

I’m a fan of fast websites. Your website needs to be fast. Our collective excuses, hand-wringing, and inability to come to terms with the problem-set (There is too much script) and solutions (Use less script) of modern web development is getting tired.

I agree with every word of this.

Sadly, I think the one company with a browser that has marketshare dominance and could exert the kind of pressure required to stop ad tracking and surveillance capitalism is not incentivized to do so.

So the problem is approached from the other end. Blame is piled on authors for slow first-party code. We’re told to use certain mobile publishing frameworks that syndicate to proprietary CDNs to appease the gods of luck and fortune.