Tags: speed

Wednesday, February 14th, 2018

AMP: the missing controversy – Ferdy Christant

AMP pages aren’t fast because of the AMP format. AMP pages are fast when you visit one via Google search …because of Google’s monopoly on preloading:

Technically, a clever trick. It’s hard to argue with that. Yet I consider it cheating and anti-competitive behavior.

Preloading is exclusive to AMP. Google does not preload non-AMP pages. If Google would have a genuine interest in speeding up the whole web on mobile, it could simply preload resources of non-AMP pages as well. Not doing this is a strong hint that another agenda is at work, to say the least.

Sunday, February 11th, 2018

Seva Zaikov - Single Page Application Is Not a Silver Bullet

Harsh (but fair) assessment of the performance costs of doing everything on the client side.

Friday, January 19th, 2018

Heisenberg

I wrote about Google Analytics yesterday. As usual, I syndicated the post to Ev’s blog, and I got an interesting response over there. Kelly Burgett set me straight on some of the finer details of how goals work, and finished with this thought:

You mention “delivering a performant, accessible, responsive, scalable website isn’t enough” as if it should be, and I have to disagree. It’s not enough for a business to simply have a great website if you are unable to understand performance of channel marketing, track user demographics and behavior on-site, and optimize your site/brand based on that data. I’ve seen a lot of ugly sites who have done exceptionally well in terms of ROI, simply because they are getting the data they need from the site in order to make better business decisions. If your site cannot do that (i.e. through data collection, often third party scripts), then your beautifully-designed site can only take you so far.

That makes an excellent case for having analytics. But that’s not necessarily the same as having Google Analytics, or even JavaScript-driven analytics at all.

By far the most useful information you get from analytics is where people have come from, where they went next, and what kind of device they were using. None of that information requires JavaScript. It’s all available from your server logs.
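
As an illustration (the IP address, path, referrer, and user agent here are made up), a single request logged in the widely used “combined” log format already records the referrer and the user agent, with no JavaScript involved:

    203.0.113.7 - - [14/Feb/2018:09:21:07 +0000] "GET /journal/13434 HTTP/1.1" 200 5123 "https://www.example.com/referring-page" "Mozilla/5.0 (iPhone; CPU iPhone OS 11_2 like Mac OS X) Mobile Safari/604.1"

The referrer field tells you where the visitor came from, the request lines tell you where they went, and the user agent string tells you roughly what kind of device they were using.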

I don’t want to come across all old-man-yell-at-cloud here, but I’m trying to remember at what point self-hosted software for analysing your log traffic became not good enough.

Here’s the thing: logging on the server has no effect on the user experience. It’s basically free, in terms of performance. Logging via JavaScript, by its very nature, has some cost. Even if it’s negligible, that’s one more request, and one more bit of processing for the CPU.

All of the data that you can only get via JavaScript (in-page actions, heat maps, etc.) are, in my experience, better handled by dedicated software. To me, that kind of more precise data feels different to analytics in the sense of funnels, conversions, goals and all that stuff.

So in order to get more fine-grained data to analyse, our analytics software has now doubled down on a technology—JavaScript—that has an impact on the end user, where previously the act of observation could be done at a distance.

There are also blind spots that come with JavaScript-based tracking. According to Google Analytics, 0% of your customers don’t have JavaScript. That’s not necessarily true, but there’s literally no way for Google Analytics—which relies on JavaScript—to even do its job in the absence of JavaScript. That can lead to a dangerous situation: you might think that 100% of your potential customers are getting by, when actually a proportion might be struggling, but you’ll never find out about it.

Related: according to Google Analytics, 0% of your customers are using ad-blockers that block requests to Google’s servers. Again, that’s not necessarily a true fact.

So I completely agree that analytics are a good thing to have for your business. But it does not follow that Google Analytics is a good thing for your business. Other options are available.

I feel like the assumption that “analytics = Google Analytics” is like the slippery slope in reverse. If we’re all agreed that analytics are important, then aren’t we also all agreed that JavaScript-based tracking is important?

In a word, no.

This reminds me of the arguments made in favour of intrusive, bloated advertising scripts. All of the arguments focus on the need for advertising—to stay in business, to pay the writers—which are all great reasons for advertising, but have nothing to do with JavaScript, which is at the root of the problem. Everyone I know who uses an ad-blocker—including me—doesn’t use it to stop seeing adverts, but to stop the performance of the page being degraded (and to avoid being tracked across domains).

So let’s not confuse the means with the ends. If you need to have advertising, that doesn’t mean you need to have horribly bloated JavaScript-based advertising. If you need analytics, that doesn’t mean you need an analytics script on your front end.

Thursday, January 18th, 2018

Analysing analytics

Hell is other people’s JavaScript.

There’s nothing quite so crushing as building a beautifully performant website only to have it infested with a plague of third-party scripts that add to the weight of each page and reduce the responsiveness, making a mockery of your well-considered performance budget.

Trent has been writing about this:

My latest realization is that delivering a performant, accessible, responsive, scalable website isn’t enough: I also need to consider the impact of third-party scripts.

He’s started the process by itemising third-party scripts. Frustratingly though, there’s rarely one single culprit that you can point to—it’s the cumulative effect of “just one more beacon” and “just one more analytics script” and “just one more A/B testing tool” that adds up to a crappy experience that warms your user’s hands by ensuring your site is constantly draining their battery.

Actually, having just said that there’s rarely one single culprit, Adobe Tag Manager is often at the root of third-party problems. That and adverts. It’s like opening the door of your beautifully curated dream home, and inviting a pack of diarrhetic elephants in: “Please, crap wherever you like.”

But even the more well-behaved third-party scripts can get out of hand. Google Analytics is so ubiquitous that it’s hardly even considered in the list of potentially harmful third-party scripts. On the whole, it’s a fairly well-behaved citizen of your site’s population of third-party scripts (y’know, leaving aside the whole surveillance capitalism business model that allows you to use such a useful tool for free in exchange for Google tracking your site’s visitors across the web and selling the insights from that data to advertisers).

The initial analytics script that you—asynchronously—load into your page isn’t very big. But depending on how you’ve configured your Google Analytics account, that might just be the start of a longer chain of downloads and event handlers.

Ed recently gave a lunchtime presentation at Clearleft on using Google Analytics—he professes modesty but he really knows his stuff. He was making sure that everyone knew how to set up goals’n’stuff.

As I understand it, there are two main categories of goals: events and destinations (there are also durations and pages, but they feel similar to destinations). You use events to answer questions like “Did the user click on this button?” or “Did the user click on that search field?”. You use destinations to answer questions like “Did the user arrive at this page?” or “Did the user come from that page?”

You can add as many goals to your site’s analytics as you want. That’s an intoxicating offer. The problem is that there is potentially a cost for each goal you create. It’s an invisible cost. It’s paid by the user in the currency of JavaScript sent down the wire (I wish that the Google Analytics admin interface were more like the old interface for Google Fonts, where each extra file you added literally pushed a needle higher on a dial).

It strikes me that the event-based goals would necessarily require more JavaScript in order to listen out for those clicks and fire off that information. The destination-based goals should be able to get all the information needed from regular page navigations.
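
To make that concrete, here’s a minimal sketch using the analytics.js ga() command queue, assuming the standard snippet is already on the page (the element ID and event names are hypothetical): a destination-based goal is satisfied by the pageview hit that’s already being sent, whereas an event-based goal needs extra JavaScript wired up to the interaction.

    // Destination-based goal: no extra code on the page.
    // The pageview hit that analytics.js sends anyway is enough;
    // the goal itself is configured in the Google Analytics admin.
    ga('send', 'pageview');

    // Event-based goal: extra JavaScript has to listen for the click
    // and fire off a hit (element ID and event names are hypothetical).
    document.getElementById('signup-button').addEventListener('click', function () {
      ga('send', 'event', 'CTA', 'click', 'signup');
    });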

So I have a hypothesis. I think that destination-based goals are less harmful to performance than event-based goals. I might well be wrong about that, and if I am, please let me know.

With that hypothesis in mind, and until I learn otherwise, I’ve got two rules of thumb to offer when it comes to using Google Analytics:

  1. Try to keep the number of goals to a minimum.
  2. If you must create a goal, favour destinations over events.

Saturday, January 13th, 2018

The Human Computer’s Dreams Of The Future by Ida Rhodes (PDF)

From the proceedings of the Electronic Computer Symposium in 1952, the remarkable Ida Rhodes describes a vision of the future…

My crystal ball reveals Mrs. Mary Jones in the living room of her home, most of the walls doubling as screens for projected art or information. She has just dialed her visiophone. On the wall panel facing her, the full colored image of a rare orchid fades, to be replaced by the figure of Mr. Brown seated at his desk. Mrs. Jones states her business: she wishes her valuable collection of orchid plants insured. Mr. Brown consults a small code book and dials a string of figures. A green light appears on his wall. He asks Mrs. Jones a few pertinent questions and types out her replies. He then pushes the start button. Mr. Brown fades from view. Instead, Mrs. Jones has now in front of her a set of figures relating to the policy in which she is interested. The premium rate and benefits are acceptable and she agrees to take out the policy. Here is Brown again. From a pocket in his wall emerges a sealed, addressed, and postage-metered envelope which drops into the mailing chute. It contains, says Brown, an application form completely filled out by the automatic computer and ready for her signature.

Monday, November 27th, 2017

Network based image loading using the Network Information API in Service Worker | justmarkup

This is clever—you can use the navigator.connection API from a service worker (because it’s asynchronous) which means you can have a service worker script that serves differently sized images based on bandwidth.
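
Something along these lines, as a rough sketch (the effectiveType check, the thresholds, and the “-small” file-naming convention are my assumptions, not necessarily what the linked article does):

    // In the service worker: navigator.connection is available here too.
    self.addEventListener('fetch', function (event) {
      var request = event.request;
      var connection = navigator.connection;
      var slow = connection &&
        (connection.effectiveType === 'slow-2g' || connection.effectiveType === '2g');
      if (request.destination === 'image' && slow) {
        // Hypothetical convention: a smaller variant sits alongside each image,
        // e.g. photo.jpg -> photo-small.jpg
        var smallUrl = request.url.replace(/\.(jpg|png)(\?.*)?$/, '-small.$1$2');
        event.respondWith(
          fetch(smallUrl).catch(function () {
            // Fall back to the original image if the small variant is missing.
            return fetch(request);
          })
        );
      }
    });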

The Fallacies of Distributed Computing (Applied to Front-End Performance) – CSS Wizardry – CSS Architecture, Web Performance Optimisation, and more, by Harry Roberts

Harry cautions against making assumptions about the network when it comes to front-end development:

Yet time and time again I see developers falling into the same old traps—making assumptions or overly-optimistic predictions about the conditions in which their apps will run.

Planning for the worst-case scenario is never a wasted effort:

If you build and structure applications such that they survive adverse conditions, then they will thrive in favourable ones.

Saturday, November 18th, 2017

Using SVG as placeholders — More Image Loading Techniques - JMPerez Blog

Here’s a clever technique to improve the perceived performance of image loading with a polygonal SVG placeholder.
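
The gist of the technique, sketched roughly here (the class names, data-src convention, and polygon values are all made up): inline a tiny, low-polygon SVG where the image will go, then swap in the real image once it has downloaded.

    <!-- Tiny polygonal SVG stands in for the real image -->
    <div class="placeholder">
      <svg viewBox="0 0 300 200" preserveAspectRatio="none">
        <polygon points="0,0 300,0 120,200" fill="#b4a078"/>
        <polygon points="0,200 300,40 300,200" fill="#3a5068"/>
      </svg>
      <img data-src="photo.jpg" alt="">
    </div>

    <script>
      // Swap in the real image once it has finished downloading.
      document.querySelectorAll('.placeholder img[data-src]').forEach(function (img) {
        var full = new Image();
        full.onload = function () {
          img.src = full.src;
        };
        full.src = img.dataset.src;
      });
    </script>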

Sunday, October 29th, 2017

Can You Afford It?: Real-world Web Performance Budgets – Infrequently Noted

Alex looks at the mindset and approaches you need to adopt to make a performant site. There’s some great advice in here for setting performance budgets for JavaScript.

JavaScript is the single most expensive part of any page in ways that are a function of both network capacity and device speed. For developers and decision makers with fast phones on fast networks this is a double-whammy of hidden costs.

Monday, October 9th, 2017

“async” attribute on img, and corresponding “ready” event · Issue #1920 · whatwg/html

It looks like the async attribute is going to ship in Chrome for img elements:

This attribute would have two states:

  • “on”: This indicates that the developer prefers responsiveness and performance over atomic presentation of content.
  • “off”: This indicates that the developer prefers atomic presentation of content over responsiveness.

Monday, October 2nd, 2017

eBay’s Font Loading Strategy | eBay Tech Blog

Here’s the flow that eBay use for font loading. They’ve decided that on the very first page view, seeing a system font is an acceptable trade-off. I think that makes sense for their situation.

Interestingly, they set a flag for subsequent visits using localStorage rather than a cookie. I wonder why that is? For me, the ability to read cookies on the server as well as the client makes them quite handy for situations like this.
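
The broad pattern looks something like this (a sketch using the CSS Font Loading API; the storage key, class name, and font name are hypothetical, and not necessarily how eBay actually implement it):

    // On repeat visits the flag tells us the font is almost certainly cached,
    // so apply it straight away instead of showing the system font first.
    if (localStorage.getItem('fonts-loaded')) {
      document.documentElement.classList.add('fonts-loaded');
    } else {
      // First visit: leave the system font in place, fetch the web font in
      // the background, and set the flag for next time.
      document.fonts.load('1em "My Web Font"').then(function () {
        localStorage.setItem('fonts-loaded', 'true');
      });
    }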

Monday, September 25th, 2017

Network Information API

It looks like this is landing in Chrome. The navigator.connection.type property will allow us to progressively enhance based on connection type:

A web application that makes use of a service worker to cache resources during installation might have different bundles of assets that it might cache: a list of crucial assets that are cached unconditionally, and a bundle of larger, optional assets that are only cached ahead of time when navigator.connection.type is 'ethernet' or 'wifi'.

There are potential security issues around fingerprinting that are addressed in this document.
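
A minimal sketch of that install-time idea (the cache name and asset lists are hypothetical):

    var CRUCIAL = ['/', '/styles.css', '/scripts.js'];
    var OPTIONAL = ['/videos/intro.mp4', '/images/gallery-large.jpg'];

    self.addEventListener('install', function (event) {
      event.waitUntil(
        caches.open('static-v1').then(function (cache) {
          // Only precache the larger, optional assets on fast connections.
          var type = navigator.connection && navigator.connection.type;
          var assets = (type === 'ethernet' || type === 'wifi')
            ? CRUCIAL.concat(OPTIONAL)
            : CRUCIAL;
          return cache.addAll(assets);
        })
      );
    });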

Why it’s tricky to measure Server-side Rendering performance

A good analysis, but my takeaway was that the article could equally be called Why it’s tricky to measure Client-side Rendering performance. In a nutshell, just looking at metrics can be misleading.

Pre-classified metrics are a good signal for measuring performance. At the end of the day though, they may not properly reflect your site’s performance story. Profile each possibility and give it the eye test.

And it’s always worth bearing this in mind:

The best way to prioritize content is by building a static site. Ask yourself if the content needs JavaScript.

Tuesday, August 22nd, 2017

Inside a super fast CSS engine: Quantum CSS (aka Stylo) ★ Mozilla Hacks – the Web developer blog

Lin gives a deep dive into Firefox’s new CSS engine specifically, but this is also an excellent primer on how browsers handle CSS in general: parsing, styling, layout, painting, compositing, and rendering.

Thursday, August 3rd, 2017

The Critical Request | CSS-Tricks

Ben takes us on a journey inside the mind of a browser (Chrome in this case). It’s all about priorities when it comes to the critical path.

Friday, July 14th, 2017

Focusing on What Matters at Fluent, 2017 - YouTube

A great short talk by Tim. It’s about performance, but so much more too.

Tuesday, July 11th, 2017

Designed lines. — Ethan Marcotte

We’re building on a web littered with too-heavy sites, on an internet that’s unevenly, unequally distributed. That’s why designing a lightweight, inexpensive digital experience is a form of kindness. And while that kindness might seem like a small thing these days, it’s a critical one.

Monday, July 3rd, 2017

Fidget Spinners — Real Life

A look at our relationship with waiting, and how that is manifested in the loading icons in our interfaces.

For me, in my moments of boredom, as I turn to my phone and refresh my social media feed, I imagine that what’s on the other side of the buffering icon might be the content that will rid me of boredom and produce a satisfying social connection. The buffering icon here represents my hopes for the many ways that my social media feeds can satisfy my longings at any given moment. They rarely do, though I believe that we are half in love with the buffering icon here because it represents the promise of intimacy or excitement across the distances that separate us.

Wednesday, June 7th, 2017

A day without Javascript

Charlie conducts an experiment by living without JavaScript for a day.

So how was it? Well, with just a few minutes of sans-javascript life under my belt, my first impression was “Holy shit, things are fast without javascript”. There’s no ads. There’s no video loading at random times. There’s no sudden interrupts by “DO YOU WANT TO FUCKING SUBSCRIBE?” modals.

As you might expect, lots of sites just don’t work, but there are plenty of sites that work just fine—Google search, Amazon, Wikipedia, BBC News, The New York Times. Not bad!

This has made me appreciate the number of large sites that make the effort to build robust sites that work for everybody. But even on those sites that are progressively enhanced, it’s a sad indictment of things that they can be so slow on the multi-core hyperpowerful Mac that I use every day, but immediately become fast when JavaScript is disabled.

Tuesday, May 30th, 2017

Daring Fireball: Scott Gilbertson: ‘Kill Google AMP Before It Kills the Web’

If you are a publisher and your web pages don’t load fast, the sane solution is to fix your fucking website so that pages load fast, not to throw your hands up in the air and implement AMP.

Pretty strong meat there from Gruber.

(I’m not going to link through to the Register article though—that rag does not deserve our attention.)