Tags: performance

Monday, January 13th, 2020

Smaller HTML Payloads with Service Workers — Philip Walton

This is a great progressive enhancement for performance that uses a service worker to combine reusable bits of a page with fresh content. The numbers are very convincing!

Alas, the code uses the Workbox library, but figuring out the vanilla equivalent shouldn’t be too tricky, seeing as Philip talks through his logic step by step.
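
For a sense of what that vanilla version might look like, here’s a minimal sketch of the technique. The cache name, shell URL, and `?content-only` query parameter are illustrative assumptions, not Philip’s actual implementation:

```js
// sw.js — a hedged, minimal sketch of the shell-plus-content idea
// without Workbox. Assumes '/shell-start.html' (the shared page
// header) was added to the cache at install time, and that a
// hypothetical `?content-only` query makes the server return just
// the page-specific markup.
self.addEventListener('fetch', (event) => {
  if (event.request.mode !== 'navigate') return;

  event.respondWith((async () => {
    const cache = await caches.open('shell-v1');
    const shell = await cache.match('/shell-start.html');
    // No cached shell yet? Fall back to a normal network response.
    if (!shell) return fetch(event.request);

    const contentURL = new URL(event.request.url);
    contentURL.searchParams.set('content-only', 'true');
    const contentPromise = fetch(contentURL);

    // Stream the cached shell first, then the fresh content, so the
    // browser can start rendering the header straight away.
    const { readable, writable } = new TransformStream();
    (async () => {
      await shell.body.pipeTo(writable, { preventClose: true });
      const content = await contentPromise;
      await content.body.pipeTo(writable);
    })();

    return new Response(readable, {
      headers: { 'Content-Type': 'text/html' },
    });
  })());
});
```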

Friday, January 10th, 2020

Performance Budgets, Pragmatically – CSS Wizardry

Smart advice from Harry on setting performance budgets:

They shouldn’t be aspirational, they should be preventative … my suggestion for setting a budget for any trackable metric is to take the worst data point in the past two weeks and use that as your limit

Monday, December 30th, 2019

How creating a Progressive Web App has made our website better for people and planet

Creating a PWA has saved a lot of kilobytes after the initial load by storing files on the device to reuse on subsequent requests – this in turn lowers the load time and carbon footprint on subsequent page views, making the website better for both people and planet. We’ve also enabled offline access, which significantly improves user experience for people in areas with patchy connections, such as mobile users on their commute.

Sunday, December 29th, 2019

Move Fast & Don’t Break Things | Filament Group, Inc.

This is the transcript of a brilliant presentation by Scott—read the whole thing! It starts with a much-needed history lesson that gets to where we are now with the dismal state of performance on the web, and then gives a whole truckload of handy tips and tricks for improving performance when it comes to styles, scripts, images, fonts, and just about everything on the front end.

Essential!

Sunday, December 22nd, 2019

This Page is Designed to Last: A Manifesto for Preserving Content on the Web

Geocities, LiveJournal, what.cd, now Yahoo Groups. One day, Medium, Twitter, and even hosting services like GitHub Pages will be plundered then discarded when they can no longer grow or cannot find a working business model.

Considering the needs of someone who wants to make and maintain a website, without the ridiculous complexity of “modern” web tooling:

How do we make web content that can last and be maintained for at least 10 years? As someone studying human-computer interaction, I naturally think of the stakeholders we aren’t supporting. Right now putting up web content is optimized for either the professional web developer (who use the latest frameworks and workflows) or the non-tech savvy user (who use a platform).

Monday, December 2nd, 2019

Six Web Performance Technologies to Watch in 2020 – Simon Hearne

The inexorable rise of frameworks such as Angular, React, Vue and their many cousins has been led by an assumption that managing state in the browser is quicker than a request to a server. This assumption, I can only assume, is made by developers who have flagship mobile devices or primarily work on desktop devices.

Friday, November 22nd, 2019

Fit on a Floppy

Here’s a nice little test for the file size of your web pages—could they fit on a floppy disk?

The Session fits comfortably and adactio.com just about scrapes by.
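
If you want a rough version of the same check in your own browser console, the Resource Timing API can total up transfer sizes. Treat the result as a lower bound—cross-origin resources report a size of zero unless they send a Timing-Allow-Origin header:

```js
// Rough page-weight check from the console: a 1.44 MB floppy
// holds 1440 KiB. transferSize is the compressed, over-the-wire size.
const nav = performance.getEntriesByType('navigation')[0];
const resources = performance.getEntriesByType('resource');
const total = resources.reduce((sum, e) => sum + e.transferSize, nav.transferSize);
console.log(`${(total / 1024).toFixed(1)} KiB of a 1440 KiB floppy`);
```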

Thursday, November 21st, 2019

Request with Intent: Caching Strategies in the Age of PWAs – A List Apart

Aaron outlines some sensible strategies for serving up images, including using the Cache API from your service worker script.
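
As one flavour of this, here’s a minimal cache-first sketch for image requests in a service worker—the cache name is an assumption, and Aaron’s article goes into far more nuance than this:

```js
// sw.js — a minimal cache-first strategy for images. The cache name
// is illustrative; real code would also want some expiry logic.
self.addEventListener('fetch', (event) => {
  if (event.request.destination !== 'image') return;

  event.respondWith((async () => {
    const cache = await caches.open('images-v1');
    const cached = await cache.match(event.request);
    if (cached) return cached;

    const response = await fetch(event.request);
    // Only cache complete, successful responses.
    if (response.ok) {
      await cache.put(event.request, response.clone());
    }
    return response;
  })());
});
```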

Tuesday, November 19th, 2019

paulirish/lite-youtube-embed: A faster youtube embed.

A very handy web component from Paul—this works exactly like a regular YouTube embed, but is much more performant.
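
Usage looks something like this—the file paths and video ID here are placeholders, so check the repo’s README for the real setup:

```html
<!-- Include the component’s script and styles (paths are illustrative). -->
<link rel="stylesheet" href="/lite-yt-embed.css">
<script src="/lite-yt-embed.js" defer></script>

<!-- Shows a static thumbnail; the heavyweight iframe only loads on click. -->
<lite-youtube videoid="dQw4w9WgXcQ"></lite-youtube>
```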

Sunday, November 17th, 2019

The Web We’ve Made

Let us not overlook the fact that a semantic HTML web site is inherently accessible by default. When we bend the web to our will, we break that. So we have a responsibility to correct it. Sure the new technologies are neat, but the end result is usually garbage. This all requires some next-level narcissism that our goals and priorities as developers are far more important than that of the audience we’re theoretically building software to serve.

A Web Developers New Working Week

I think these are great habit-forming ideas for any web designer or developer: a day without using your mouse; a day with your display set to grayscale; a day spent using a different web browser; a day with your internet connection throttled. I’m going to try these!

Responsible JavaScript: Part III – A List Apart

This chimes nicely with my recent post on third-party scripts. Here, Jeremy treats third-party JavaScript as technical debt and outlines some solutions for staying on top of it.

Convenience always has a price, and the web is wracked by our collective preference for it. JavaScript, in particular, is employed in a way that suggests a rapidly increasing tendency to outsource whatever it is that We (the first party) don’t want to do. At times, this is a necessary decision; it makes perfect financial and operational sense in many situations.

But make no mistake, third-party JavaScript is never cheap. It’s a devil’s bargain where vendors seduce you with solutions to your problem, yet conveniently fail to remind you that you have little to no control over the side effects that solution introduces.

Saturday, November 16th, 2019

What would happen if we allowed blocking 3rd-Party JavaScript as an option?

This would be a fascinating experiment to run in Firefox nightly! This is in response to that post I wrote about third-party scripts.

(It’s fascinating to see how different this response is to the responses from people working at Google.)

Tuesday, November 12th, 2019

Third party

The web turned 30 this year. When I was back at CERN to mark this anniversary, there was a lot of introspection and questioning the direction that the web has taken. Everyone I know that uses the web is in agreement that tracking and surveillance are out of control. It seems only right to question whether the web has lost its way.

But here’s the thing: the technologies that enable tracking and surveillance didn’t exist in the early years of the web—JavaScript and cookies.

Without cookies, the web was stateless. This was by design. Now, I totally understand why cookies—or something like cookies—were needed. Without some way of keeping track of state, there’s no good way for a website to “remember” what’s in your shopping cart, or whether you’ve authenticated yourself.

But why would cookies ever need to work across domains? Authentication, shopping carts and all that good stuff can happen on the same domain. Third-party cookies, on the other hand, seem custom made for tracking and frankly, not much else.
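
For what it’s worth, servers can now opt individual cookies out of cross-site requests with the SameSite attribute—a minimal example, where the cookie name and value are placeholders:

```http
Set-Cookie: sessionid=abc123; SameSite=Strict; Secure; HttpOnly
```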

Browsers allow you to disable third-party cookies, though it’s not yet the default. If enough people do it—and complain about the sites that stop working when third-party cookies are disabled—then maybe it can become the default.

Firefox is taking steps in this direction, automatically disabling some third-party cookies—the ones from known trackers. Safari is also taking steps to prevent cross-site tracking. It’s not too late to change the tide of third-party cookies.

Then there’s third-party JavaScript.

In retrospect, it seems unbelievable that third-party JavaScript is even possible. I mean, putting arbitrary code—that can then inject even more arbitrary code—onto your website? That seems like a security nightmare!

I imagine if JavaScript were being specced today, it would almost certainly be restricted to the same origin by default. But I guess the precedent had been set with images and style sheets: they could be embedded regardless of whether their domain names matched yours. Still, this is executable code we’re talking about here: that’s quite a footgun that the web has given site owners. And boy, oh boy, has it been used by the worst people to do the most damage.
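
Site owners who want that same-origin restriction today can at least impose it on themselves with a Content-Security-Policy header, which blocks any script that isn’t served from the page’s own origin:

```http
Content-Security-Policy: script-src 'self'
```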

Again, as with cookies, if we were to imagine what the web would be like if JavaScript was restricted by a same-domain policy, there are certainly things that would be trickier to do.

  • Embedding video, audio, and maps would get a lot finickier.
  • Analytics would need to be self-hosted. I don’t think that would bother any site owners. An analytics platform like Google Analytics that tracks people across domains is doing it for its own benefit rather than that of site owners.
  • Advertising wouldn’t be creepy and annoying. Instead of what’s so euphemistically called “personalisation”, advertisers would have to rely on serving relevant ads based on the content of the site rather than an invasive psychological profile of the user. (I honestly think that advertisers would benefit from this kind of targeting.)

It’s harder to imagine putting the genie back in the bottle when it comes to third-party JavaScript than it is with third-party cookies. All the same, I wish that browsers made it easier to experiment with it. Just as I can choose to accept all cookies, reject all cookies, or only accept same-origin cookies, I wish I could accept all JavaScript, reject all JavaScript, or only accept same-origin JavaScript.

As it is, browsers are making it harder and harder to exercise any control over JavaScript at all. So we reach for third-party tools. We don’t call them JavaScript managers though. We call them ad blockers. But honestly, most of the ad-blocker users I know—myself included—are not bothered by the advertising; we’re bothered by the tracking. We should really call them surveillance blockers.

If third-party JavaScript weren’t the norm, not only would it make the web more secure, it would make it way more performant. Read the chapter on third parties in this year’s newly-released Web Almanac. The figures are staggering.

93% of pages include at least one third-party resource, 76% of pages issue a request to an analytics domain, the median page requests content from at least 9 unique third-party domains that represent 35% of their total network activity, and the most active 10% of pages issue a whopping 175 third-party requests or more.

I don’t think all the web’s performance ills are due to third-party scripts; developers are doing a bang-up job of making their sites big and bloated with their own self-hosted frameworks and code. But as long as third-party JavaScript is allowed onto a site, there’s a limit to how much good developers can do to improve the performance of their sites.

I go to performance-related conferences and you know who I’ve never seen at those events? The people who write the JavaScript for third-party tracking scripts. Those developers are wielding an outsized influence on the health of the web.

I’m very happy to see the work being done by Mozilla and Apple to normalise the idea of rejecting third-party cookies. I’d love to see the rejection of third-party JavaScript normalised in the same way. I know that it would make my life as a developer harder. But that’s of lesser importance. It would be better for the web.

The Department of Useless Images - Gerry McGovern

The Web is smothering in useless images. These clichéd, stock images communicate absolutely nothing of value, interest or use. They are one of the worst forms of digital pollution because they take up space on the page, forcing more useful content out of sight. They also slow down the site’s ability to download quickly.

Chromium Blog: Moving towards a faster web

It’s nice to see that the Chrome browser will add interface enhancements to show whether you can expect a site to load fast or slowly.

Just a shame that the Google search team aren’t doing this kind of badging …unless you’ve given up on your website and decided to use Google AMP instead.

Maybe the Chrome team can figure out what the AMP team are doing to get such preferential treatment from the search team.

Monday, November 11th, 2019

FF Conf 2019

Friday was FF Conf day here in Brighton. This was the eleventh(!) time that Remy and Julie have put on the event. It was, as ever, excellent.

It’s a conference that ticks all the boxes for me. For starters, it’s a single-track event. The more I attend conferences, the more convinced I am that multi-track events are a terrible waste of time for attendees (and a financially bad model for organisers). I know that sounds like a sweeping generalisation, but ask me about it next time we meet and I’ll go into more detail. For now, I just want to talk about this mercifully single-track conference.

FF Conf has built up a rock-solid reputation over the years. I think that’s down to how Remy curates it. He thinks about what he wants to know and learn more about, and then thinks about who to invite to speak on those topics. So every year is like a snapshot of Remy’s brain. By happy coincidence, a snapshot of Remy’s brain right now looks a lot like my own.

You could tell that Remy had grouped the talks together in themes. There was a performance-themed chunk right after lunch. There was a people-themed chunk in the morning. There was a creative-coding chunk at the end of the day. Nice work, DJ.

I think it was quite telling what wasn’t on the line-up. There were no talks about specific libraries or frameworks. For me, that was a blessed relief. The only technology-specific talk was Alice’s excellent talk on Git—a tool that’s useful no matter what you’re coding.

One of the reasons why I enjoyed the framework-free nature of the day is that most talks—and conferences—that revolve around libraries and frameworks are invariably focused on the developer experience. Think about it: next time you’re watching a talk about a framework or library, ask yourself how it impacts user experience.

At FF Conf, the focus was firmly on people. In the case of Laura’s barnstorming presentation, those people are end users (I’m constantly impressed by how calm and measured Laura remains even when talking about blood-boilingly bad behaviour from the tech industry). In the case of Amina’s talk, the people are junior developers. And for Sharon’s presentation, the people are everyone.

One of the most useful talks of the day was from Anna who took us on a guided tour of dev tools to identify performance improvements. I found it inspiring in a very literal sense—if I had my laptop with me, I think I would’ve opened it up there and then and started tinkering with my websites.

Harry also talked about performance, but at Remy’s request, it was more business focused. Specifically, it was focused on Harry’s consultancy business. I think this would’ve been the perfect talk for more of an “industry” event, whereas FF Conf is very much a community event: Harry’s semi-serious jibes about keeping his performance secrets under wraps didn’t quite match the generous tone of the rest of the line-up.

The final two talks from Charlotte and Suz were a perfect double whammy.

When I saw Charlotte speak at Material in Iceland last year, I wrote this aside in my blog post summary:

(Oh, and Remy, when you start to put together the line-up for next year’s FF Conf, be sure to check out Charlotte Dann—her talk at Material was the perfect mix of code and creativity.)

I don’t think I can take credit for Charlotte being on the line-up, but I will take credit for saying she’d be the perfect fit.

And then Suz Hinton closed out the conference with this rallying cry that resonated perfectly with Laura’s talk:

Less mass-produced surveillance bullshit and more Harry Potter magic (please)!

I think that rallying cry could apply equally well to conferences, and I think FF Conf is a good example of that ethos in action.

JavaScript | 2019 | The Web Almanac by HTTP Archive

It’s time for a look at the state of the web when it comes to JavaScript usage. Here’s the report powered by data from HTTP Archive:

JavaScript is the most costly resource we send to browsers; having to be downloaded, parsed, compiled, and finally executed. Although browsers have significantly decreased the time it takes to parse and compile scripts, download and execution have become the most expensive stages when JavaScript is processed by a web page.

Sending smaller JavaScript bundles to the browser is the best way to reduce download times, and in turn improve page performance. But how much JavaScript do we really use?

When it comes to frameworks and UI libraries, there are some interesting numbers. Given the volume of chatter in the dev world, you’d be forgiven for thinking that React is used on the majority of websites today. The real number? 4.6% of websites. That’s less than the number of websites using CSS custom properties.

This is reminding me of what I wrote about dev perception.

Friday, November 1st, 2019

Location, Privilege and Performant Websites

Testing on a <$100 Android device on a 3G network should be an integral part of testing your website. Not everyone is on a brand-new device or upgrades often, especially with the price point of high-end phones these days.

When we design and build our websites with the outliers in mind, whether it’s for performance or even user experience, we build an experience that can be easy for all to access and use — and that’s what the web is about, access and information for all.