Jeremy Keith

Making websites. Writing books. Speaking at events. Living in Brighton. Working at Clearleft. Playing music. Taking photos. Answering email.

Monday, September 28th, 2020

Chris Ferdinandi: The Lean Web | July 2020 - YouTube

A great presentation on taking a sensible approach to web development. Great advice, as always, from the blogging machine that is Chris Ferdinandi.

The web is a bloated, over-engineered mess. And, according to developer and educator Chris Ferdinandi, many of our modern “best practices” are actually making the web worse. In this talk, Chris explores The Lean Web, a set of principles for a simpler, faster world-wide web.

Sunday, September 27th, 2020

The case for rereading | A Working Library

Reading, especially fiction, is often referred to as an escape, but I’ve never believed that. It’s true that a great story transports you somewhere else, that returning to your life afterwards can feel like an abrupt reentry. But I think that’s less because you escaped the real world, however briefly, and more that you got a clearer look at it. A great book rearranges time: it brings both history and speculative futures into the present, into a now you can occupy and taste and feel.

‘Real’ Programming Is an Elitist Myth | WIRED

The title says it all, really. This is another great piece of writing from Paul Ford.

I’ve noticed that when software lets nonprogrammers do programmer things, it makes the programmers nervous. Suddenly they stop smiling indulgently and start talking about what “real programming” is. This has been the history of the World Wide Web, for example. Go ahead and tweet “HTML is real programming,” and watch programmers show up in your mentions to go, “As if.” Except when you write a web page in HTML, you are creating a data model that will be interpreted by the browser. This is what programming is.

Saturday, September 26th, 2020

On not choosing WordPress for the W3C redesign project - Working in the open with W3C and Studio 24

The use of React complicates front-end build. We have very talented front-end developers, however, they are not React experts - nor should they need to be. I believe front-end should be built as standards-compliant HTML/CSS with JavaScript used to enrich functionality where necessary and appropriate.

Thursday, September 24th, 2020

Performance and people

I was helping a client with a bit of a performance audit this week. I really, really enjoy this work. It’s such a nice opportunity to get my hands in the soil of a website, so to speak, and suggest changes that will have a measurable effect on the user’s experience.

Not only is web performance a user experience issue, it may well be the user experience issue. Page speed has a demonstrable, direct effect on user experience (and on revenue, customer satisfaction, and whatever other metrics you’re using).

It struck me that there’s a continuum of performance challenges. On one end of the continuum, you’ve got technical issues. These can be solved with technical solutions. On the other end of the continuum, you’ve got human issues. These can be solved with discussions, agreement, empathy, and conversations (often dreaded or awkward).

I think that, as developers, we tend to gravitate towards the technical issues. That’s our safe space. But I suspect that bigger gains can be reaped by tackling the uncomfortable human issues.

This week, for example, I uncovered three performance issues. One was definitely technical. One was definitely human. One was halfway between.

The technical issue was with web fonts. It’s a lot of fun to dive into this aspect of web performance because quite often there’s some low-hanging fruit: a relatively simple technical fix that will boost the performance (or perceived performance) of a website. That might be through resource hints (using link rel="preload" in the HTML) or adjusting the font loading (using font-display in the CSS) or even nerdier stuff like subsetting.
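
For example, a preload hint for a web font might look something like this (the file path is a placeholder, and note that font preloads need the crossorigin attribute even when the font is served from your own domain):

    <link rel="preload" href="/fonts/example.woff2" as="font" type="font/woff2" crossorigin>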

In this case, the issue was with the file format of the font itself. By switching to woff2, there were significant file size savings. And the great thing is that @font-face rules allow you to specify multiple file formats so you can still support older browsers that can’t handle woff2. A win all ’round!
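
As a rough sketch (the file paths and family name are placeholders), a rule along these lines serves woff2 to browsers that understand it and falls back to woff for those that don’t:

    <style>
      @font-face {
        font-family: "Example";
        src: url("/fonts/example.woff2") format("woff2"),
             url("/fonts/example.woff") format("woff");
        /* show fallback text straight away, then swap in the web font once it loads */
        font-display: swap;
      }
    </style>

The browser works through the src list and downloads the first format it supports, so woff2-capable browsers never touch the larger file.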

The performance issue that was right in the middle of the technical/human continuum was with images. At first glance it looked like a similar issue to the fonts. Some images were being served in the wrong formats. When I say “wrong”, I guess I mean inappropriate. A photographic image, for example, is probably going to be best served as a JPG rather than a PNG.
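
As a hedged sketch of what serving the appropriate format can look like in the markup (the file names here are placeholders): the picture element lets you offer more than one format, and the browser picks the first one it supports, falling back to the img element otherwise.

    <picture>
      <source srcset="/images/photo.webp" type="image/webp">
      <!-- the JPG in the img element is the fallback for browsers without WebP support -->
      <img src="/images/photo.jpg" alt="A description of the photo" width="1200" height="800">
    </picture>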

But unlike the fonts, the images weren’t in the direct control of the developers. These images were coming from a Content Management System. And while there’s a certain amount of processing you can do on the server, a human still makes the decision about what file format they’re uploading.

I’ve seen this happen at Clearleft. We launched an event site with lean, performant code, but then someone uploaded an image that was megabytes in size. The solution in that case wasn’t technical. We realised there was a knowledge gap around image file formats—which, let’s face it, is kind of a techy topic that most normal people shouldn’t be expected to know.

But it was extremely gratifying to see that people were genuinely interested in knowing a bit more about choosing the right format for the right image. I was able to provide a few rules of thumb and point to free software for converting images. It empowered those people to feel more confident using the Content Management System.

It was a similar situation with the client site I was looking at this week. Nobody is uploading oversized images in order to deliberately make the site slower. They probably don’t realise the difference that image formats can make. By having a discussion and giving them some pointers, they’ll have more knowledge and the site will be faster. Another win all ’round!

At the other end of the continuum was an issue that wasn’t technical. From a technical point of view, there was just one teeny weeny little script. But that little script is Google Tag Manager, which then calls many, many other scripts that are not so teeny weeny. Third-party scripts …the bane of web performance!

In retrospect, it seems unbelievable that third-party JavaScript is even possible. I mean, putting arbitrary code—that can then inject even more arbitrary code—onto your website? That seems like a security nightmare!
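
To illustrate the point with a generic sketch (this is not the actual Tag Manager code, and the URL is a placeholder): one small embedded script is free to create and inject further script elements, and each of those can do the same again.

    <script>
      // the one teeny weeny script on the page…
      var injected = document.createElement('script');
      // …can pull in arbitrary code from somewhere else entirely
      injected.src = 'https://third-party.example.com/even-more-scripts.js';
      document.head.appendChild(injected);
      // …and whatever it loads can repeat the trick, injecting yet more scripts
    </script>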

Remember when I did a countdown of the top four web performance challenges? In the number one spot: other people’s JavaScript.

Now one technical solution would be to remove the Google Tag Manager script. But that’s probably not very practical—you’ll just piss off some other department. That said, if you can’t find out which department was responsible for adding the Google Tag Manager script in the first place, it might well be an option to remove it and then wait and see who complains. If no one notices it’s gone, job done!

More realistically, there’s someone who’s added that Google Tag Manager script for their own valid reasons. You’ll need to talk to them and understand their needs.

Again, as with images uploaded in a Content Management System, they may not be aware of the performance problems caused by third-party scripts. You could try throwing numbers at them, but I think you get better results by telling the story of performance.

Use tools like Request Map Generator to help them visualise the impact that third-party scripts are having. Talk to them. More importantly, listen to them. Find out why those scripts are being requested. What are the outcomes they’re working towards? Can you offer an alternative way of providing the data they need?

I think many of us developers are intimidated or apprehensive about approaching people to have those conversations. But it’s necessary. And in its own way, it can be as rewarding as tinkering with code. If the end result is a faster website, then the work is definitely worth doing—whether it’s technical work or people work.

Personally, I just really enjoy working on anything that will end up improving a website’s performance, and by extension, the user experience. If you fancy working with me on your site, you should get in touch with Clearleft.

15 years of Clearleft

Ah, look at this beautiful timeline that Cassie designed and built—so many beautiful little touches! It covers the fifteen years(!) of Clearleft so far.

But you can also contribute to it …by looking ahead to the next fifteen years:

Let’s imagine it’s 2035…

How do you hope the practice of design will have changed for the better?

Fill out an online postcard with your hopes for the future.

The failed promise of Web Components – Lea Verou

A spot-on summary of where we’ve ended up with web components.

Web Components had so much potential to empower HTML to do more, and make web development more accessible to non-programmers and easier for programmers.

But then…

Somewhere along the way, the space got flooded by JS frameworks aficionados, who revel in complex APIs, overengineered build processes and dependency graphs that look like the roots of a banyan tree.

Alas, that’s true. Lea wonders how this can be fixed:

I’m not sure if this is a design issue, or a documentation issue.

I worry that it’s a cultural issue.

Using a custom element from the directory often needs to be preceded by a ritual of npm flugelhorn, import clownshoes, build quux, all completely unapologetically because “here is my truckload of dependencies, yeah, what”.

Wednesday, September 23rd, 2020

Blacklight – The Markup

This is an excellent new tool for showing exactly what kind of tracking a site is doing:

Who is peeking over your shoulder while you work, watch videos, learn, explore, and shop on the internet? Enter the address of any website, and Blacklight will scan it and reveal the specific user-tracking technologies on the site—and who’s getting your data. You may be surprised at what you learn.

Best of all, you can inspect the raw data and analyse the methodology.

There are some accompanying explainers:

Tuesday, September 22nd, 2020

The Economics of the Front-End - Jim Nielsen’s Weblog

I do think large tech companies employ JavaScript frameworks because, amongst other things, it saves them money at their scale. And what Big Tech does trickles down in the form of default choices for many others (“they’re doing it and are insanely successful, so mimicking them can’t be a bad idea”). However, the scale at which smaller projects operate doesn’t necessarily translate to the same kind of cost savings.