Tags: party

Friday, December 18th, 2020

No cookie for you - The GitHub Blog

I wish more companies would realise that this is a perfectly reasonable approach to take:

We decided to look for a solution. After a brief search, we found one: just don’t use any non-essential cookies. Pretty simple, really. 🤔

So, we have removed all non-essential cookies from GitHub, and visiting our website does not send any information to third-party analytics services.

Monday, December 14th, 2020

Cloudflare’s privacy-first Web Analytics is now available for everyone

Cloudflare’s alternative to Google Analytics is now available—for free—regardless of whether you’re a Cloudflare customer or not:

Being privacy-first means we don’t track individual users for the purposes of serving analytics. We don’t use any client-side state (like cookies or localStorage) for analytics purposes. Cloudflare also doesn’t track users over time via their IP address, User Agent string, or any other immutable attributes for the purposes of displaying analytics — we consider “fingerprinting” even more intrusive than cookies, because users have no way to opt out.

Thursday, October 29th, 2020

Phantom Analyzer

A simple, real-time website scanner to see what invisible creepers are lurking in the shadows and collecting information about you.

Looks good for adactio.com, thesession.org, and huffduffer.com …but clearleft.com is letting the side down.

Thursday, October 8th, 2020

Parties and browsers

Tess calls for more precise language—like “site” and “origin”—when talking about browsers and resources:

When talking about web features with security or privacy impact, folks often talk about “first parties” and “third parties”. Everyone sort of knows what we mean when we use these terms, but it turns out that we often mean different things, and what we each think these terms mean usually doesn’t map cleanly onto the technical mechanisms browsers actually use to distinguish different actors for security or privacy purposes.

Personally, rather than say “third-party JavaScript”, I prefer the more squirm-inducing and brutally honest phrase “other people’s JavaScript”.

Thursday, September 24th, 2020

Performance and people

I was helping a client with a bit of a performance audit this week. I really, really enjoy this work. It’s such a nice opportunity to get my hands in the soil of a website, so to speak, and suggest changes that will have a measurable effect on the user’s experience.

Not only is web performance a user experience issue, it may well be the user experience issue. Page speed has a proven demonstrable direct effect on user experience (and revenue and customer satisfaction and whatever other metrics you’re using).

It struck me that there’s a continuum of performance challenges. On one end of the continuum, you’ve got technical issues. These can be solved with technical solutions. On the other end of the continuum, you’ve got human issues. These can be solved with discussions, agreement, empathy, and conversations (often dreaded or awkward).

I think that, as developers, we tend to gravitate towards the technical issues. That’s our safe space. But I suspect that bigger gains can be reaped by tackling the uncomfortable human issues.

This week, for example, I uncovered three performance issues. One was definitely technical. One was definitely human. One was halfway between.

The technical issue was with web fonts. It’s a lot of fun to dive into this aspect of web performance because quite often there’s some low-hanging fruit: a relatively simple technical fix that will boost the performance (or perceived performance) of a website. That might be through resource hints (using link rel="preload" in the HTML) or adjusting the font loading (using font-display in the CSS) or even nerdier stuff like subsetting.

In this case, the issue was with the file format of the font itself. By switching to woff2, there were significant file size savings. And the great thing is that @font-face rules allow you to specify multiple file formats so you can still support older browsers that can’t handle woff2. A win all ‘round!
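
By way of illustration, here’s a minimal sketch of that kind of fix (the font name and file paths are placeholders, not the client’s actual setup):

```html
<!-- Sketch only: "MyWebFont" and the file paths are invented. -->

<!-- Resource hint in the head: ask the browser to fetch the woff2 file early.
     Note that preloaded fonts need the crossorigin attribute. -->
<link rel="preload" href="/fonts/mywebfont.woff2" as="font" type="font/woff2" crossorigin>

<style>
  /* woff2 first, with a woff fallback for older browsers;
     font-display keeps text visible while the font loads. */
  @font-face {
    font-family: "MyWebFont";
    src: url("/fonts/mywebfont.woff2") format("woff2"),
         url("/fonts/mywebfont.woff") format("woff");
    font-display: swap;
  }
</style>
```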

The performance issue that was right in the middle of the technical/human continuum was with images. At first glance it looked like a similar issue to the fonts. Some images were being served in the wrong formats. When I say “wrong”, I guess I mean inappropriate. A photographic image, for example, is probably going to be best served as a JPG rather than a PNG.
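
To make the “right format for the right image” idea concrete, here’s a hedged sketch (the file names are invented): serve a photo as a JPG, and offer a leaner WebP version to browsers that support it.

```html
<!-- Sketch only: file names are made up. Browsers that understand WebP
     use the source element; everything else falls back to the JPG. -->
<picture>
  <source srcset="/images/team-photo.webp" type="image/webp">
  <img src="/images/team-photo.jpg"
       alt="The team gathered outside the studio"
       width="1200" height="800">
</picture>
```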

But unlike the fonts, the images weren’t in the direct control of the developers. These images were coming from a Content Management System. And while there’s a certain amount of processing you can do on the server, a human still makes the decision about what file format they’re uploading.

I’ve seen this happen at Clearleft. We launched an event site with lean performant code, but then someone uploaded an image that was megabytes in size. The solution in that case wasn’t technical. We realised there was a knowledge gap around image file formats—which, let’s face it, is kind of a techy topic that most normal people shouldn’t be expected to know.

But it was extremely gratifying to see that people were genuinely interested in knowing a bit more about choosing the right format for the right image. I was able to provide a few rules of thumb and point to free software for converting images. It empowered those people to feel more confident using the Content Management System.

It was a similar situation with the client site I was looking at this week. Nobody is uploading oversized images in order to deliberately make the site slower. They probably don’t realise the difference that image formats can make. By having a discussion and giving them some pointers, they’ll have more knowledge and the site will be faster. Another win all ‘round!

At the other end of the continuum was an issue that wasn’t technical. From a technical point of view, there was just one teeny weeny little script. But that little script is Google Tag Manager, which then calls many, many other scripts that are not so teeny weeny. Third-party scripts …the bane of web performance!

In retrospect, it seems unbelievable that third-party JavaScript is even possible. I mean, putting arbitrary code—that can then inject even more arbitrary code—onto your website? That seems like a security nightmare!

Remember when I did a countdown of the top four web performance challenges? At the number one spot is other people’s JavaScript.

Now one technical solution would be to remove the Google Tag Manager script. But that’s probably not very practical—you’ll probably just piss off some other department. That said, if you can’t find out which department was responsible for adding the Google Tag Manager script in the first place, it might well be an option to remove it and then wait and see who complains. If no one notices it’s gone, job done!

More realistically, there’s someone who’s added that Google Tag Manager script for their own valid reasons. You’ll need to talk to them and understand their needs.

Again, as with images uploaded in a Content Management System, they may not be aware of the performance problems caused by third-party scripts. You could try throwing numbers at them, but I think you get better results by telling the story of performance.

Use tools like Request Map Generator to help them visualise the impact that third-party scripts are having. Talk to them. More importantly, listen to them. Find out why those scripts are being requested. What are the outcomes they’re working towards? Can you offer an alternative way of providing the data they need?

I think many of us developers are intimidated or apprehensive about approaching people to have those conversations. But it’s necessary. And in its own way, it can be as rewarding as tinkering with code. If the end result is a faster website, then the work is definitely worth doing—whether it’s technical work or people work.

Personally, I just really enjoy working on anything that will end up improving a website’s performance, and by extension, the user experience. If you fancy working with me on your site, you should get in touch with Clearleft.

Wednesday, September 23rd, 2020

Blacklight – The Markup

This is an excellent new tool for showing exactly what kind of tracking a site is doing:

Who is peeking over your shoulder while you work, watch videos, learn, explore, and shop on the internet? Enter the address of any website, and Blacklight will scan it and reveal the specific user-tracking technologies on the site—and who’s getting your data. You may be surprised at what you learn.

Best of all, you can inspect the raw data and analyse the methodology.

There are some accompanying explainers, too.

Thursday, January 2nd, 2020

Making a ‘post-it game’ PWA with mobile accelerometer API’s | Trys Mudford

I made an offhand remark at the Clearleft Christmas party and Trys ran with it…

Thursday, November 21st, 2019

Surveillance giants: How the business model of Google and Facebook threatens human rights | Amnesty International

Amnesty International have released a PDF report on the out-of-control surveillance perpetrated by Google and Facebook:

Google and Facebook’s platforms come at a systemic cost. The companies’ surveillance-based business model forces people to make a Faustian bargain, whereby they are only able to enjoy their human rights online by submitting to a system predicated on human rights abuse. Firstly, an assault on the right to privacy on an unprecedented scale, and then a series of knock-on effects that pose a serious risk to a range of other rights, from freedom of expression and opinion, to freedom of thought and the right to non-discrimination.

However…

This page on the Amnesty International website has six tracking scripts. Also, consent to accept tracking cookies is assumed (check dev tools). It looks like you can reject marketing cookies, but I tried that without any success.

The stone PDF has been thrown from a very badly-performing glass house.

Wednesday, October 2nd, 2019

Same-Site Cookies By Default | text/plain

This is good news. I have third-party cookies disabled in my browser, and I’m very happy that it will become the default.

It’s hard to believe that we ever allowed third-party cookies and scripts in the first place. Between them, they’re responsible for the worst ills of the World Wide Web.

Friday, September 13th, 2019

5G Will Definitely Make the Web Slower, Maybe | Filament Group, Inc.

The Jevons Paradox in action:

Faster networks should fix our performance problems, but so far, they have had an interesting if unintentional impact on the web. This is because historically, faster network speed has enabled developers to deliver more code to users—in particular, more JavaScript code.

And because it’s JavaScript we’re talking about:

Even if folks are on a new fast network, they’re very likely choking on the code we’re sending, rendering the potential speed improvements of 5G moot.

The longer I spend in this field, the more convinced I am that web performance is not a technical problem; it’s a people problem.

Tuesday, September 10th, 2019

Request mapping

The Request Map Generator is a terrific tool. It’s made by Simon Hearne and uses the WebPageTest API.

You pop in a URL, it fetches the page and maps out all the subsequent requests in a nifty interactive diagram of circles, showing how many requests third-party scripts are themselves generating. I’ve found it to be a very effective way of showing the impact of third-party scripts to people who aren’t interested in looking at waterfall diagrams.

I was wondering… Wouldn’t it be great if this were built into browsers?

We already have a “Network” tab in our developer tools. The purpose of this tab is to show requests coming in. The browser already has all the information it needs to make a diagram of requests in the same way that the Request Map Generator does.

In Firefox, there’s a little clock icon in the bottom left corner of the “Network” tab. Clicking that shows a pie-chart view of requests. That’s useful, but I’d love it if there were the option to also see the connected circles that the request map generator shows.

Just a thought.

Friday, August 23rd, 2019

Is client side A/B testing always a bad idea in your experience? · Issue #53 · csswizardry/ama

Harry enumerates the reasons why client-side A/B testing is terrible:

  • It typically blocks rendering.
  • Providers are almost always off-site.
  • It happens on every page load.
  • No user-benefitting reuse.
  • They likely skip any governance process.

While your engineers are subject to linting, code-reviews, tests, auditors, and more, your marketing team have free rein of the front-end.

Note that the problem here is not A/B testing per se, it’s client-side A/B testing. For some reason, we seem to have collectively decided that A/B testing—like analytics—is something we should offload to the JavaScript parser in the user’s browser.

Monday, June 3rd, 2019

Self-Host Your Static Assets – CSS Wizardry

Trust no one! Harry enumerates the reasons why you should be self-hosting your assets (and busts some myths along the way).

There really is very little reason to leave your static assets on anyone else’s infrastructure. The perceived benefits are often a myth, and even if they weren’t, the trade-offs simply aren’t worth it. Loading assets from multiple origins is demonstrably slower.
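
As a rough before-and-after sketch (the URLs here are hypothetical), self-hosting is often just a matter of copying the files across and swapping the third-party origin for your own:

```html
<!-- Before: assets pulled from someone else's origin (hypothetical URLs). -->
<link rel="stylesheet" href="https://cdn.example-provider.com/library/1.2.3/library.min.css">
<script src="https://cdn.example-provider.com/library/1.2.3/library.min.js"></script>

<!-- After: the same files copied to, and served from, your own origin. -->
<link rel="stylesheet" href="/assets/library.min.css">
<script src="/assets/library.min.js"></script>
```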

Tuesday, May 7th, 2019

Test the impact of ads and third party scripts

This is a very useful new feature in Calibre, the performance monitoring tool. Now you can get data about just how much third-party scripts are affecting your site’s performance:

The best way of circumventing fear and anxiety around third party script performance is to capture metrics that clearly articulate their performance impact.

Monday, March 18th, 2019

Hello, Goodbye - Browser Extension

A handy browser extension for Chrome and Firefox:

“Hello, Goodbye” blocks every chat or helpdesk pop up in your browser.

Thursday, September 20th, 2018

The costs and benefits of tracking scripts – business vs. user // Sebastian Greger

I am having a hard time seeing the business benefits weighing in more than the user cost (at least for those many organisations out there who rarely ever put that data to proper use). After all, keeping the costs low for the user should be in the core interest of the business as well.

Friday, September 14th, 2018

On using tracking scripts | justmarkup

Weighing up the pros and cons of adding tracking scripts to a website, from a business perspective and from a user perspective.

When looking at the costs versus the benefits it is hard to believe that almost every website is using tracking scripts.

The next time you implement a tracking script, it would be great if you could rethink it and ask yourself if it is really worth it.

Wednesday, September 12th, 2018

Private by Default

Feedbin has removed third-party iframes and JavaScript (oEmbed provides a nice alternative), as well as stripping out Google Analytics, and even web fonts that aren’t self-hosted. This is excellent!