Is my host fast yet?
This is an interesting project to try to rank web hosts by performance:
Real-world server response (Time to First Byte) latencies, as experienced by real-world users navigating the web.
This is a great progressive enhancement for performance that uses a service worker to combine reusable bits of a page with fresh content. The numbers are very convincing!
Alas, the code is using the Workbox library, but figuring out the vanilla code to write shouldn’t be too tricky seeing as Philip talks through his logic step by step.
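Something like this vanilla sketch might get you most of the way there. The shell file names and the ?fragment=content URL are my own guesses rather than Philip's, and unlike his version this naive one buffers the pieces instead of streaming them:

    /// <reference lib="webworker" />
    export {};
    declare const self: ServiceWorkerGlobalScope;

    const SHELL_CACHE = 'shell-v1';

    // Cache the reusable header and footer fragments at install time.
    self.addEventListener('install', (event) => {
      event.waitUntil(
        caches.open(SHELL_CACHE).then((cache) =>
          cache.addAll(['/shell-start.html', '/shell-end.html'])
        )
      );
    });

    self.addEventListener('fetch', (event) => {
      const url = new URL(event.request.url);
      // Only step in for same-origin page navigations.
      if (event.request.mode !== 'navigate' || url.origin !== self.location.origin) {
        return;
      }

      event.respondWith((async () => {
        const cache = await caches.open(SHELL_CACHE);
        const [start, end] = await Promise.all([
          cache.match('/shell-start.html'),
          cache.match('/shell-end.html'),
        ]);
        // If the shell isn't cached yet, fall back to a normal page load.
        if (!start || !end) {
          return fetch(event.request);
        }
        // Fetch just the fresh content for this URL from the network
        // (assumes the server can serve a content-only partial here).
        const content = await fetch(`${url.pathname}?fragment=content`);
        const html =
          (await start.text()) + (await content.text()) + (await end.text());
        return new Response(html, {
          headers: { 'Content-Type': 'text/html' },
        });
      })());
    });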
Harry breaks down cache-control headers into steps that even I can understand. I’ll be using this as a reference for sure.
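A few of the recipes, roughly as I understand them, in the shape of a little Node sketch. The paths and numbers are made up; the directives are the point, and the right values depend entirely on how each resource changes:

    import { createServer } from 'node:http';

    const server = createServer((req, res) => {
      if (req.url?.startsWith('/static/')) {
        // Fingerprinted assets never change, so cache them "forever".
        res.setHeader('Cache-Control', 'max-age=31536000, immutable');
      } else if (req.url?.startsWith('/account')) {
        // Personalised pages shouldn't be stored by any cache at all.
        res.setHeader('Cache-Control', 'no-store');
      } else if (req.url?.startsWith('/api/')) {
        // Cache briefly, and allow a stale copy while revalidating in the background.
        res.setHeader('Cache-Control', 'max-age=60, stale-while-revalidate=3600');
      } else {
        // HTML: caches may keep a copy, but must check with the server before using it.
        res.setHeader('Cache-Control', 'no-cache');
      }
      res.end('ok');
    });

    server.listen(8080);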
I remember Jason telling me about this weird service worker caching behaviour a little while back. This piece is a great bit of sleuthing in tracking down the root causes of this strange issue, followed up with a sensible solution.
Really smart thinking from Stuart on how the randomised response technique could be applied to analytics. My only question is who exactly does the implementation.
The key point here is that, if you’re collecting data about a load of users, you’re usually doing so in order to look at it in aggregate; to draw conclusions about the general trends and the general distribution of your user base. And it’s possible to do that data collection in ways that maintain the aggregate properties of it while making it hard or impossible for the company to use it to target individual users. That’s what we want here: some way that the company can still draw correct conclusions from all the data when collected together, while preventing them from targeting individuals or knowing what a specific person said.
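The classic version of the technique is easy enough to sketch. This is the textbook coin-flip variant, not necessarily what Stuart has in mind:

    // Each user runs this on their own device before anything is sent off.
    // Half the time the true answer goes out; half the time a coin flip does.
    function randomisedAnswer(truth: boolean): boolean {
      if (Math.random() < 0.5) {
        return truth;
      }
      return Math.random() < 0.5;
    }

    // The company only ever sees the noisy answers, so no individual report
    // can be trusted — but in aggregate:
    //   P(reported yes) = 0.5 × trueRate + 0.25
    // so the true rate can still be recovered from the observed rate.
    function estimateTrueRate(reports: boolean[]): number {
      const observedRate = reports.filter(Boolean).length / reports.length;
      return 2 * observedRate - 0.5;
    }

    // e.g. if 30% of users really answered "yes", the observed rate will hover
    // around 0.5 × 0.3 + 0.25 = 0.4, and estimateTrueRate recovers roughly 0.3.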
I love Tim Bray’s idea for naming the response code for censored content on the internet in honour of Ray Bradbury.
I'm the world's worst emailer. This may help me.