Link tags: rendering

What is the Value of Browser Diversity? - daverupert.com

I’ve thought about these questions for over a year and narrowed my feelings of browser diversity down to two major value propositions:

  1. Browser diversity keeps the Web deliberately slow
  2. Browser diversity fosters consensus and cooperation over corporate rule

radEventListener: a Tale of Client-side Framework Performance | CSS-Tricks

Excellent research by Jeremy Wagner comparing the performance impact of React, Preact, and vanilla JavaScript. The results are simultaneously shocking and entirely unsurprising.

the Web at a crossroads - Web Directions

John weighs in on the clashing priorities of browser vendors.

Imagine if the web never got CSS. Never got a way to style content in sophisticated ways. It’s hard to imagine its rise to prominence in the early 2000s. I’d not be alone in arguing a similar lack of access to the sort of features inherent to the mobile experience that WebKit and the folks at Mozilla have expressed concern about would (not might) largely consign the Web to an increasingly marginal role.

We need more inclusive web performance metrics | Filament Group, Inc.

Good point. When we talk about perceived performance, the perception in question is almost always visual. We should think more inclusively than that.

as days pass by — Browsers are not rendering engines

You see, diversity of rendering engines isn’t actually in itself the point. What’s really important is diversity of influence: who has the ability to make decisions which shape the web in particular ways, and do they make those decisions for good reasons or not so good?

Stuart responds to a post from Brian that was riffing off a post of mine from a while back. I like this kind of social network.

Switching to Firefox | Brad Frost

Like Brad, I switched to Firefox for web browsing and Duck Duck Go for searching quite a while back. I highly recommend it.

Diary of an Engine Diversity Absolutist – Dan’s Blog

Dan responds to an extremely worrying sentiment from Alex:

The sentiment about “engine diversity” points to a growing mindset among (primarily) Google employees that are involved with the Chromium project that puts an emphasis on getting new features into Chromium as a much higher priority than working with other implementations.

Needless to say, I agree with this:

Proponents of a “move fast and break things” approach to the web tend to defend their approach as defending the web from the dominance of native applications. I absolutely think that situation would be worse right now if it weren’t for the pressure for wide review that multiple implementations have put on the web.

The web’s key differentiator is that it is a part of the commons and that it is multi-stakeholder in nature.

“Never-Slow Mode” (a.k.a. “Slightly-Fast Mode”) Explained

I would very much like this to become a reality.

Never-Slow Mode (“NSM”) is a mode that sites can opt into via HTTP header. For these sites, the browser imposes per-interaction resource limits, giving users a better user experience, potentially at the cost of extra developer work. We believe users are happier and more engaged on fast sites, and NSM attempts to make it easier for sites to guarantee speed to users. In addition to user experience benefits, sites might want to opt in because browsers could provide UI to users to indicate they are in “fast mode” (a TLS lock icon but for speed).
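
The explainer deliberately leaves the syntax open, but the opt-in would amount to a single response header. A minimal sketch (the header name and value here are hypothetical, not taken from the explainer):

    HTTP/1.1 200 OK
    Content-Type: text/html
    Never-Slow-Mode: 1

Once a site sends something like that, the browser would be free to enforce the per-interaction budgets, and to show whatever “fast mode” indicator it likes.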

Is client side A/B testing always a bad idea in your experience? · Issue #53 · csswizardry/ama

Harry enumerates the reasons why client-side A/B testing is terrible:

  • It typically blocks rendering.
  • Providers are almost always off-site.
  • It happens on every page load.
  • No user-benefitting reuse.
  • They likely skip any governance process.

While your engineers are subject to linting, code-reviews, tests, auditors, and more, your marketing team have free rein of the front-end.

Note that the problem here is not A/B testing per se, it’s client-side A/B testing. For some reason, we seem to have collectively decided that A/B testing—like analytics—is something we should offload to the JavaScript parser in the user’s browser.
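
To make the first of Harry’s bullets concrete: the typical integration (a generic sketch; the vendor URL is a placeholder, but the “anti-flicker” pattern is standard across providers) looks something like this:

    <head>
      <!-- Anti-flicker snippet: hide the entire page until the experiment
           script has picked a variant, or a timeout fires. -->
      <style>.async-hide { opacity: 0 !important }</style>
      <script>
        document.documentElement.className += ' async-hide';
        // Failsafe: reveal the page after 4s even if the vendor script never arrives.
        setTimeout(function () {
          document.documentElement.className =
            document.documentElement.className.replace(' async-hide', '');
        }, 4000);
      </script>
      <!-- Synchronous, third-party, and fetched on every page load. -->
      <script src="https://cdn.ab-vendor.example.com/experiments.js"></script>
    </head>

Everything the user sees is gated on a third-party response: render-blocking by design.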

Paint Holding - reducing the flash of white on same-origin navigations | Web | Google Developers

This is an excellent UX improvement in Chrome. For sites like The Session, where page loads are blazingly fast, this really makes them feel like single page apps.

Our goal with this work was for navigations in Chrome between two pages that are of the same origin to be seamless and thus deliver a fast default navigation experience with no flashes of white/solid-color background between old and new content.

This is exactly the kind of area where browsers can innovate and compete on the UX of the browser itself, rather than trying to compete on proprietary additions to what’s being rendered.

Breaking the physical limits of fonts

This broke my brain.

The challenge: in the fewest resources possible, render meaningful text.

  • How small can a font really go?
  • How many bytes of memory would you need (to store it and run it)?
  • How much code would it take to express it?

Let’s see just how far we can take this!
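
To get a feel for the arithmetic, here’s a back-of-the-envelope sketch of my own (not the article’s code): a monochrome 3×5 glyph is 15 pixels at one bit each, so an entire letter fits in two bytes.

    <canvas id="c" width="30" height="50"></canvas>
    <script>
      // “A” as a 3×5 bitmap, rows top to bottom, most significant bit first:
      //   .#.   010
      //   #.#   101
      //   ###   111
      //   #.#   101
      //   #.#   101
      const A = 0b010101111101101; // 15 bits, i.e. 0x2BED
      const ctx = document.getElementById('c').getContext('2d');
      for (let row = 0; row < 5; row++) {
        for (let col = 0; col < 3; col++) {
          if ((A >> (14 - (row * 3 + col))) & 1) {
            ctx.fillRect(col * 10, row * 10, 10, 10); // draw at 10× scale
          }
        }
      }
    </script>

At two bytes per glyph, the 96 printable ASCII characters come in under 200 bytes before you’ve written a single line of rendering code, and the article pushes far smaller than that.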

Preload, prefetch and other link tags: what they do and when to use them · PerfPerfPerf

Following on from Harry’s slides, here’s another round-up of those rel attribute values that begin with pre.
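
For quick reference, the three you’re most likely to reach for look like this in markup (the URLs are placeholders):

    <!-- preconnect: warm up DNS + TCP + TLS for an origin you’ll need soon. -->
    <link rel="preconnect" href="https://fonts.example.com" crossorigin>

    <!-- preload: fetch a critical resource for *this* page at high priority. -->
    <link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>

    <!-- prefetch: fetch a likely resource for a *future* navigation at idle priority. -->
    <link rel="prefetch" href="/next-page.html">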

More Than You Ever Wanted to Know About Resource Hints - Speaker Deck

Slides from Harry’s deep dive into rel values: preconnect, prefetch, and preload.

Disenchantment - Tim Novis

I would urge front-end developers to take a step back, breathe, and reassess. Let’s stop over-engineering for the sake of it. Let’s think what we can do with the basic tools, progressive enhancement and a simpler approach to building websites. There are absolutely valid use cases for SPAs, React, et al. and I’ll continue to use these tools regularly and when it’s necessary, I’m just not sure that’s 100% of the time.

Stuffing the Front End

53% of mobile visits leave a page that takes longer than 3 seconds to load. That means that a large number of visitors probably abandoned these sites because they were staring at a blank screen for 3 seconds, said “fuck it,” and left approximately half way before the page showed up. The fact that the next page interaction would have been quicker—assuming all the JS files even downloaded correctly in the first attempt—doesn’t amount to much if they didn’t stick around for the first page to load. What was gained by putting the business logic in the front end in this scenario?

Who has the fastest website in F1? - JakeArchibald.com

I think I physically winced on more than one occasion as I read through Jake’s report here.

He makes an interesting observation at the end:

However, none of the teams used any of the big modern frameworks. They’re mostly WordPress & Drupal, with a lot of jQuery. It makes me feel like I’ve been in a bubble in terms of the technologies that make up the bulk of the web.

Yes! This! Contrary to what you might think reading through the latest and greatest tips and tricks from the front-end community, the vast majority of sites out there on the web are not being built with React, Vue, webpack or any other “modern” tools.

194028 – Add limits to the amount of JavaScript that can be loaded by a website

Now this is a feature request I can get behind!

A user must provide permission to enable geolocation, or notifications, or camera access, so why not also require permission for megabytes of JavaScript that will block the main thread?

Without limits, there is no incentive for a JavaScript developer to keep their codebase small and dependencies minimal. It’s easy to add another framework, and that framework adds another framework, and the next thing you know you’re loading tens of megabytes of data just to display a couple hundred kilobytes of content.

I’m serious about this. It’s an excellent proposal for WebKit, similar to the Never-Slow Mode proposed by Alex for Chromium.

Goodbye, EdgeHTML - The Mozilla Blog

Mozilla comes out with all guns blazing:

Microsoft is officially giving up on an independent shared platform for the internet. By adopting Chromium, Microsoft hands over control of even more of online life to Google.