Saturday, March 10th, 2018

Technologist Hippocratic Oath | An optional oath for building ethically considered experiences

Everyone draws their lines in different ways and perhaps there is a spectrum of what is reasonable when implementing influential products. That’s exactly why technologists must seek to educate themselves on the patterns they are implementing in order to understand their psychological influence and other outcomes where intended use is not always the same as the reality of the user experience. Not only that, but we should feel empowered to speak up to authority when something crosses a line.

Wednesday, March 7th, 2018

Parallax scrolling with CSS variables | basicScroll

Don’t let the title fool you—this isn’t just for parallax scrolling (thank goodness!)—it’s for triggering any CSS updates based on scroll position. Using CSS custom properties makes a lot of sense. The JavaScript/CSS bridge enabled by custom properties is kind of their superpower. (That’s one of the reasons why I don’t like calling them “CSS variables” which makes them sound like Sass variables—they’re so much more than that!)
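As a rough sketch of that bridge in action (this is not basicScroll’s actual API—just the underlying idea, with illustrative names): update a custom property from JavaScript on scroll, and let the CSS react to it.

window.addEventListener('scroll', () => {
  // How far through the page we've scrolled, from 0 to 1.
  const progress = window.scrollY /
    (document.documentElement.scrollHeight - window.innerHeight);
  document.documentElement.style.setProperty('--scroll', progress);
}, {passive: true});

/* Then, in the CSS, something like:
   .hero {
     transform: translateY(calc(var(--scroll) * -100px));
   } */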

Measuring the Hard-to-Measure – CSS Wizardry

Everything old is new again—sometimes the age-old technique of using a 1x1 pixel image to log requests is still the only way to get certain metrics.

While tracking pixels are far from a new idea, there are creative ways in which we can use them to collect data useful to developers. Once the data is gathered, we can begin to make much more informed decisions about how we work.
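For what it’s worth, the technique itself fits in a few lines—a minimal sketch, assuming a hypothetical /pixel.gif endpoint whose requests get parsed out of the server’s access logs:

function logMetric(name, value) {
  // Requesting a 1x1 image with the data in the query string is enough;
  // the response is thrown away—the server log is the database.
  new Image().src = '/pixel.gif?metric=' + encodeURIComponent(name) +
    '&value=' + encodeURIComponent(value);
}

logMetric('screen-width', screen.width);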

Tuesday, March 6th, 2018

Minimal viable service worker

I really, really like service workers. They’re one of those technologies that have such clear benefits to users that it seems like a no-brainer to add a service worker to just about any website.

The thing is, every website is different. So the service worker strategy for every website needs to be different too.

Still, I was wondering if it would be possible to create a service worker script that would work for most websites. Here’s the script I came up with.

The logic works like this:

  • If there’s a request for an HTML page, fetch it from the network and store a copy in a cache (but if the network request fails, try looking in the cache instead).
  • For any other files, look for a copy in the cache first but meanwhile fetch a fresh version from the network to update the cache (and if there’s no existing version in the cache, fetch the file from the network and store a copy of it in the cache).

So HTML files are served network-first, while all other files are served cache-first, but in both cases a fresh copy is always put in the cache. The idea is that HTML content will always be fresh (unless there’s a problem with the network), while all other content—images, style sheets, scripts—might be slightly stale, but get refreshed with every request.
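Here’s a condensed sketch of that logic—not the script itself, which has more going on (the cache name here is illustrative):

const cacheName = 'files';

addEventListener('fetch', event => {
  const request = event.request;
  // HTML pages: network first, falling back to the cache.
  if ((request.headers.get('Accept') || '').includes('text/html')) {
    event.respondWith(
      fetch(request)
        .then(responseFromFetch => {
          // Stash a copy in the cache before returning the response.
          const copy = responseFromFetch.clone();
          caches.open(cacheName)
            .then(cache => cache.put(request, copy));
          return responseFromFetch;
        })
        .catch(() => caches.match(request))
    );
    return;
  }
  // Everything else: cache first, but update the cache from the network.
  event.respondWith(
    caches.match(request)
      .then(responseFromCache => {
        const fetchPromise = fetch(request)
          .then(responseFromFetch => {
            const copy = responseFromFetch.clone();
            caches.open(cacheName)
              .then(cache => cache.put(request, copy));
            return responseFromFetch;
          });
        return responseFromCache || fetchPromise;
      })
  );
});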

My original attempt was riddled with errors. Jake came to my rescue and we revised the script into something that actually worked. In the process, my misunderstanding of how await works led Jake to write a great blog post on await vs return vs return await.

I got there in the end and the script seems solid enough. It’s a fairly simplistic strategy that could work for quite a few sites, but it has some issues…

Service workers don’t perform any automatic cleanup of caches—that’s up to you to do (usually during the activate event). This script doesn’t do any cleanup so the cache might grow and grow and grow. For that reason, I think the script is best suited for fairly small sites.
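If you do want cleanup, the usual pattern looks something like this—a sketch, assuming you version your cache names:

const version = 'v2';
const cacheName = version + 'files';

addEventListener('activate', event => {
  event.waitUntil(
    // Delete any caches that don't match the current version.
    caches.keys()
      .then(keys => Promise.all(
        keys
          .filter(key => key !== cacheName)
          .map(key => caches.delete(key))
      ))
  );
});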

The strategy also assumes that a file will either be fetched from the network or the cache. There’s no contingency for when both attempts fail. So there’s no fallback offline page, for example.

I decided to test it in the wild, but I expanded it slightly to fix the fallback issue. The version on the Ampersand 2018 website includes a worst-case-scenario option to show a custom offline page that has been pre-cached. (By the way, if you haven’t got a ticket for Ampersand yet, get a ticket now—it’s going to be a superb day of web typography nerdery.)
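The shape of that fallback is roughly this—a sketch with an illustrative file name, not the exact Ampersand code:

addEventListener('install', event => {
  event.waitUntil(
    // Pre-cache the custom offline page during installation.
    caches.open('files')
      .then(cache => cache.add('/offline.html'))
  );
});

// ...and in the fetch handler's HTML branch, the final fallback becomes:
// .catch(() => caches.match(request)
//   .then(response => response || caches.match('/offline.html')))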

Anyway, this fairly basic script seems to be delivering some good performance improvements. If you’ve got a site that you think would benefit from this network/caching strategy, and it’s served over HTTPS, then:

  1. Feel free to download the script or copy and paste it into a file called serviceworker.js,
  2. Put that file in the root directory of your website,
  3. Add this in a script element at the bottom of your HTML pages:

if (navigator.serviceWorker && !navigator.serviceWorker.controller) {
  navigator.serviceWorker.register('/serviceworker.js');
}

You can also use the script as a starting point. You might find issues specific to your particular website. That’s okay—you can tweak and adjust the script to suit your needs.

If this minimal service worker script proves in any way useful to you, thank Jake.

Saturday, March 3rd, 2018

Battleship Solitaire: Mindless Podcast Companion

Once I got the hang of this game, I found it incredibly addictive. I would describe it as mindless fun, but I think it’s more like mindful fun: it has the same zen contemplative peacefulness as Sudoku. I can certainly see how it makes for a good activity while listening to podcasts.

Note: click once for water; double-click for ships. And don’t blame me if you lose hours of time to this game.

Thursday, March 1st, 2018

Third party CSS is not safe - JakeArchibald.com

We all know that adding a third-party script to your site is just asking for trouble. But Jake points out that adding a third-party anything to your site is a bad idea.

Trust no one.

Tuesday, February 27th, 2018

Responsive Components: a Solution to the Container Queries Problem — Philip Walton

Here’s a really smart approach to creating container queries today—it uses ResizeObserver to ensure that listening for size changes is nice and performant.

There’s a demo site you can play around with to see it in action.
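The gist of the approach, stripped right down (this isn’t Philip’s actual code—the breakpoint values and class names are illustrative):

const breakpoints = {small: 400, medium: 600};

const observer = new ResizeObserver(entries => {
  for (const entry of entries) {
    // Toggle classes based on the component's own width, not the viewport's.
    const width = entry.contentRect.width;
    entry.target.classList.toggle('component--small', width < breakpoints.small);
    entry.target.classList.toggle('component--medium',
      width >= breakpoints.small && width < breakpoints.medium);
    entry.target.classList.toggle('component--large', width >= breakpoints.medium);
  }
});

document.querySelectorAll('.component').forEach(el => observer.observe(el));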

While the strategy I outline in this post is production-ready, I see us as being still very much in the early stages of this space. As the web development community starts shifting its component design from viewport or device-oriented to container-oriented, I’m excited to see what possibilities and best practices emerge.

Let’s talk about usernames

This post goes into specifics on Django, but the broader points apply no matter what your tech stack. I’m relieved to find out that The Session is using the tripartite identity pattern (although Huffduffer, alas, isn’t):

What we really want in terms of identifying users is some combination of:

  1. System-level identifier, suitable for use as a target of foreign keys in our database
  2. Login identifier, suitable for use in performing a credential check
  3. Public identity, suitable for displaying to other users

Many systems ask the username to fulfill all three of these roles, which is probably wrong.
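In other words—to sketch it with illustrative field names—something like this, rather than one username column doing triple duty:

const user = {
  id: 42,                    // system-level identifier: the foreign-key target
  email: 'ada@example.com',  // login identifier: used for the credential check
  displayName: 'Ada'         // public identity: what other users see
};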

Monday, February 26th, 2018

Ends and means

The latest edition of the excellent History Of The Web newsletter is called The Day(s) The Web Fought Back. It recounts the first time that websites stood up against bad legislation in the form of the Communications Decency Act (CDA), and goes on to describe the even more effective blackout protests against SOPA and PIPA.

I remember feeling very heartened to see Wikipedia, Google and others take a stand on January 18th, 2012. But I also remember feeling uneasy. In this particular case, companies were lobbying for a cause I agreed with. But what if they were lobbying for a cause I didn’t agree with? Large corporations using their power to influence politics seems like a very bad idea. Isn’t it still a bad idea, even if I happen to agree with the cause?

Cloudflare quite rightly kicked The Daily Stormer off their roster of customers. Then the CEO of Cloudflare quite rightly wrote this in a company-wide memo:

Literally, I woke up in a bad mood and decided someone shouldn’t be allowed on the Internet. No one should have that power.

There’s an uncomfortable tension here. When do the ends justify the means? Isn’t the whole point of having principles that they hold true even in the direst circumstances? Why even claim that corporations shouldn’t influence politics if you’re going to make an exception for net neutrality? Why even claim that free speech is sacrosanct if you make an exception for nazi scum?

Those two examples are pretty extreme and I can easily justify the exceptions to myself. Net neutrality is too important. Stopping fascism is too important. But where do I draw the line? At what point does something become “too important”?

There are more subtle examples of corporations wielding their power. Google are constantly using their monopoly position in search and browser market share to exert influence over website-builders. In theory, that’s bad. But in practice, I find myself agreeing with specific instances. Prioritising mobile-friendly sites? Sounds good to me. Penalising intrusive ads? Again, that seems okey-dokey to me. But surely that’s not the point. So what if I happen to agree with the ends being pursued? The fact that a company the size and power of Google is using their monopoly for any influence is worrying, regardless of whether I agree with the specific instances. But I kept my mouth shut.

Now I see Google abusing their monopoly again, this time with AMP. They may call the preferential treatment of Google-hosted AMP-formatted pages a “carrot”, but let’s be honest, it’s an abuse of power, plain and simple.

By the way, I have no doubt that the engineers working on AMP have the best of intentions. We are all pursuing the same ends. We all want a faster web. But we disagree on the means. If Google search results gave preferential treatment to any fast web pages, that would be fine. But by only giving preferential treatment to pages written in a format that they created, and hosted on their own servers, they are effectively forcing everyone to use AMP. I know for a fact that there are plenty of publications who are producing AMP content, not because they are sold on the benefits of the technology, but because they feel strong-armed into doing it in order to compete.

If the ends justify the means, then it’s easy to write off Google’s abuse of power. Those well-intentioned AMP engineers honestly think that they have the best interests of the web at heart:

We were worried about the web not existing anymore due to native apps and walled gardens killing it off. We wanted to make the web competitive. We saw a sense of urgency and thus we decided to build on the extensible web to build AMP instead of waiting for standards and browsers and websites to catch up. I stand behind this process. I’m a practical guy.

There’s real hubris and audacity in thinking that one company should be able to tackle fixing the whole web. I think the AMP team are genuinely upset and hurt that people aren’t cheering them on. Perhaps they will dismiss the criticisms as outpourings of “Why wasn’t I consulted?” But that would be a mistake. The many thoughtful people who are extremely critical of AMP are on the same side as the AMP team when it comes to the end-goal of better, faster websites. But burning the web to save it? No thanks.

Ben Thompson goes into more detail on the tension between the ends and the means in The Aggregator Paradox:

The problem with Google’s actions should be obvious: the company is leveraging its monopoly in search to push the AMP format, and the company is leveraging its dominant position in browsers to punish sites with bad ads. That seems bad!

And yet, from a user perspective, the options I presented at the beginning — fast loading web pages with responsive designs that look great on mobile and the elimination of pop-up ads, ad overlays, and autoplaying videos with sounds — sounds pretty appealing!

From that perspective, there’s a moral argument to be made for wielding monopoly power like Google is doing. No doubt the AMP team feel it would be morally wrong for Google not to use its influence in search to give preferential treatment to AMP pages.

Going back to the opening examples of online blackouts, was it morally wrong for companies to use their power to influence politics? Or would it have been morally wrong for them not to have used their influence?

When do the ends justify the means?

Here’s a more subtle example than Google AMP, but one which has me just as worried for the future of the web. Mozilla announced that any new web features they add to their browser will require HTTPS.

The end-goal here is one I agree with: HTTPS everywhere. On the face of it, the means of reaching that goal seem reasonable. After all, we already require HTTPS for sensitive JavaScript APIs like geolocation or service workers. But the devil is in the details:

Effective immediately, all new features that are web-exposed are to be restricted to secure contexts. Web-exposed means that the feature is observable from a web page or server, whether through JavaScript, CSS, HTTP, media formats, etc. A feature can be anything from an extension of an existing IDL-defined object, a new CSS property, a new HTTP response header, to bigger features such as WebVR.

Emphasis mine.

This is a step too far. Again, I am in total agreement that we should be encouraging everyone to switch to HTTPS. But requiring HTTPS in order to use CSS? The ends don’t justify the means.

If there were valid security reasons for making HTTPS a requirement, I would be all for enforcing this. But these are two totally separate areas. Enforcing HTTPS by withholding CSS support is no different to enforcing AMP by withholding search placement. In some ways, I think it might actually be worse.

There’s an assumption in this decision that websites are being made by professionals who will know how to switch to HTTPS. But the web is for everyone. Not just for everyone to use. It’s for everyone to build.

One of my greatest fears for the web is that building it becomes the domain of a professional priesthood. Anything that raises the bar to writing some HTML or CSS makes me very worried. Usually it’s toolchains that make things more complex, but in this case the barrier to entry is being brought right into the browser itself.

I’m trying to imagine future Codebar evenings, helping people to make their first websites, but now having to tell them that some CSS will be off-limits until they meet the entry requirements of HTTPS …even though CSS and HTTPS have literally nothing to do with one another. (And yes, there will be an exception for localhost and I really hope there’ll be an exception for file: as well, but that’s simply postponing the disappointment.)

No doubt Mozilla (and the W3C Technical Architecture Group) believe that they are doing the right thing. Perhaps they think it would be morally wrong if browsers didn’t enforce HTTPS even for unrelated features like new CSS properties. They believe that, in this particular case, the ends justify the means.

I strongly disagree. If you also disagree, I encourage you to make your voice heard. Remember, this isn’t about whether you think that we should all switch to HTTPS—we’re all in agreement on that. This is about whether it’s okay to create collateral damage by deliberately denying people access to web features in order to further a completely separate agenda.

This isn’t about you or me. This is about all those people who could potentially become makers of the web. We should be welcoming them, not creating barriers for them to overcome.

Reliable Experiences - PWA Roadshow - YouTube

A good hands-on introduction to service workers from Mariko.


Sunday, February 25th, 2018

Cooking with Ursula K. Le Guin

Recipes inspired by The Left Hand Of Darkness.

I mostly stuck to Le Guin’s world-building rules for Winter, which were “no large meat-animals … and no mammalian products, milk, butter or cheese; the only high-protein, high-carbohydrate foods are the various kinds of eggs, fish, nuts and Hainish grains.” I did, however, add some hot-climate items found in Manhattan’s Chinatown for their space-age looks and good flavors (dragonfruit, pomelo, galangal, chilis, and kaffir limes).

Serve with hot beer.

Sunday, February 11th, 2018

Seva Zaikov - Single Page Application Is Not a Silver Bullet

Harsh (but fair) assessment of the performance costs of doing everything on the client side.

Friday, February 9th, 2018

Href Tools - Free online web tools

Handy web-based tools—compress HTML, CSS, and JavaScript, and convert files from one format to another.

Everything Easy is Hard Again – Frank Chimero

I wonder if I have twenty years of experience making websites, or if it is really five years of experience, repeated four times.

I saw Frank give this talk at Mirror Conf last year and it resonated with me so so much. I’ve been looking forward to him publishing the transcript ever since. If you’re anything like me, this will read as though it’s coming from directly inside your head.

In one way, it is easier to be inexperienced: you don’t have to learn what is no longer relevant. Experience, on the other hand, creates two distinct struggles: the first is to identify and unlearn what is no longer necessary (that’s work, too). The second is to remain open-minded, patient, and willing to engage with what’s new, even if it resembles a new take on something you decided against a long time ago.

I could just keep quoting the whole thing, because it’s all brilliant, but I’ll stop with one more bit about the increasing complexity of build processes and the decreasing availability of a simple view source:

Illegibility comes from complexity without clarity. I believe that the legibility of the source is one of the most important properties of the web. It’s the main thing that keeps the door open to independent, unmediated contributions to the network. If you can write markup, you don’t need Medium or Twitter or Instagram (though they’re nice to have). And the best way to help someone write markup is to make sure they can read markup.

Thursday, February 8th, 2018

Progressive enhancement and the things that are here to stay, with Jeremy Keith | Fixate

I enjoyed chatting to Larry Botha on the Fixate On Code podcast—I hope you’ll enjoy hearing it.

Available for your huffduffing pleasure.

Wednesday, February 7th, 2018

Resilience: Building a Robust Web That Lasts by Jeremy Keith—An Event Apart video on Vimeo

This is the rarely-seen hour-long version of my Resilience talk. It’s the director’s cut, if you will, featuring an Arthur C. Clarke sub-plot that goes from the telegraph to the World Wide Web to the space elevator.


Wednesday, January 31st, 2018

Stimulus 1.0: A modest JavaScript framework for the HTML you already have

All our applications have server-side rendered HTML at their core, then add sprinkles of JavaScript to make them sparkle.

Yup!—I’m definitely liking the sound of this Stimulus JavaScript framework.

It’s designed to read as a progressive enhancement when you look at the HTML it’s addressing.
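To give a flavour of what that looks like (adapted from the sort of “hello” example in the Stimulus handbook—treat the details as approximate): the behaviour is declared right there in the markup.

// <div data-controller="hello">
//   <input data-target="hello.name" type="text">
//   <button data-action="click->hello#greet">Greet</button>
// </div>
import { Application, Controller } from "stimulus";

class HelloController extends Controller {
  static get targets() {
    return ["name"];
  }
  greet() {
    console.log(`Hello, ${this.nameTarget.value}!`);
  }
}

const application = Application.start();
application.register("hello", HelloController);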

Monday, January 29th, 2018

GDPR and Google Analytics

Enforcement of the European Union’s General Data Protection Regulation is coming very, very soon. Look busy. This regulation is not limited to companies based in the EU—it applies to any service anywhere in the world that can be used by citizens of the EU.

It’s less about data protection and more like a user’s bill of rights. That’s good. Cennydd has written a techie’s rough guide to GDPR.

The Open Data Institute’s Jeni Tennison wrote down her thoughts on how it could change data portability in particular. While she welcomes GDPR, she has some misgivings.

Blaine—who really needs to get a blog—shared his concerns in the form of the online equivalent of interpretive dance …a Twitter thread (it’s called a thread because it inevitably gets all tangled, and it’s easy to break).

The interesting thing about the so-called “cookie law” is that it makes no mention of cookies whatsoever. It doesn’t list any specific technology. Instead it states that any means of tracking or identifying users across websites requires disclosure. So if you’re setting a cookie just to manage state—so that users can log in, or keep items in a shopping basket—the legislation doesn’t apply. But as soon as your site allows a third-party to set a cookie, it’s banner time.

Google Analytics is a classic example of a third-party service that uses cookies to track people across domains. That’s pretty much why it exists. We, as site owners, get to use this incredibly powerful tool, and all we have to do in return is add one little snippet of JavaScript to our pages. In doing so, we’re allowing a third party to read or write a cookie from their domain.

Before Google Analytics, Google—the search engine business—was able to identify and track what users were searching for, and which search results they clicked on. But as soon as the user left google.com, the trail went cold. By creating an enormously useful analytics product that only required site owners to add a single line of JavaScript, Google—the online advertising business—gained the ability to keep track of users across most of the web, whether they were on a site owned by Google or not.

Under the old “cookie law”, using a third-party cookie-setting service like that meant you had to inform any of your users who were citizens of the EU. With GDPR, that changes. Now you have to get consent. A dismissible little overlay isn’t going to cut it any more. Implied consent isn’t enough.

Now this situation raises an interesting question. Who’s responsible for getting consent? Is it the site owner or the third party whose script is the conduit for the tracking?

In the first scenario, you’d need to wait for an explicit agreement from a visitor to your site before triggering the Google Analytics functionality. Suddenly it’s not as simple as adding a single line of JavaScript to your site.
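Something along these lines—a sketch with a hypothetical consent button; the script URL is Google’s standard analytics endpoint:

document.querySelector('#consent-button').addEventListener('click', () => {
  // Only inject the tracking script once the user has explicitly opted in.
  const script = document.createElement('script');
  script.src = 'https://www.google-analytics.com/analytics.js';
  document.head.appendChild(script);
});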

In the second scenario, you don’t do anything differently than before—you just add that single line of JavaScript. But now that script would need to launch the interface for getting consent before doing any tracking. Google Analytics would go from being something invisible to something that directly impacts the user experience of your site.

I’m just using Google Analytics as an example here because it’s so widespread. This also applies to third-party sharing buttons—Twitter, Facebook, etc.—and of course, advertising.

In the case of advertising, it gets even thornier because quite often, the site owner has no idea which third party is about to do the tracking. Many, many sites use intermediary services (y’know, ‘cause bloated ad scripts aren’t slowing down sites enough so let’s throw some just-in-time bidding into the mix too). You could get consent for the intermediary service, but not for the final advert—neither you nor your site’s user would have any idea what they were consenting to.

Interesting times. One way or another, a massive amount of the web—every website using Google Analytics, embedded YouTube videos, Facebook comments, embedded tweets, or third-party advertisements—will be liable under GDPR.

It’s almost as if the ubiquitous surveillance of people’s every move on the web wasn’t a very good idea in the first place.

Friday, January 26th, 2018

HTML templating with vanilla JavaScript ES2015 Template Literals – Ben Frain

Ben makes the very good point that template literals allow you to do a lot of useful stuff that previously would’ve required a library:

Template Literals afford a lot of power with no library overhead. I will definitely continue to use them when complexity of handlebars or similar is overkill.
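By way of illustration (the names here are mine, not Ben’s), an HTML template becomes a plain function:

const templateItem = item => `
  <li class="item">
    <a href="${item.url}">${item.title}</a>
  </li>`;

const list = items => `<ul>${items.map(templateItem).join('')}</ul>`;

document.querySelector('#results').innerHTML = list([
  {url: '/one', title: 'One'},
  {url: '/two', title: 'Two'}
]);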

Chris made a similar observation a little while back. Throw in a little script like lit-html and now you’ve got DOM-diffing too. You might not need insert-current-framework-name after all.

Kinda cool that these mini-libraries exist that do useful things for us, so when situations arise that we want a feature that a big library has, but don’t want to use the whole big library, we got smaller options.

Wednesday, January 24th, 2018

React, Redux and JavaScript Architecture

I still haven’t used React (I know, I know) but this looks like a nice explanation of React and Redux.