
The Rational Optimist

As part of my ongoing obsession with figuring out how we evaluate technology, I finally got around to reading Matt Ridley’s The Rational Optimist. It was an exasperating read.

On the one hand, it’s a history of the progress of human civilisation. Like Steven Pinker’s The Better Angels Of Our Nature, it piles on the data demonstrating the upward trend in peace, wealth, and health. I know that’s counterintuitive, and it seems to fly in the face of what we read in the news every day. Mind you, The New York Times took some time out recently to acknowledge the trend.

Ridley’s thesis—and it’s a compelling one—is that cooperation and trade are the drivers of progress. As I read through his historical accounts of the benefits of open borders and the cautionary tales of small-minded insular empires that collapsed, I remember thinking, “Boy, he must be pretty upset about Brexit—his own country choosing to turn its back on trade agreements with its neighbours so that it could become a small, petty island chasing the phantom of self-sufficiency”. (Self-sufficiency, or subsistence living, as Ridley rightly argues throughout the book, correlates directly with poverty.)

But throughout these accounts, there are constant needling asides pointing to the perceived enemies of trade and progress: bureaucrats and governments, with their pesky taxes and rule of law. As the accounts enter the twentieth century, the gloves come off completely, revealing a pair of dyed-in-the-wool libertarian fists that Ridley uses to pummel any nuance or balance. “Ah,” I thought, “if he cares more about the perceived evils of regulation than the proven benefits of trade, maybe he might actually think Brexit is a good idea after all.”

It was an interesting moment. Given the conflicting arguments in his book, I could imagine him equally well being an impassioned remainer as a vocal leaver. I decided to collapse this probability wave with a quick Google search, and sure enough …he’s strongly in favour of Brexit.

In theory, an author’s political views shouldn’t make any difference to a book about technology and progress. In practice, they barge into the narrative like boorish gatecrashers threatening to derail it entirely. The irony is that while Ridley is trying to make the case for rational optimism, his own personal political feelings are interspersed like a dusting of irrationality, undoing his own well-researched case.

It’s not just the argument that suffers. Those are the moments when the writing starts to get frothy, if not downright unhinged. There were a number of confusing and ugly sentences that pulled me out of the narrative and made me wonder where the editor was that day.

The last time I remember reading passages of such poor writing in a non-fiction book was Nassim Nicholas Taleb’s The Black Swan. In the foreword, Taleb provides a textbook example of the Dunning-Kruger effect by proudly boasting that he does not need an editor.

But there was another reason why I thought of The Black Swan while reading The Rational Optimist.

While Ridley’s anti-government feelings might have damaged his claim to rationality, surely his optimism is unassailable? Take, for example, his conclusions on climate change. He doesn’t (quite) deny that climate change is real, but argues persuasively that it won’t be so bad. After all, just look at the history of false pessimism that litters the twentieth century: acid rain, overpopulation, the Y2K bug. Those turned out okay, therefore climate change will be the same.

It’s here that Ridley succumbs to the trap that Taleb wrote about in his book: using past events to make predictions about inherently unpredictable future events. Taleb was talking about economics—warning of the pitfalls of treating economic data as though it followed a bell curve, when in fact it’s a power-law distribution.

Fine. That’s simply a logical fallacy, easily overlooked. But where Ridley really lets himself down is in the subsequent defence of fossil fuels. Or rather, in his attack on other sources of energy.

When recounting the mistakes of the naysayers of old, he points out that their fundamental mistake is to assume stasis. Hence their dire predictions of war, poverty, and famine. Ehrlich’s overpopulation scare, for example, didn’t account for the world-changing work of Borlaug’s green revolution (and Ridley rightly singles out Norman Borlaug for praise—possibly the single most important human being in history).

Yet when it comes to alternative sources of energy, Ridley treats them as though they are set in stone, incapable of change. Wind and solar power are dismissed as too costly and inefficient. The Rational Optimist was published in 2010. Six years ago, solar energy must have indeed looked like a costly investment. But things have changed in the meantime.

As Matt Ridley himself writes:

It is a common trick to forecast the future on the assumption of no technological change, and find it dire. This is not wrong. The future would indeed be dire if invention and discovery ceased.

And yet he fails to apply this thinking when comparing energy sources. If anything, his defence of fossil fuels feels grounded in a sense of resigned acceptance; a sense of …pessimism.

Matt Ridley rejects any hope of innovation from new ideas in the arena of energy production. I hope that he might take his own words to heart:

By far the most dangerous, and indeed unsustainable thing the human race could do to itself would be to turn off the innovation tap. Not inventing, and not adopting new ideas, can itself be both dangerous and immoral.

Class teacher

ES6 introduced a whole bunch of new features to JavaScript. One of those features is the class keyword. This introduction has been accompanied by a fair amount of concern and criticism.

Here’s the issue: classes in JavaScript aren’t quite the same as classes in other programming languages. In fact, technically, JavaScript doesn’t really have classes at all. But some say that technically isn’t important. If it looks like a duck, and quacks like a duck, shouldn’t we call it a duck even if technically it’s a somewhat similar—but not quite the same—species of waterfowl?

The argument for doing this is that classes are so familiar from other programming languages that having some way of using classes in JavaScript—even if it isn’t technically the same—brings a lot of benefit for people moving over from those languages.

But that comes with a side-effect. Anyone learning about classes in JavaScript will basically be told “here’s how classes work …but don’t look too closely.”

Now if you believe that outcomes matter more than understanding, then this is a perfectly acceptable trade-off. After all, we use computers every day without needing to understand the inner workings of every single piece of code under the hood.

It doesn’t sit well with me, though. I think that understanding how something works is important (in most cases). That’s why I favour learning underlying technologies first—HTML, CSS, JavaScript—before reaching for abstractions like frameworks and libraries. If you understand the way things work first, then your choice of framework, library, or any other abstraction is an informed choice.

The most common way that people refer to the new class syntax in JavaScript is to describe it as syntactic sugar. In other words, it doesn’t fundamentally introduce anything new under the hood, but it gives you a shorter, cleaner, nicer way of dealing with objects. It’s an abstraction. But because it’s an abstraction taken from other programming languages that work differently to JavaScript, it’s a bit of a fudge. It’s a little white lie. The class keyword in JavaScript will work just fine as long as you don’t try to understand it.
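To make the sugar concrete, here’s a rough sketch (with made-up names; this isn’t anyone’s real code) of what the class syntax roughly boils down to. Underneath, it’s still prototypes:

```javascript
// A minimal sketch with illustrative names: the class syntax…
class Duck {
  constructor(name) {
    this.name = name;
  }
  quack() {
    return this.name + ' says quack';
  }
}

// …is roughly shorthand for this prototype-based version.
function ProtoDuck(name) {
  this.name = name;
}
ProtoDuck.prototype.quack = function () {
  return this.name + ' says quack';
};

// Either way, the method lives on a shared prototype object,
// not on a copied class "blueprint".
console.log(Object.getPrototypeOf(new Duck('Donald')) === Duck.prototype); // true
```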

My personal opinion is that this isn’t healthy.

I’ve come across two fantastic orators who cemented this view in my mind. At Render Conf in Oxford earlier this year, I had the great pleasure of hearing Ashley Williams talk about the challenges of teaching JavaScript. Skip to the 15 minute mark to hear her introduce the issues thrown up by classes in JavaScript.

More recently, the mighty Kyle Simpson was on an episode of the JavaScript Jabber podcast. Skip to the 17 minute mark to hear him talk about classes in JavaScript.

(Full disclosure: Kyle also said some very kind things about some of my blog posts at the end of that episode, but you can switch it off before it gets to that bit.)

Both Ashley and Kyle bring a much-needed perspective to the discussion of language design. That perspective is the perspective of a teacher.

In his essay on W3C’s design principles, Bert Bos lists learnability among the fundamental driving forces (closely tied to readability). Learnability and teachability are two sides of the same coin, and I find it valuable to examine any language decisions through that lens. With that in mind, introducing a new feature into a language that comes with such low teachability value as to warrant a teacher actively telling a student not to learn how things really work …well, that just doesn’t seem right.

The Progressive Web App Dev Summit

I was in Amsterdam again at the start of last week for the Progressive Web App Dev Summit, organised by Google. Most of the talks were given by Google employees, but not all—this wasn’t just a European version of Google I/O. Representatives from Opera, Mozilla, Samsung, and Microsoft were also there, and there were quite a few case studies from independent companies. That was very gratifying to see.

Almost all the talks were related to progressive web apps. I say, “almost all” because there were occasional outliers. There was a talk on web components, which don’t have anything directly to do with progressive web apps (and I hope there won’t be any attempts to suggest otherwise), and another on rendering performance that had good advice for anyone building any kind of website. Most of the talks were about the building blocks of progressive web apps: HTTPS, Service Workers, push notifications, and all that jazz.

I was very pleased to see that there was a move away from the suggestion that single-page apps with the app-shell architecture were the only way of building progressive web apps.

There were lots of great examples of progressively enhancing existing sites into progressive web apps. Jeff Posnick’s talk was a step-by-step walkthrough of doing exactly that. Reading through the agenda, I was really happy to see this message repeated again and again:

In this session we’ll take an online-only site and turn it into a fully network-resilient, offline-first installable progressive web app. We’ll also break out of the app shell and look at approaches that better-suit traditional server-driven sites.

Progressive Web Apps should work everywhere for every user. But what happens when the technology and APIs are not available in your users’ browsers? In this talk we will show you how you can think about and build sites that work everywhere.

Progressive Web Apps should load fast, work great offline, and progressively enhance to a better experience in modern browsers.

How do you put the “progressive” into your current web app?

You can (and should!) build for the latest and greatest browsers, but through a collection of fallbacks and progressive enhancements you can bring a lot of tomorrow’s web to yesterday’s browsers.

I think this is a really smart move. It’s a lot easier to sell people on incremental changes than it is to convince them to rip everything out and start from scratch (another reason why I’m dubious about any association between web components and progressive web apps—but I’ll save that for another post).

The other angle that I really liked was the emphasis on emerging markets, not just wealthy westerners. Tal Oppenheimer’s talk Building for Billions was superb, and Alex kicked the whole thing off with some great facts and figures on mobile usage.

In my mind, these two threads are very much related. Progressive enhancement allows us to have our progressive web app cake and eat it too: we can make websites that can be accessed on devices with limited storage and slow networks, while at the same time ensuring those same sites take advantage of all the newest features in the latest and greatest browsers. I talked to a lot of Google devs about ways to measure the quality of a progressive web app, and I’m coming to the conclusion that a truly high-quality site is one that can still be accessed by a proxy browser like Opera Mini, while providing a turbo-charged experience in the latest version of Chrome. If you think that sounds naive or unrealistic, then I think you might want to dive deeper into all the technologies that make progressive web apps so powerful—responsive design, Service Workers, a manifest file, HTTPS, push notifications; all of those features can and should be used in a layered fashion.
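To give a rough sense of what that layering can look like in code (the filename here is just an assumption for illustration), each enhancement gets feature-detected before it’s used, so a proxy browser that supports none of it still gets the core HTML experience:

```javascript
// Illustrative sketch: every enhancement is optional and feature-detected.
// Browsers without these APIs (like Opera Mini) simply skip them and
// carry on serving the core experience.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/serviceworker.js'); // assumed filename
}

if ('Notification' in window && 'PushManager' in window) {
  // Only offer push notifications where the APIs actually exist,
  // and ideally only after the user has asked for them.
}
```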

Speaking of Opera, Andreas kind of stole the show, demoing the latest interface experiments in Opera Mobile.

That ambient badging that Alex was talking about? Opera is doing it. The importance of being able to access URLs that I’ve been ranting about? Opera is doing it.

Then we had the idea to somehow connect it to the “pull-to-refresh” spinner, as a secondary gesture to the left or right.

Nice! I’m looking forward to seeing what other browsers come up with. It’s genuinely exciting to see all these different browser makers in complete agreement on which standards they want to support, while at the same time differentiating their products by competing on user experience. Microsoft recently announced that progressive web apps will be indexed in their app store just like native apps—a really interesting move.

The Progressive Web App Dev Summit wrapped up with a closing panel that I had the honour of hosting. I thought it was very brave of Paul to ask me to host this, considering my strident criticism of Google’s missteps.

Initially there were going to be six people on the panel. Then it became eight. Then I blinked and it suddenly became twelve. Less of a panel, more of a jury. Half the panelists were from Google and the other half were from Opera, Microsoft, Mozilla, and Samsung. Some of those representatives were a bit too media-trained for my liking: Ali from Microsoft tried to just give a spiel, and Alex Komoroske from Google wouldn’t give me a straight answer about whether he wants Android Instant apps to succeed—Jake was a bit more honest. I should have channelled my inner Paxman a bit more.

Needless to say, nobody from Apple was at the event. No surprise there. They’ve already promised to come to the next event. There won’t be an Apple representative on stage, obviously—that would be asking too much, wouldn’t it? But at least it looks like they’re finally making an effort to engage with the wider developer community.

All in all, the Progressive Web App Dev Summit was good fun. I found the event quite inspiring, although the sausage festiness of the attendees was depressing. It would be good if the marketing for these events reached a wider audience—I met a lot of developers who only found out about it a week or two before the event.

I really hope that people will come away with the message that they can get started with progressive web apps right now without having to re-architect their whole site. Right now the barrier to entry is having your site running on HTTPS. Once you’ve got that up and running, it’s pretty much a no-brainer to add a manifest file and a basic Service Worker—to boost performance if nothing else. From there, you’re in a great position to incrementally add more and more features—an offline-first approach with your Service Worker, perhaps? Or maybe start dabbling in push notifications. The great thing about all of these technologies (with the glaring exception of web components in their current state) is that you don’t need to bet the farm on any of them. Try them out. Use them as enhancements. You’ve literally got nothing to lose …and your users have everything to gain.
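If you’re wondering what such a basic Service Worker might look like, here’s a minimal, hedged sketch; the cache name and file paths are assumptions for illustration, not anyone’s real setup:

```javascript
// serviceworker.js: a deliberately basic sketch (assumed cache name and paths).
const cacheName = 'static-v1';
const staticAssets = ['/', '/styles.css', '/scripts.js', '/offline.html'];

// On install, pre-cache a handful of static files.
addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(cacheName).then(function (cache) {
      return cache.addAll(staticAssets);
    })
  );
});

// On fetch, try the cache first, fall back to the network,
// and show a simple offline page if both fail.
addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request)
      .then(function (response) {
        return response || fetch(event.request);
      })
      .catch(function () {
        return caches.match('/offline.html');
      })
  );
});
```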

Thank you, jQuery

I turned Huffduffer into a progressive web app recently. It was already running on HTTPS so I didn’t have much to do. One manifest file and one basic Service Worker did the trick.

Getting the 'add to home screen' prompt for https://huffduffer.com/ on Android Chrome.

I also did a bit of spring cleaning, refactoring some CSS. The site dates to 2008 so there’s plenty in there that I would do very differently today. Still, considering the age of the code, I wasn’t cursing my past self too much.

After that, I decided to refactor the JavaScript too. There, I had a clear goal: could I remove the dependency on jQuery?

It turned out to be pretty straightforward. I was able to bring my total JavaScript file size down to 3K (gzipped). Pretty much everything I was doing in jQuery could be just as easily accomplished with DOM methods like addEventListener and querySelectorAll, and objects like XMLHttpRequest and classList.
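By way of illustration (these are generic examples with made-up selectors and URLs, not Huffduffer’s actual code), the kind of translation involved looks something like this:

```javascript
// jQuery: $('.play-button').on('click', handler);
var buttons = document.querySelectorAll('.play-button');
Array.prototype.forEach.call(buttons, function (button) {
  button.addEventListener('click', function () {
    console.log('clicked');
  });
});

// jQuery: $('body').addClass('js');
document.body.classList.add('js');

// jQuery: $.get('/endpoint', callback);
var xhr = new XMLHttpRequest();
xhr.open('GET', '/endpoint');
xhr.addEventListener('load', function () {
  console.log(xhr.responseText);
});
xhr.send();
```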

Of course, the reason why half of those handy helpers exist is because of jQuery. Certainly in the case of querySelector and querySelectorAll, jQuery blazed a trail for browsers and standards bodies to pave. In some cases, like animation, the jQuery-led solutions ended up in CSS instead of JavaScript, but the story was the same: developers used the heck out of jQuery and browser makers paid attention to that. This is something that Jack spoke about at Render Conf a little while back.

Brian once said of PhoneGap that its ultimate purpose is to cease to exist. I think of jQuery in a similar way.

jQuery turned ten years old this year, and jQuery version 3.0 was just released. Congratulations, jQuery! You have served the web well.

Twilight

I went out to The Albert the other night to see Twilight Hotel play. They were really good, so after the show I bought their CD, When The Wolves Go Blind.

It was only when I got home that I realised that I had no device that could play Compact Discs. I play all my music on my iPod or iPhone connected to a speaker dock. And my computer is a MacBook Air …no disc drive. So I had to bring the CD into work with me, stick it into my iMac and rip the songs from there.

It’s funny how format (or storage medium) obsolescence creeps up on you like that. I wonder how long it will be until I’m not using any kind of magnetic medium at all.

Righting copywrongs

This week, when I’m not battling the zombies of the linkrot apocalypse with a squirrel, I’m preparing my presentation for Bamboo Juice. I wasted far too much time this morning watching the ancillary material from the BBC’s The Speaker in the vain hope that it might help my upcoming public speaking engagement.

My talk is going to be a long zoom presentation along the lines of Open Data and The Long Web. I should concentrate on technologies, standards and file formats but I find myself inevitably being drawn into the issue of copyright and the current ludicrous state of things.

If you feel like getting as riled up as I am, be sure to listen to James Boyle as he speaks at the RSA or is interviewed on CBC. Or you could just cut to the chase and read his book, The Public Domain. If you want to try before you buy, you can read the entire book online in PDF or HTML format—I recommend reading that version with the help of the fantastic .

As if any proof were needed that this is an important, current, relevant issue, Tom reminds me that the future of our culture is under threat again tomorrow. I have duly written to some of my MEPs. Fortunately, I have a most excellent representative:

We’re talking about a gigantic windfall for a few multinational companies, taking millions of pounds from the pockets of consumers and giving it to the record labels. Also, the artistic cost of making songs from the last 50 years public property, thus allowing endless sampling by DJs and other artists, must be taken into consideration.

The UK Greens are committed to a system known as Creative Commons, which offers a flexible range of protections and freedoms for authors and artists. We want to encourage innovation and prevent large corporations from controlling and benefitting from our cultural legacy.