Posting to my site

I was idly thinking about the different ways I can post to adactio.com. I decided to count the ways.

Admin interface

This is the classic CMS approach. In my case the CMS is a crufty hand-rolled affair using PHP and MySQL that I wrote years ago. I log in to an admin interface and fill in a form, putting the text of my posts into a textarea. In truth, I usually write in a desktop text editor first, and then paste that into the textarea. That’s what I’m doing now—copying and pasting Markdown from the Typed app.

Directly from my site

If I’m logged in, I get a stripped down posting interface in the notes section of my site.

Notes posting interface

Bookmarklet

This is how I post links. When I’m at a URL I want to bookmark, I hit the “Bookmark it” bookmarklet in my browser’s bookmarks bar. That pops open a version of the admin interface tailored specifically for links. I really, really like bookmarklets. The one big downside is that they don’t work on mobile.

Text message

This is something I knocked together at Indie Web Camp Brighton 2015 using the Twilio API. It’s handy for posting notes if I’m travelling somewhere and data is at a premium. But I don’t use it that often.

Instagram

Thanks to Aaron’s OwnYourGram service—and the fact that my site has a micropub endpoint—I can post images from Instagram to my site. This used to happen instantaneously but Instagram changed their API rules for the worse. Between that and their shitty “algorithmic” timeline, I find myself using the service less and less. At this point I’m only on there for the doggos.

Swarm

Like OwnYourGram, Aaron’s OwnYourSwarm allows me to post check-ins and photos from the Swarm app to my site. Again, micropub makes it all possible.

OwnYourGram and OwnYourSwarm are very similar and could probably be abstracted into a generic service for posting from third-party apps to micropub endpoints. I’d quite like to post my check-ins on Untappd to my site.
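
Under the hood, a micropub post is nothing fancy: an authenticated HTTP POST of form-encoded values to an endpoint. Here’s a minimal sketch in JavaScript (the endpoint URL, token, and content are all placeholders):

    // A minimal micropub request: a form-encoded POST with a bearer
    // token. The endpoint URL, token, and content are placeholders.
    const token = 'XXXX'; // an IndieAuth access token
    fetch('https://example.com/micropub', {
      method: 'POST',
      headers: { 'Authorization': 'Bearer ' + token },
      body: new URLSearchParams({
        h: 'entry',
        content: 'Checked in with a nice pint.'
      })
    });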

Other people’s admin interfaces

Thanks to rel="me" and IndieAuth, I can log into other people’s posting interfaces using my own website as the log-in, and post to my micropub endpoint. Quill is a good example of this. I don’t use it that much, but I really should—the editor interface is quite Medium-like in its design.

Anyway, those are the different ways I can update my website that I can think of right now.

Syndication

In terms of output, I’ve got a few different ways of syndicating what I post here to places like Twitter, Facebook, and Instagram.

Just so you know, if you comment on one of my posts on Facebook, I probably won’t see it. But if you reply to a copy of one of my posts on Twitter or Instagram, it will show up over here on adactio.com thanks to the magic of Brid.gy and webmention.

Month maps

One of the topics I enjoy discussing at Indie Web Camps is how we can use design to display activity over time on personal websites. That’s how I ended up with sparklines on my site—it was a direct result of a discussion at Indie Web Camp Nuremberg a year ago:

During the discussion at Indie Web Camp, we started looking at how silos design their profile pages to see what we could learn from them. Looking at my Twitter profile, my Instagram profile, my Untappd profile, or just about any other profile, it’s a mixture of bio and stream, with the addition of stats showing activity on the site—signs of life.

Perhaps the most interesting visual example of my activity over time is on my GitHub profile. Halfway down the page there’s a calendar heatmap that uses colour to indicate the amount of activity. What I find interesting is that it’s using two axes of time over a year: weeks of the year across the X axis and days of the week down the Y axis.

I wanted to try something similar, but showing activity by time of day down the Y axis. A month of activity feels like the right range to display, so I set about adding a calendar heatmap to monthly archives. I already had the data I needed—timestamps of posts. That’s what I was already using to display sparklines. I wrote some code to loop over those timestamps and organise them by day and by hour. Then I spit out a table with days for the columns and clumps of hours for the rows.
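
The grouping logic boils down to just a few lines. Here’s a sketch of the idea in JavaScript (my actual code lives in the PHP of my CMS, and the four-hour clump size is just for illustration):

    // Group post timestamps into a day-by-hour-clump grid for the
    // heatmap. Takes an array of Unix timestamps (in seconds); the
    // four-hour clump size is an assumption for illustration.
    function heatmapGrid(timestamps) {
      const grid = {};
      timestamps.forEach(function (timestamp) {
        const posted = new Date(timestamp * 1000);
        const day = posted.getDate();                    // columns: day of the month
        const clump = Math.floor(posted.getHours() / 4); // rows: clump of hours
        const key = clump + ':' + day;
        grid[key] = (grid[key] || 0) + 1;                // count of posts per cell
      });
      return grid;
    }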

Calendar heatmap on Dribbble

I’m using colour (well, different shades of grey) to indicate the relative amounts of activity, but I decided to use size as well. So it’s also a bubble chart.

It doesn’t work very elegantly on small screens: the table is clipped horizontally and can be swiped left and right. Ideally the visualisation itself would change to accommodate smaller screens.

Still, I kind of like the end result. Here’s last month’s activity on my site. Here’s the same time period ten years ago. I’ve also added month heatmaps to the monthly archives for my journal, links, and notes. They’re kind of like an expanded view of the sparklines that are shown with each month.

From one year ago, here’s the daily distribution of my journal posts, my links, and my notes.

And then here’s the daily distribution of everything in that month all together.

I realise that the data being displayed is probably only of interest to me, but then, that’s one of the perks of having your own website—you can do whatever you feel like.

Writing on the web

Some people have been putting Paul’s crazy idea into practice.

  • Mike revived his site a while back and he’s been posting gold dust ever since. I enjoy his no-holds-barred perspective on his time in San Francisco.
  • Garrett’s writing goes all the way back to 2005. The cumulative result is two fascinating interweaving narratives—one about his health, another about his business.
  • Charlotte has been documenting her move from Brighton to Sydney. Much as I love her articles about front-end development, I’m liking the slice-of-life updates on life down under even more.
  • Amber has a great way with words. As well as regularly writing on her blog, she’s two-thirds of the way through writing 100 words every day for 100 days.
  • Ethan has been writing about responsive design—of course—but it’s his more personal posts that make me really grateful for his site.
  • Jeffrey and Eric never stopped writing on their own sites. Sure, there’s good stuff on their sites about web design and development, but it’s the writing about their non-web lives that’s so powerful.

There are more people I could mention …but, to be honest, not that many more. Seems like most people are happy to only publish on Ev’s blog or not at all.

I know not everybody wants to write on the web, and that’s fine. But it makes me sad when people choose not to publish their thoughts because they think no-one will be interested, or that it’s all been said before. I understand where those worries come from, but I believe—no, I know—that they are unfounded.

It’s a world wide web out there. There’s plenty of room for everyone. And I, for one, love reading the words of others.

Progressive Web App questions

I got a nice email recently from Colin van Eenige. He wrote:

For my graduation project I’m researching the development of Progressive Web Apps and found your offline book called resilient web design. I was very impressed by the implementation of the website and it really was a nice experience.

I’m very interested in your vision on progressive web apps and what capabilities are waiting for us regarding offline content. Would it be fine if I’d send you some questions?

I said that would be fine, although I couldn’t promise a swift response. He sent me four questions. I finally got ‘round to sending my answers…

1. https://resilientwebdesign.com/ is an offline web book (progressive web app). What was the primary reason to make it available like this (besides the other formats)?

Well, given the subject matter, it felt right that the canonical version of the book should be not just online, but made with the building blocks of the web. The other formats are all nice to have, but the HTML version feels (to me) like the “real” book.

Interestingly, it wasn’t too much trouble for people to generate other formats from the HTML (ePub, MOBI, PDF), whereas I think trying to go in the other direction would be trickier.

As for the offline part, that felt like a natural fit. I had already done that with a previous book of mine, HTML5 For Web Designers, which I put online a year or two after its print publication. In that case, I used AppCache for the offline functionality. AppCache is horrible, but this use case might be one of the few where it works well: a static book that’s never going to change. Cache invalidation is one of the worst parts of using AppCache so by not having any kinds of updates at all, I dodged that bullet.

But when it came time for Resilient Web Design, a service worker was definitely the right technology. Still, I’ve got AppCache in there as well for the browsers that don’t yet support service workers.

2. What effect do you think Progressive Web Apps will have on content consumption, and do you think they will take over the purpose of some native apps?

The biggest effect that service workers could have is to change the expectations that people have about using the web, especially on mobile devices. Right now, people associate the web on mobile with long waits and horrible spammy overlays. Service workers can help solve that first part.

If people then start adding sites to their home screen, that will be a great sign that the web is really holding its own. But I don’t think we should get too optimistic about that: for a user, there’s no difference between a prompt on their screen saying “add to home screen” and a prompt on their screen saying “download our app”—they’re equally likely to be dismissed because we’ve trained people to dismiss anything that covers up the content they actually came for.

It’s entirely possible that websites could start taking over much of the functionality that previously was only possible in a native app. But I think that inertia and habit will keep people using native apps for quite some time.

The big exception is in markets where storage space on devices is in short supply. That’s where the decision to install a native app isn’t taken lightly (given the choice between your family photos and an app, most people will reject the app). The web can truly shine here if we build lightweight, performant services.

Even in that situation, I’m still not sure how many people will end up adding those sites to their home screen (it might feel so similar to installing a native app that there may be some residual worry about storage space) but I don’t think that’s too much of a problem: if people get to a site via search or typing, that’s fine.

I worry that the messaging around “progressive web apps” is perhaps over-fetishising the home screen. I don’t think that’s the real battleground. The real battleground is in people’s heads: how they perceive the web and how they perceive native.

After all, if the average number of native apps installed in a month is zero, then that’s not exactly a hard target to match. :-)

3. What is your vision regarding Progressive Web Apps?

For me, progressive web apps don’t feel like a separate thing from making websites. I worry that the marketing of them might inflate expectations or confuse people. I like the idea that they’re simply websites that have taken their vitamins.

So my vision for progressive web apps is the same as my vision for the web: something that people use every day for all sorts of tasks.

I find it really discouraging that progressive web apps are becoming conflated with single page apps and the app shell model. Those architectural decisions have nothing to do with service workers, HTTPS, and manifest files. Yet I keep seeing the concepts used interchangeably. It would be a real shame if people chose not to use these great technologies just because they don’t classify what they’re building as an “app.”

If anything, it’s good ol’ fashioned content sites (newspapers, Wikipedia, blogs, and yes, books) that can really benefit from the turbo boost of service worker+HTTPS+manifest.

I was at a conference recently where someone was giving a talk encouraging people to build progressive web apps but discouraging people from doing it for their own personal sites. That’s a horrible, elitist attitude. I worry that this attitude is being codified in the term “progressive web app”.

4. What is the biggest learning you’ve had since working on Progressive Web Apps?

Well, like I said, I think that some people are focusing a bit too much on the home screen and not enough on the benefits that service workers can provide to just about any website.

My biggest learning is that these technologies aren’t for a specific subset of services, but can benefit just about anything that’s on the web. I mean, just using a service worker to explicitly cache static assets like CSS, JS, and some images is a no-brainer for almost any project.

So there you go—I’m very excited about the capabilities of these technologies, but very worried about how they’re being “sold”. I’m particularly nervous that in the rush to emulate native apps, we end up losing the very thing that makes the web so powerful: URLs.

In AMP we trust

AMP Conf was one of those deep dive events, with two days dedicated to one single technology: AMP.

Except AMP isn’t really one technology, is it? And therein lies the confusion. This was at the heart of the panel I was on. When we talk about AMP, we could be talking about one of three things:

  1. The AMP format. A bunch of web components. For instance, instead of using an img element on an AMP page, you use an amp-img element instead.
  2. The AMP rules. There’s one JavaScript file, hosted on Google’s servers, that turns those web components from spans into working elements. No other JavaScript is allowed. All your styles must be in a style element instead of an external file, and there’s a limit on what you can do with those styles.
  3. The AMP cache. The source of most confusion—and even downright enmity—this is what’s behind the fact that when you launch an AMP result from Google search, you don’t go to another website. You see Google’s cached copy of the page instead of the original.

The first piece of AMP—the format—is kind of like a collection of marginal gains. Where the img element might have some performance issues, the amp-img element optimises for perceived performance. But if you just used the AMP web components, it wouldn’t be enough to make your site blazingly fast.

The second part of AMP—the rules—is where the speed gains start to really show. You can’t have an external style sheet, and crucially, you can’t have any third-party scripts other than the AMP script itself. This is key to making AMP pages super fast. It’s not so much about what AMP does; it’s more about what it doesn’t allow. If you never used a single AMP component, but stuck to AMP’s rules disallowing external styles and scripts, you could easily make a page that’s even faster than what AMP can do.

At AMP Conf, Natalia pointed out that The Guardian’s non-AMP pages beat out the AMP pages for performance. So why even have AMP pages? Well, that’s down to the third, most contentious, part of the AMP puzzle.

The AMP cache turns the user experience of visiting an AMP page from fast to instant. While you’re still on the search results page, Google will pre-render an AMP page in the background. Not pre-fetch, pre-render. That’s why it opens so damn fast. It’s also what causes the most confusion for end users.

From my unscientific polling, the behaviour of AMP results confuses the hell out of people. The fact that the page opens instantly isn’t the problem—far from it. It’s the fact that you don’t actually go to another page. Technically, you’re still on Google. An analogous mental model would be an RSS reader, or an email client: you don’t go to an item or an email; you view it in situ.

Well, that mental model would be fine if it were consistent. But in Google search, only some results will behave that way (the AMP pages) and others will behave just like regular links to other websites. No wonder people are confused! Some search results take them away and some search results keep them on Google …even though the page looks like a different website.

The price that we pay for the instantly-opening AMP pages from the Google cache is the URL. Because we’re looking at Google’s pre-rendered copy instead of the original URL, the address bar is not pointing to the site the browser claims to be showing. Everything in the body of the browser looks like an article from The Guardian, but if I look at the URL (which is what security people have been telling us for years is important to avoid being phished), then I’ll see a domain that is not The Guardian’s.

But wait! Couldn’t Google pre-render the page at its original URL?

Yes, they could. But they won’t.

This was a point that Paul kept coming back to: trust. There’s no way that Google can trust that someone else’s URL will play by the AMP rules (no external scripts, only loading embedded content via web components, limited styles, etc.). They can only trust the copies that they themselves are serving up from their cache.

By the way, there was a joint AMP/search panel at AMP Conf with representatives from both teams. As you can imagine, there were many questions for the search team, most of which were Glomar’d. But one thing that the search people said time and again was that Google was not hosting our AMP pages. Now I don’t know if they were trying to make some fine-grained semantic distinction there, but that’s an outright falsehood. If I click on a link, and the URL I get taken to is a Google property, then I am looking at a page hosted by Google. Yes, it might be a copy of a document that started life somewhere else, but if Google are serving something from their cache, they are hosting it.

This is one of the reasons why AMP feels like such a bait’n’switch to me. When it first came along, it felt like a direct competitor to Facebook’s Instant Articles and Apple News. But the big difference, we were told, was that you get to host your own content. That appealed to me much more than having Facebook or Apple host the articles. But now it turns out that Google do host the articles.

This will be the point at which Googlers will say no, no, no, you can totally host your own AMP pages …but you won’t get the benefits of pre-rendering. But without the pre-rendering, what’s the point of even having AMP pages?

Well, there is one non-cache reason to use AMP and it’s a political reason. Beleaguered developers working for publishers of big bloated web pages have a hard time arguing with their boss when they’re told to add another crappy JavaScript tracking script or bloated library to their pages. But when they’re making AMP pages, they can easily refuse, pointing out that the AMP rules don’t allow it. Google plays the bad cop for us, and it’s a very valuable role. Sarah pointed this out on the panel we were on, and she was spot on.

Alright, but what about The Guardian? They’ve already got fast pages, but they still have to create separate AMP pages if they want to get the pre-rendering benefits when they show up in Google search results. Sorry, says Google, but it’s the only way we can trust that the pre-rendered page will be truly fast.

So here’s the impasse we’re at. Google have provided a list of best practices for making fast web pages, but the only way they can truly verify that a page is sticking to those best practices is by hosting their own copy, URLs be damned.

This was the crux of Paul’s argument when he was on the Shop Talk Show podcast (it’s a really good episode—I was genuinely reassured to hear that Paul is not gung-ho about drinking the AMP Kool Aid; he has genuine concerns about the potential downsides for the web).

Initially, I accepted this argument that Google just can’t trust the rest of the web. But the more I talked to people at AMP Conf—and I had some really, really good discussions with people away from the stage—the more I began to question it.

Here’s the thing: the regular Google search can’t guarantee that any web page is actually 100% the right result to return for a search. Instead there’s a lot of fuzziness involved: based on the content, the markup, and the number of trusted sources linking to this, it looks like it should be a good result. In other words, Google search trusts websites to—by and large—do the right thing. Sometimes websites abuse that trust and try to game the system with sneaky tricks. Google responds with penalties when that happens.

Why can’t it be the same for AMP pages? Let me host my own AMP pages (maybe even host my own AMP script) and then when the Googlebot crawls those pages—the same as it crawls any other pages—that’s when it can verify that the AMP page is abiding by the rules. If I do something sneaky and trick Google into flagging a page as fast when it actually isn’t, then take my pre-rendering reward away from me.

To be fair, Google has very, very strict rules about what and how to pre-render the AMP results it’s caching. I can see how allowing even the potential for a false positive would have a negative impact on the user experience of Google search. But c’mon, there are already false positives in regular search results—fake news, spam blogs. Googlers are smart people. They can solve—or at least mitigate—these problems.

Google says it can’t trust our self-hosted AMP pages enough to pre-render them. But they ask for a lot of trust from us. We’re supposed to trust Google to cache and host copies of our pages. We’re supposed to trust Google to provide some mechanism to users to get at the original canonical URL. I’d like to see trust work both ways.

The many formats of Resilient Web Design

If you don’t like reading in a web browser, you might like to know that Resilient Web Design is now available in more formats.

Jiminy Panoz created a lovely EPUB version. I tried it out in Apple’s iBooks app and it looks great. I tried to submit it to the iBooks store too, but that process threw up a few too many roadblocks.

Oliver Williams has created a MOBI version. That means you can read it on a Kindle. I plugged my old Kindle into my computer, dragged that file onto its disc image, and it worked a treat.

And there are always the PDF versions: one in portrait and another in landscape format. Those were generated straight from the print styles.

Oh, and there’s the podcast. I’ve only released two chapters so far. The Christmas break and an untimely cold have slowed down the release schedule a little bit.

I’d love to make a physical, print-on-demand version of Resilient Web Design available—maybe through Lulu—but my InDesign skills are non-existent.

If you think the book should be available in any other formats, and you fancy having a crack at it, please feel free to use the source files.

Deep linking with fragmentions

When I was marking up Resilient Web Design I wanted to make sure that people could link to individual sections within a chapter. So I added IDs to all the headings. There’s no UI to expose that though—like the hover pattern that some sites use to show that something is linkable—so unless you know the IDs are there, there’s no way of getting at them other than “view source.”

But if you’re reading a passage in Resilient Web Design and you highlight some text, you’ll notice that the URL updates to include that text after a hash symbol. If that updated URL gets shared, then anyone following it should be sent straight to that string of text within the page. That’s fragmentions in action:

Fragmentions find the first matching word or phrase in a document and focus its closest surrounding element. The match is determined by the case-sensitive string following the # (or ## double-hash)

It’s a similar idea to Eric and Simon’s proposal to use CSS selectors as fragment identifiers, but using plain text instead. You can find out more about the genesis of fragmentions from Kevin. I’m using Jonathon Neal’s script with some handy updates from Matthew.
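
Stripped right down, the core of a fragmention script looks something like this (a simplified sketch, not the actual script):

    // A simplified fragmention: find the text after the hash and
    // scroll its containing element into view. The real script is
    // cleverer about matching text that spans multiple elements.
    function findFragmention() {
      const text = decodeURIComponent(window.location.hash.replace(/^#+/, ''));
      if (!text) return;
      const walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT);
      let node;
      while ((node = walker.nextNode())) {
        if (node.textContent.includes(text)) {
          node.parentElement.scrollIntoView();
          return;
        }
      }
    }
    window.addEventListener('hashchange', findFragmention);
    findFragmention();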

I’m using the fragmention support to power the index of the book. It relies on JavaScript to work though, so Matthew has come to the rescue again and created a version of the site with IDs for each item linked from the index (I must get around to merging that).

The fragmention functionality is ticking along nicely with one problem…

I’ve tweaked the typography of Resilient Web Design to within an inch of its life, including a crude but effective technique to avoid widowed words at the end of a paragraph. The last two words of every paragraph are separated by a UTF-8 no-break space character instead of a regular space.
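
Expressed in JavaScript, the technique is something like this (a sketch of the idea, not my actual implementation):

    // Swap the final space in each paragraph for a no-break space so
    // the last two words always wrap together (a sketch of the idea,
    // not my actual implementation).
    document.querySelectorAll('p').forEach(function (paragraph) {
      paragraph.innerHTML = paragraph.innerHTML.replace(/ (\S+)\s*$/, '\u00A0$1');
    });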

That solves the widowed words problem, but it confuses the fragmention script. Any selected text that includes the last two words of a paragraph fails to match. I’ve tried tweaking the script, but I’m stumped. If you fancy having a go, please have at it.

Update: And fixed! Thanks to Lee.

The typography of a web book

I’m a sucker for classic old-style serif typefaces: Caslon, Baskerville, Bembo, Garamond …I love ‘em. That’s probably why I’ve always found the typesetting in Edward Tufte’s books so appealing—he always uses a combination of Bembo for body copy and Gill Sans for headings.

Earlier this year I stumbled on a screen version of Bembo used for Tufte’s digital releases called ET Book. Best of all, it’s open source:

ET Book is a Bembo-like font for the computer designed by Dmitry Krasny, Bonnie Scranton, and Edward Tufte. It is free and open-source.

When I was styling Resilient Web Design, I knew that the choice of typeface would be one of the most important decisions I would make. Remembering that open source ET Book font, I plugged it in to see how it looked. I liked what I saw. I found it particularly appealing when it’s full black on full white at a nice big size (at lower contrast or smaller sizes, it starts to get a bit fuzzy).

I love, love, love the old-style numerals of ET Book. But I was disappointed to see that ligatures didn’t seem to be coming through (even when I had enabled them in CSS). I mentioned this to Rich and of course he couldn’t resist doing a bit of typographic sleuthing. It turns out that the ligature glyphs are there in the source files but the files needed a little tweaking to enable them. Because the files are open source, Rich was able to tweak away to his heart’s content. I then took the tweaked OpenType files and ran them through Font Squirrel to generate WOFF and WOFF2 files. I’ve put them on GitHub.

For this book, I decided that the measure would be the priority. I settled on a measure of around 55 to 60 characters—about 10 or 11 words per line. I used a max-width of 27em combined with Mike’s brilliant fluid type technique to maintain a consistent measure.

It looks great on small-screen devices and tablets. On large screens, the font size starts to get really, really big. Personally, I like that. Lots of other people like it too. But some people really don’t like it. I should probably add a font-resizing widget for those who find the font size too shocking on luxuriously large screens. In the meantime, their only recourse is to fork the CSS to make their own version of the book with more familiar font sizes.

The visceral reaction a few people have expressed to the font size reminds me of the flak Jeffrey received when he redesigned his personal site a few years back:

Many people who’ve visited this site since the redesign have commented on the big type. It’s hard to miss. After all, words are practically the only feature I haven’t removed. Some of the people say they love it. Others are undecided. Many are still processing. A few say they hate it and suggest I’ve lost my mind.

I wonder how the people who complained then are feeling now, a few years on, in a world with Medium in it? Jeffrey’s redesign doesn’t look so extreme any more.

Resilient Web Design will be on the web for a very, very, very long time. I’m curious to see if its type size will still look shockingly large in years to come.

Introducing Resilient Web Design

I wrote a thing. The thing is a book. But the book is not published on paper. This book is on the web. It’s a web book. Or “wook” if you prefer …please don’t prefer. Here it is:

Resilient Web Design.

It’s yours for free.

Much of the subject matter will be familiar if you’ve seen my conference talks from the past couple of years, particularly Enhance! and Resilience. But the book ended up taking some twists and turns that surprised me. It turned out to be a bit of a history book: the history of design, the history of the web.

Resilient Web Design is a short book. It’s between sixteen and seventeen thousand words long. You could read the whole thing in a couple of hours. Or—because the book has seven chapters—you could take fifteen to twenty minutes out of a day to read one chapter and you’d have the whole thing read in a week.

If you make websites in any capacity, I hope that this book will resonate with you. Even if you don’t make websites, I still hope there’s an interesting story in there for you.

You can read the whole book on the web, but if you’d rather have a single file to carry around, I’ve made some PDFs as well: one in portrait, one in landscape.

I’ve licensed the book quite liberally. It’s released under a Creative Commons attribution share-alike licence. That means you can re-use the material in any way you want (even commercial usage) as long as you provide some attribution and use the same licence. So if you’d like to release the book in some other format like ePub or anything, go for it.

I’m currently making an audio version of Resilient Web Design. I’ll be releasing it one chapter at a time as a podcast. Here’s the RSS feed if you want to subscribe to it. Or you can subscribe directly from iTunes.

I took my sweet time writing this book. I wrote the first chapter in March 2015. I wrote the last chapter in May 2016. Then I sat on it for a while, figuring out what to do with it. Eventually I decided to just put the whole thing up on the web—it seems fitting.

Whereas the writing took over a year of solid procrastination, making the website went surprisingly quickly. After one weekend of marking up and styling, I had most of it ready to go. Then I spent a while tweaking. The source files are on Github.

I’m pretty happy with the end result. I’ll write a bit more about some of the details over the next while—the typography, the offline functionality, print styles, and stuff like that. In the meantime, I hope you’ll peruse this little book at your leisure…

Resilient Web Design.

If you like it, please spread the word.

A decade on Twitter

I wrote my first tweet ten years ago.

That’s the tweetiest of tweets, isn’t it? (and just look at the status ID—only five digits!)

Of course, back then we didn’t call them tweets. We didn’t know what to call them. We didn’t know what to make of this thing at all.

I say “we”, but when I signed up, there weren’t that many people on Twitter that I knew. Because of that, I didn’t treat it as a chat or communication tool. It was more like speaking into the void, like blogging is now. The word “microblogging” was one of the terms floating around, grasped by those of us trying to get to grips with what this odd little service was all about.

Twenty days after I started posting to Twitter, I wrote about how more and more people that I knew were joining:

The usage of Twitter is, um, let’s call it… emergent. Whenever I tell anyone about it, their first question is “what’s it for?”

Fair question. But there isn’t really an answer. You send messages either from the website, your mobile phone, or chat. What you post and why you’d want to do it is entirely up to you.

I was quite the cheerleader for Twitter:

Overall, Twitter is full of trivial little messages that sometimes merge into a coherent conversation before disintegrating again. I like it. Instant messaging is too intrusive. Email takes too much effort. Twittering feels just right for the little things: where I am, what I’m doing, what I’m thinking.

“Twittering.” Don’t laugh. “Tweeting” sounded really silly at first too.

Now at this point, I could start reminiscing about how much better things were back then. I won’t, but it’s interesting to note just how different it was.

  • The user base was small enough that there was a public timeline of all activity.
  • The characters in your username counted towards your 140 characters. That’s why Tantek changed his handle to be simply “t”. I tried it for a day. I think I changed my handle to “jk”. But it was too confusing so I changed it back.
  • We weren’t always sure how to write our updates either—your username would appear at the start of the message, so lots of us wrote our updates in the third person present (Brian still does). I’m partial to using the present continuous. That was how I wrote my reaction to Chris’s weird idea for tagging updates.

I think about that whenever I see a hashtag on a billboard or a poster or a TV screen …which is pretty much every day.

At some point, Twitter updated their onboarding process to include suggestions of people to follow, subdivided into different categories. I ended up in the list of designers to follow. Anil Dash wrote about the results of being listed and it reflects my experience too. I got a lot of followers—it’s up to around 160,000 now—but I’m pretty sure most of them are bots.

There have been a lot of changes to Twitter over the years. In the early days, those changes were driven by how people used the service. That’s where the @-reply convention (and hashtags) came from.

Then something changed. The most obvious sign of change was the way that Twitter started treating third-party developers. Where they previously used to encourage and even promote third-party apps, the company began to crack down on anything that didn’t originate from Twitter itself. That change reflected the results of an internal struggle between the people at Twitter who wanted it to become an open protocol (like email), and those who wanted it to become a media company (like Yahoo). The media camp won.

Of course Twitter couldn’t possibly stay the same given its incredible growth (and I really mean incredible—when it started to appear in the mainstream, in films and on TV, it felt so weird: this funny little service that nerds were using was getting popular with everyone). Change isn’t necessarily bad, it’s just different. Your favourite band changed when they got bigger. South by Southwest changed when it got bigger—it’s not worse now, it’s just very different.

Frank described the changing nature of Twitter perfectly in his post From the Porch to the Street:

Christopher Alexander made a great diagram, a spectrum of privacy: street to sidewalk to porch to living room to bedroom. I think for many of us Twitter started as the porch—our space, our friends, with the occasional neighborhood passer-by. As the service grew and we gained followers, we slid across the spectrum of privacy into the street.

I stopped posting directly to Twitter in May 2014. Instead I now write posts on my site and then send a copy to Twitter. And thanks to the brilliant Brid.gy, I get replies, favourites and retweets sent back to my own site—all thanks to Webmention, which just became a W3C proposed recommendation.

It’s hard to put into words how good this feels. There’s a psychological comfort blanket that comes with owning your own data. I see my friends getting frustrated and angry as they put up with an increasingly alienating experience on Twitter, and I wish I could explain how much better it feels to treat Twitter as nothing more than a syndication service.

When Twitter rolls out changes these days, they certainly don’t feel like they’re driven by user behaviour. Quite the opposite. I’m currently in the bucket of users being treated to new @-reply behaviour. Tressie McMillan Cottom has written about just how terrible the new changes are. You don’t get to see any usernames when you’re writing a reply, so you don’t know exactly how many people are going to be included. And if you mention a URL, the username associated with that website may get added to the tweet. The end result is that you write something, you publish it, and then you think “that’s not what I wrote.” It feels wrong. It robs you of agency. Twitter have made lots of changes over the years, but this feels like the first time that they’re going to actively edit what you write, without your permission.

Maybe this is the final straw. Maybe this is the change that will result in long-time Twitter users abandoning the service. Maybe.

Me? Well, Twitter could disappear tomorrow and I wouldn’t mind that much. I’d miss seeing updates from friends who don’t have their own websites, but I’d carry on posting my short notes here on adactio.com. When I started posting to Twitter ten years ago, I was speaking (or microblogging) into the void. I’m still doing that ten years on, but under my terms. It feels good.

I’m not sure if my Twitter account will still exist ten years from now. But I’m pretty certain that my website will still be around.

And now, if you don’t mind…

I’m off to grab some lunch.

Fifteen

My site has been behaving strangely recently. It was nothing that I could put my finger on—it just seemed to be acting oddly. When I checked to see if everything was okay, I was told that everything was fine, but still, I sensed something that was amiss.

I’ve just realised what it was. Last week on the 30th of September, I didn’t do or say anything special. That was the problem. I had forgotten my blog’s anniversary.

I’m so sorry, adactio.com! Honestly, I had been thinking about it for all of September but then on the day, one thing led to another, I was busy, and it just completely slipped my mind.

So this is a bit late, but anyway …happy fifteenth anniversary to this journal!

We’ve been through a lot together in those fifteen years, haven’t we, /journal? Oh, the places we’ve been and the things we’ve seen!

I remember where we were on our tenth anniversary: Bologna. Remember we were there for the first edition of the From The Front conference? Now, five years on, we’ve just been to the final edition of that same event—a bittersweet occasion.

Like I said five years ago:

It has been a very rewarding, often cathartic experience so far. I know that blogging has become somewhat passé in this age of Twitter and Facebook but I plan to keep on keeping on right here in my own little corner of the web.

I should plan something special for September 30th, 2021 …just to make sure I don’t forget.

Indie Web Camp Brighton 2016

Indie Web Camp Brighton 2016 is done and dusted. It’s hard to believe that it’s already in its fifth(!) year. As with previous years, it was a lot of fun.

Indie Web Camp Brighton 2016

The first day—the discussions day—covered a lot of topics. I led a session on service workers, where we brainstormed offline and caching strategies for personal websites.

There was a design session looking at alternatives to simply presenting everything in a stream. Some great ideas came out of that. And there was a session all about bookmarking and linking. That one really got my brain whirring with ideas for the second day—the making/coding day.

I’ve learned from previous Indie Web Camps that a good strategy for the second day is to have two tasks to tackle: one that’s really easy (so you’ve at least got that to demo at the end), and one that’s more ambitious. This time, I put together a list of potential goals, and then ordered them by difficulty. By the end of the day, I managed to get a few of them done.

First off, I added a small bit of code to my bookmarking flow, so that any time I link to something, I send a ping to the Internet Archive to grab a copy of that URL. So here’s a link I bookmarked to one of Remy’s blog posts, and here it is in the Wayback Machine—see how the date of storage matches the date of my link.

The code to do that was pretty straightforward. I needed to hit this endpoint:

http://web.archive.org/save/{url}
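
The ping itself can be a one-liner, something like this sketch (the bookmarked URL is a placeholder):

    // Ping the Internet Archive's "save" endpoint so the Wayback
    // Machine grabs a copy of whatever I'm bookmarking (a sketch).
    function archive(url) {
      return fetch('http://web.archive.org/save/' + url);
    }

    archive('https://example.com/some-great-post'); // placeholder URL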

I also updated my bookmarklet for posting links so that, if I’ve highlighted any text on the page I’m linking to, that text is automatically pasted in to the description.

I tweaked my webmentions a bit so that if I receive a webmention that has a type of bookmark-of, that is displayed differently to a comment, or a like, or a share. Here’s an example of Aaron bookmarking one of my articles.

The more ambitious plan was to create an over-arching /tags area for my site. I already have tag-based navigation for my journal and my links, but until this weekend, I didn’t have the combined view.

I didn’t get around to adding pagination. That’s something I should definitely add, because some of those pages get veeeeery long. But I did spend some time adding sparklines. They can be quite revealing, especially on topics that were hot ten years ago, but have faded over time, or topics that have become more and more popular with each year.

All in all, a very productive weekend.

Talking about hypertext

#CSSday starts off with a great history lesson of our industry by @adactio

I’ve just published a transcript of the talk I gave at the HTML Special that preceded CSS Day a couple of weeks back. I’ve also recorded an audio version for your huffduffing pleasure.

It’s not like the usual talks I give. The subject matter was assigned to me, Mission Impossible style. PPK wanted each speaker to give an entire talk on just one HTML element. He offered me the best element of them all: the A element.

There were a few different directions I could’ve taken it. I could’ve tried to make it practical, but I quickly dismissed that idea. Instead I went in the completely opposite direction, making it as pretentious as possible. I figured a talk about hypertext could afford to be winding and circuitous, building on some of the ideas I wrote about in my piece for The Manual a few years back. It’s quite self-indulgent of me, but I used it as an opportunity to geek out about some of my favourite things; from Borges, Babbage, and Bletchley to Leibniz, Lovelace, and Licklider.

I wouldn’t usually write out an entire talk word-for-word in advance, but somehow it felt right for this one. In fact, my talk preparation this time ‘round was very similar to the process Charlotte recently wrote about:

  1. Get everything out of my head and onto a mind map.
  2. Write chunks of content in short bursts—this was when I was buddying up with Paul.
  3. Put together a slide deck of visuals to support the narrative.
  4. Practice delivering the talk so I don’t look like I’m just reading off a screen.

It takes me a long time to prepare talks. As the deadline for this one approached, I was getting quite panicked. It was touch and go there for a while, but I managed to get it done in time.

I’m pleased with how it turned out. On the day, I had fun delivering it. People seemed to like it too, which was gratifying.

Although with this kind of talk, it was inevitable that I wouldn’t be able to please everyone.

I guess this talk was a one-off affair. That said, if you’re putting on an event and you think this subject matter would be appropriate, let me know. I’d be more than happy to deliver it again.

Taking an online book offline

Application Cache is—as Jake so infamously described—not a good API. It was specced and shipped before developers had a chance to figure out what they really needed, and so AppCache turned out to be frustrating at best and downright dangerous in some situations. Its over-zealous caching combined with its byzantine cache invalidation ensured it was never going to become a mainstream technology.

There are very few use-cases for AppCache, but I think I hit upon one of them. Six years ago, A Book Apart published HTML5 For Web Designers. A year and a half later, I put the book online. The contents are never going to change. There’s a second edition of the book out now but if you want to read all the extra bits that Rachel added, you’re going to have to buy the book. The website for the original book is static and unchanging. That’s what made it such a good candidate for using AppCache. I could just set it and forget it.

Except that’s no longer true. AppCache is being deprecated and browsers are starting to withdraw support. Chrome is already making sure that AppCache—like geolocation—no longer works on sites that aren’t served over HTTPS. That’s for the best. In retrospect, those APIs should never have been allowed over unsecured HTTP.

I mentioned that I spent the weekend switching all my book websites over to HTTPS, so AppCache should continue to work …for now. It’s only a matter of time before AppCache is removed completely from many of the browsers that currently support it.

Seeing as I’ve got the HTML5 For Web Designers site running on HTTPS now, I might as well go all out and make it a progressive web app. By far the biggest barrier to making a progressive web app is that first step of setting up HTTPS. It’s gotten cheaper—thanks to Let’s Encrypt Certbot—but it still involves mucking around in the command line with root access; I never wanted to become a sysadmin. But once that’s finally all set up, the other technological building blocks—a Service Worker and a manifest file—are relatively easy.

In this case, the Service Worker is using a straightforward bit of logic:

  • On installation, cache absolutely everything: HTML, CSS, images.
  • When anything is requested, grab it from the cache.
  • If it isn’t in the cache, try the network.
  • If the network doesn’t work, show an offline page (or image).
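
Translated into code, that logic only takes a few lines, something like this sketch (the cache name and file list are placeholders):

    // A cache-first service worker, mirroring the logic above.
    // The cache name and list of files are placeholders.
    const cacheName = 'html5forwebdesigners-v1';
    const filesToCache = ['/', '/css/global.css', '/offline.html'];

    self.addEventListener('install', function (event) {
      // On installation, cache absolutely everything.
      event.waitUntil(
        caches.open(cacheName).then(function (cache) {
          return cache.addAll(filesToCache);
        })
      );
    });

    self.addEventListener('fetch', function (event) {
      // Grab from the cache; fall back to the network;
      // fall back to the offline page if the network fails.
      event.respondWith(
        caches.match(event.request)
          .then(function (response) {
            return response || fetch(event.request);
          })
          .catch(function () {
            return caches.match('/offline.html');
          })
      );
    });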

Basically I’m reproducing AppCache’s overzealous approach. It works for this site because the content is never going to change. I hope that this time, I really can just set it and forget it. I want the site to be an historical artefact, available at the same URL for at least my lifetime. I don’t want to have to maintain it or revisit it every few years to swap out one API for another.

Which brings me back to the way AppCache is being deprecated…

The Firefox team are very eager to ditch AppCache as soon as possible. On the one hand, that’s commendable. They’re rightly proud of shipping Service Workers and they want to encourage people to use the better technology instead. But it sure stings for the suckers (like me) who actually went and built stuff using AppCache.

In a weird way, I think this rush to deprecate AppCache might actually hurt the adoption of Service Workers. Let me explain…

At last year’s Edge Conference, Nolan Lawson gave a great presentation on storing data in the browser. He enumerated the many ways—past and present—that we could store data locally: WebSQL, Local Storage, IndexedDB …the list goes on. He also posed the question: why aren’t more people using insert-name-of-latest-API-here? To me it seemed obvious why more people weren’t diving into using the latest and greatest option for local data storage. It was because they had been burned before. The developers who rushed into trying previous solutions end up being mocked for their choice. “Still using that ol’ thing? Pffftt!”

You can see that same attitude on display from Mozilla as they push towards removing AppCache. Like in a comment that refers to developers using AppCache in production as “the angry hordes”. Reminds me of something Tom said.

In that same Mozilla thread, Soledad echoes Tom’s point:

As a member of the devrel team: I think that this should be better addressed in a blog post that someone from the team responsible for switching AppCache off should write, so everyone can understand the reasons and ask questions to those people.

I’d rather warn people beforehand, pointing them to that post and help them with migration paths than apply emergency mitigation strategies when a lot of people find their stuff stopped working in the newer Firefox…

Bravo! That same approach should have also been taken by the Chrome team when it came to their thread about punishing display:browser in manifest files. There was absolutely no communication with developers about this major decision. I only found out about it because Paul happened to mention it to me.

I was genuinely shocked by this:

Withholding the “add to home screen” prompt like that has a whiff of blackmail about it.

I can confirm that smell. When I was making the manifest file for HTML5 For Web Designers, I really wanted to put display: browser because I want people to be able to copy and paste URLs (for the book, for individual chapters, and for sections within chapters). But knowing that if I did that, Android users would never see the “add to home screen” prompt made me question that decision. I felt strong-armed into declaring display: standalone. And no, I’m not mollified by hand-waving reassurances that the Chrome team will figure out some solution for this. Figure out the solution first, then punish the saps like me who want to use display: browser to allow people to share URLs.
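
For reference, that display member sits in the web app manifest alongside the name and icons, something like this (the values are placeholders):

    {
      "name": "HTML5 For Web Designers",
      "start_url": "/",
      "display": "browser",
      "background_color": "#ffffff",
      "icons": [
        {
          "src": "/images/icon.png",
          "sizes": "192x192",
          "type": "image/png"
        }
      ]
    }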

Anyway, the website for HTML5 For Web Designers is now using AppCache and Service Workers. The AppCache part will probably be needed for quite a while yet to provide offline support on iOS. Apple are really dragging their heels on Service Worker support, with at least one WebKit engineer actively looking for reasons not to implement it.

There’s a lot of talk about making apps work offline, but I think it’s just as important that we consider making information work offline. Books are a great example of this. To use the tired transport tropes, the website for a book is something you might genuinely want to access when you’re on a plane, or in the underground, or out at sea.

I really, really like progressive web apps. But I also think it’s important that we don’t fall into the trap of just trying to imitate native apps on the web. I love the idea of taking the best of the web—like information being permanently available at a URL—and marrying that up with the best of native—like offline access. I also like the idea of taking the best of books—a tome of thought—and marrying it up with the best of the web—hypertext.

I’d love to see more experimentation around online/offline hypertext/books. For now, you can visit HTML5 For Web Designers, add it to your home screen, and revisit it whenever and wherever you like.

Owning my words

When I wrote a few words about progressive enhancement recently, I linked to Karolina’s great article The Web Isn’t Uniform. I was a little reluctant to link to it, not because of the content—which is great—but because of its location on Ev’s blog. I much prefer to link directly to people’s own websites (I have a hunch that those resources tend to last longer too) but I understand that Medium offers a nice low barrier to publishing.

That low barrier comes at a price. It means you have to put up with anyone and everyone weighing in with their own hot takes. The way the site works is that anyone who writes a comment on your article is effectively writing their own article—you don’t get to have any editorial control over what kind of stuff appears together with your words. There is very little in the way of community management once a piece is published.

Karolina’s piece attracted some particularly unsavoury snark—tech bros disagreeing in their brash bullying way. I linked to a few comments, leaving out the worst of the snark, but I couldn’t resist editorialising:

Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.

I knew even when I was writing it that it was unproductive, itself a snarky remark. Two wrongs don’t make a right. But I wanted to acknowledge that not only was bad behaviour happening, but that I was seeing it, and I wasn’t ignoring it. I guess it was mostly intended for Karolina—I wanted to extend some kind of acknowledgment that the cumulative weight of those sneering drive-by reckons is a burden that no one should have to put up with.

I knew that when I wrote about Medium being “where the opinions of self-entitled dudes flow like rain from the tech heavens” that I would (rightly) get pushback, and sure enough, I did …on Medium. Not on Twitter or anywhere else, just Medium.

I syndicate my posts to Ev’s blog, so the free-for-all approach to commenting doesn’t bother me that much. The canonical URL for my words remains on my site under my control. But for people posting directly to Medium and then having to put up with other people casually shitting all over their words, it must feel quite disempowering.

I have a similar feeling with Twitter. I syndicate my notes there and if the service disappeared tomorrow, I wouldn’t shed any tears. There’s something very comforting in knowing that any snarky nasty responses to my words are only being thrown at copies. I know a lot of my friends are disheartened about the way that Twitter has changed in recent years. I wish I could articulate how much better it feels to only use Twitter (or Medium or Facebook) as a syndication tool, like RSS.

There is an equal and opposite reaction too. I think it’s easier to fling off some thoughtless remarks when you’re doing it on someone else’s site. I bet you that the discourse on Ev’s blog would be of a much higher quality if you could only respond from your own site. I find I’m more careful with my words when I publish here on adactio.com. I’m taking ownership of what I say.

And when I do lapse and write snarky words like “Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.”, at least I’m owning my own snark. Still, I will endeavour to keep my snark levels down …but that doesn’t mean I’m going to turn a blind eye to bad behaviour.

Content Buddy

I have a new role at Clearleft. It’s not a full-time role. It’s in addition to my existing role of …um …whatever it is I do at Clearleft.

Anyway, my new part-time role is that of being a content buddy. Sounds a little dismissive when I put it like that. Let me put it in capitals…

My new part-time role is that of being a Content Buddy.

This is Ellen’s idea. She’s been recruiting Content Guardians and Content Buddies. The Guardians will be responsible for coaxing content out of people, encouraging them to write that blog post, article, or case study. The role of the Content Buddy is to help shepherd those pieces into the world.

I have let it be known throughout the office that I am available—day or night, rain or shine—for proof-reading, editing, and general brain-storming and rubber-ducking.

On my first official day as a Content Buddy on Friday I helped Ben polish off a really good blog post (watch this space), listened to a first run-through of Charlotte’s upcoming talk at the Up Front conference in Manchester (which is shaping up to be most excellent), and got together with Paul for a mutual brainstorming session for future conference talks. The fact that Paul is no longer a full-time employee at Clearleft is a mere technicality—Content Buddies for life!

Paul is preparing a talk on design systems for Smashing Conference in Freiburg in September. I’m preparing a talk on the A element for the HTML Special part of CSS Day in Amsterdam in just one month’s time (gulp!). We had both already done a bit of mind-mapping to get a jumble of ideas down on paper. We learned that from Ellen’s excellent workshop.

Talk prep, phase 1: doodling.

Then we started throwing ideas back and forth, offering suggestions, and spotting patterns. Once we had lots of discrete chunks of stuff outlined (but no idea how to piece them together), we did some short intense spurts of writing using the fiendish TheMostDangerousWritingApp.com. I looked at Paul’s mind map, chose a topic from it for him, and he had to write on that non-stop for three to five minutes. Meanwhile he picked a topic from my mind map and I had to do the same. It was exhausting but also exhilarating. Very quickly we had chunks of content that we could experiment with, putting them together in different ways to find different narrative threads. I might experiment with publishing them as short standalone blog posts.

The point was not to have polished, finished content but rather to get to the “shitty first draft” stage quickly. We were following Hemingway’s advice:

Write drunk, edit sober.

…but not literally. Mind you, I could certainly imagine combining beer o’clock on Fridays with Content Buddiness. That wasn’t an option on this particular Friday though, as I had to run off to band practice with Salter Cane. A very different, and altogether darker form of content creation.

Independently published

Jessica writes about The Heroine’s Journey.

Remy explains Why I love working with the web.

Ludwig dreams of designers and developers working Together.

Charlotte documents her technique Teaching the order of margins in CSS.

Craig field-tests The Leica Q.

Robin thinks about The New Web Typography.

Michael dives deep into A Complete History of the Millennium Falcon.

What do they all have in common? Nothing …other than the fact that each person chose to write on their own website. I’m grateful for that. These are all wonderful pieces of writing—they deserve a long life.

New edition

Six years ago I wrote a book and the brand new plucky upstart A Book Apart published it.

Six years! That’s like a geological age in internet years.

People liked the book. That’s very gratifying. I’m quite proud of it, and it always gives me a warm glow when someone tells me they enjoyed reading it.

Jeffrey asked me a while back about updating the book for a second edition—after all, six years is a crazy long time for a web book to be around. I said no, because I just wouldn’t have the time, but mostly because—as the old proverb goes—you can’t step in the same river twice. Proud as I am of HTML5 For Web Designers, I consider it part of my past.

“What about having someone else update it?” Well, that made me nervous. I feel quite protective of my six year old.

“What about Rachel Andrew?” Ah, well, that’s a different story! Absolutely—if there’s one person I trust to bring it up to date, it’s Rachel.

She’s done a fine, fine job. The second edition of HTML5 For Web Designers is now available.

I know what you’re going to ask: how much difference is there between the two editions? Well, in the introduction to the new edition, I’m very pleased to say that Rachel has written:

I’ve been struck by how much has remained unchanged in that time.

There’s a new section on responsive images. That’s probably the biggest change. The section on video has been expanded to include captioning. There are some updates and tweaks to the semantics of some of the structural elements. So it’s not a completely different book; it’s very much an update rather than a rewrite.

If you don’t have a copy of HTML5 For Web Designers and you’ve been thinking that maybe it’s too out-of-date to bother with, rest assured that it is now bang up to date thanks to Rachel.

Jeffrey has written a lovely new foreword for the second edition:

HTML5 for Web Designers is a book about HTML like Elements of Style is a book about commas. It’s a book founded on solid design principles, and forged at the cutting edge of twenty-first century multi-device design and development.

Quakepunk

There’s an article in The New Yorker by Kathryn Schulz called The Really Big One. It’s been creating quite a buzz, and rightly so. It’s a detailed and evocative piece about the Cascadia fault:

When the next very big earthquake hits, the northwest edge of the continent, from California to Canada and the continental shelf to the Cascades, will drop by as much as six feet and rebound thirty to a hundred feet to the west—losing, within minutes, all the elevation and compression it has gained over centuries.

But there’s another hotspot on the other side of the country: the New Madrid fault line. There isn’t (yet) an article about it in The New Yorker. There’s something better. Two articles by Maciej:

  1. Confronting New Madrid and
  2. Confronting New Madrid (Part 2).

The New Madrid Seismic Zone earned its reputation on the strength of three massive earthquakes that struck in the winter of 1811-1812. The region was very sparsely settled at the time, and became more sparsely settled immediately afterwards, as anyone with legs made it their life’s mission to get out of southern Missouri.

The articles are fascinating and entertaining in equal measure. No surprise there. I’ve said it before and I’ll say it again, Maciej Cegłowski is the best writer on the web. Every so often I find myself revisiting Argentina On Two Steaks A Day or A Rocket To Nowhere just for the sheer pleasure of it.

I want to read more from Maciej, and there’s a way to make it happen. If we back him on Kickstarter, he’ll take a trip to the Antarctic and turn it into words:

Soliciting donations to take a 36-day voyage to the Ross Ice Shelf, Bay of Whales and subantarctic islands, and write it up real good.

Let’s make it happen. Let’s throw money at him like he’s a performing monkey. Dance, writer-boy, dance!

Indie Web Camp Brighton 2015

Indie Web Camp Brighton 2015 is a wrap, and what a fun weekend it turned out to be.

I was really pleased with the turnout; not just the number of people who came along—many of them from very far afield—but also the range of skill levels and backgrounds represented. What a lovely bunch!

Indie Web Camp Brighton group photo

We kicked off the first day with a show’n’tell: people demoed their sites, showed their posting interfaces, and talked about what they’d like to improve. That sparked plenty of ideas for the afternoon discussions. But in between we had a nice long lunch break—it was a lovely sunny day in Brighton so we took full advantage of the sun, the street food, and the ice cream.

We wrapped up the first day around 5pm and I immediately dashed off to start loading in and sound checking for a Salter Cane gig that evening. That turned out to be a lot of fun—the audience were great—but I was completely knackered by the end of the day.

The weather on Sunday was far gloomier, but that was okay—we spent the whole day indoors anyway, coding and hacking away at stuff. Quite a few people were adding h-entry and h-card to their sites so I helped them out whenever I could. Meanwhile I was working on trying to get an SMS interface to my site working using the Twilio API.
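
The shape of it is straightforward: Twilio POSTs each incoming text message to a URL on my site as form data, and the endpoint turns the message into a note. Here’s a sketch of that endpoint in Node (Twilio’s Body and From parameters are real; everything else is a placeholder):

    // A sketch of a Twilio SMS webhook using Node and Express. Twilio
    // POSTs each incoming text as form data; "Body" and "From" are
    // Twilio's parameter names. Everything else here is a placeholder.
    const express = require('express');
    const app = express();
    app.use(express.urlencoded({ extended: false }));

    const MY_NUMBER = '+44XXXXXXXXXX'; // placeholder

    function createNote(text) {
      // placeholder: hand the text over to the CMS as a new note
    }

    app.post('/sms-endpoint', function (request, response) {
      if (request.body.From === MY_NUMBER) { // only accept texts from my phone
        createNote(request.body.Body);
      }
      // An empty TwiML response tells Twilio no reply is needed.
      response.type('text/xml').send('<Response></Response>');
    });

    app.listen(3000);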

The actual coding part went pretty quickly, but then I hit a wall. Whenever Twilio tried to reach a URL on my site, it would time out with a 504 error. I couldn’t figure out what was going on. On a hunch, I tried sending it to a subdomain that wasn’t being served over HTTPS. That worked fine. Now, I can’t imagine that Twilio is actually unable to work with secure endpoints, so it must be something to do with the way that I’ve enabled HTTPS on my domain. Anyway, the HTTP subdomain solution worked, and eleven minutes before demo time I finally had something to show.

We finished the day and the event with the quickfire demos. As always, there was some really impressive stuff—it’s quite amazing how much can get done in such a short space of time. Then we tidied up and headed across the street to the pub for a well-deserved pint.

All in all, a great weekend.