Tags: syndication



Owning my words

When I wrote a few words about progressive enhancement recently, I linked to Karolina’s great article The Web Isn’t Uniform. I was a little reluctant to link to it, not because of the content—which is great—but because of its location on Ev’s blog. I much prefer to link directly to people’s own websites (I have a hunch that those resources tend to last longer too) but I understand that Medium offers a nice low barrier to publishing.

That low barrier comes at a price. It means you have to put up with anyone and everyone weighing in with their own hot takes. The way the site works is that anyone who writes a comment on your article is effectively writing their own article—you don’t get to have any editorial control over what kind of stuff appears together with your words. There is very little in the way of community management once a piece is published.

Karolina’s piece attracted some particularly unsavoury snark—tech bros disagreeing in their brash bullying way. I linked to a few comments, leaving out the worst of the snark, but I couldn’t resist editorialising:

Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.

I knew even when I was writing it that it was unproductive, itself a snarky remark. Two wrongs don’t make a right. But I wanted to acknowledge that not only was bad behaviour happening, but that I was seeing it, and I wasn’t ignoring it. I guess it was mostly intended for Karolina—I wanted to extend some kind of acknowledgment that the cumulative weight of those sneering drive-by reckons is a burden that no one should have to put up with.

I knew that when I wrote about Medium being “where the opinions of self-entitled dudes flow like rain from the tech heavens” that I would (rightly) get pushback, and sure enough, I did …on Medium. Not on Twitter or anywhere else, just Medium.

I syndicate my posts to Ev’s blog, so the free-for-all approach to commenting doesn’t bother me that much. The canonical URL for my words remains on my site under my control. But for people posting directly to Medium and then having to put up with other people casually shitting all over their words, it must feel quite disempowering.

I have a similar feeling with Twitter. I syndicate my notes there and if the service disappeared tomorrow, I wouldn’t shed any tears. There’s something very comforting in knowing that any snarky nasty responses to my words are only being thrown at copies. I know a lot of my friends are disheartened about the way that Twitter has changed in recent years. I wish I could articulate how much better it feels to only use Twitter (or Medium or Facebook) as a syndication tool, like RSS.

There is an equal and opposite reaction too. I think it’s easier to fling off some thoughtless remarks when you’re doing it on someone else’s site. I bet you that the discourse on Ev’s blog would be of a much higher quality if you could only respond from your own site. I find I’m more careful with my words when I publish here on adactio.com. I’m taking ownership of what I say.

And when I do lapse and write snarky words like “Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.”, at least I’m owning my own snark. Still, I will endeavour to keep my snark levels down …but that doesn’t mean I’m going to turn a blind eye to bad behaviour.

Syndicating to Medium

When I brainpuked my thoughts on Google’s AMP project, I finished up by saying it was one more option for the Indie Web approach to syndication:

When I publish something on adactio.com in HTML, it already gets syndicated to different places. This is the Indie Web idea of POSSE: Publish (on your) Own Site, Syndicate Elsewhere. As well as providing RSS feeds, I’ve also got Twitter bots that syndicate to Twitter. An If This, Then That script pushes posts to Facebook. And if I publish a photo, it goes to Flickr. Now that Medium is finally providing a publishing API, I’ll probably start syndicating articles there as well. The more, the merrier.

Until Medium provided an API, I didn’t see much point in Medium. Let me clarify: I didn’t see much point in it for me. I’ve already got a website where I can publish whatever I like. For someone who doesn’t have their own website, I guess Medium—like Facebook, Twitter, Tumblr, etc.—provides a place to publish. I think this is what people mean when they use the word “platform” in a digital—rather than a North Sea oil drilling—sense.

Publishing exclusively on somebody else’s site works pretty well right up until the day the platform turns out to be a trap door and disappears from under you.

But I’m really puzzled by people who already have their own website choosing to publish on Medium instead. A shiny content farm is still a content farm.

“It’s the reach!” I’m told. That makes me sad. The whole point of the World Wide Web is that everybody has an equal opportunity to share their thoughts. You don’t need to ask anyone for permission. The gatekeepers of the previous century—record labels, book publishers, film producers—can’t stop you from making whatever you want and putting it out there for the world to see. And thanks to the principle of net neutrality baked into the design of TCP/IP, no one gets preferential treatment.

Notice that I said “people who already have their own website choosing to publish on Medium instead.” That last bit is important. Using Medium to publish copies of what you’ve already published on your own site gives you the best of both worlds: ownership and reach. That’s what Kevin does, for example. And Jeffrey. Until recently that was quite a pain in the ass, requiring a manual copy’n’paste process.

Back when Medium first launched, Dave Winer said:

Let me enter the URL of something I write in my own space, and have it appear here as a first class citizen. Indistinguishable to readers from something written here.

It still isn’t quite that simple, but now that Medium has a publishing API, it’s relatively straightforward to syndicate copies of your posts to Medium at the moment you publish on your own site.

Here’s what I did…

First of all, I signed up for a Medium account. For the longest time, even this simple step was off-limits for me because Medium used to require authentication using Twitter. By itself, that’s not a problem. The problem was that Medium demanded write permissions for my Twitter account. Just say no.

Now it’s possible to sign up for Medium using email so that rudeness is less of an issue (although I’d really like to see Medium stop being so demanding when it comes to Twitter permissions, especially as the interface copy bends over backwards to promise that Medium would never post to Twitter on my behalf …so why ask for permission to do just that?).

Once I had a Medium account, I needed two pieces of secret information in order to use the API.

The first piece is an access token.

I went to my settings on Medium and scrolled all the way to the bottom to the heading “Integration tokens”. I entered a description (“Syndication from adactio.com”) and pressed the “Get integration token” button.

Now I could use that token to get the second piece of information: my user ID.

I opened up a browser tab and went to this URL: https://api.medium.com/v1/me?accessToken= …adding my new secret integration token to the end.

That returns a JSON response. One of the fields in the JSON object has the name “id”. The value of that field is my user ID on Medium.
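
For what it's worth, that response is shaped something like this (the values here are placeholders, not real data):

{
  "data": {
    "id": "YOUR_USER_ID",
    "username": "yourusername",
    "name": "Your Name",
    "url": "https://medium.com/@yourusername"
  }
}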

With those two pieces of information, I could make an authenticated POST request using cURL. Here’s the PHP code I’m using. It’s probably terrible but please feel free to use it, copy it, fork it, or do anything else you want with it.
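
The gist of it is a single authenticated request. Here's a rough sketch along those lines (not the linked code itself), with the token, user ID, and post details as placeholders:

<?php
// A rough sketch: syndicate one post to Medium via the publishing API.
// The token, user ID, and post details below are placeholders.
$token  = 'YOUR_INTEGRATION_TOKEN';
$userid = 'YOUR_MEDIUM_USER_ID';

$payload = json_encode(array(
    'title'         => 'Syndicating to Medium',
    'contentFormat' => 'html',
    'content'       => '<h1>Syndicating to Medium</h1><p>The body of the post.</p>',
    'canonicalUrl'  => 'https://adactio.com/journal/example',
    'publishStatus' => 'public'
));

$ch = curl_init('https://api.medium.com/v1/users/'.$userid.'/posts');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Authorization: Bearer '.$token,
    'Content-Type: application/json',
    'Accept: application/json'
));
$response = curl_exec($ch);
curl_close($ch);

// On success, the response includes the URL of the copy on Medium.
$data = json_decode($response, true);
$medium_url = isset($data['data']['url']) ? $data['data']['url'] : null;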

When I run that code, I get a JSON response back from Medium’s API. Assuming I get a successful response, I can store the URL of the Medium copy and link out to it from here. That copy on Medium has a corresponding link rel="canonical" in the head of the document pointing back here to adactio.com.

That’s pretty much it. I added a checkbox to my posting interface so that sending a copy of a post to Medium is just a toggle away. I’ll tick that checkbox when I post this. You could be reading this on my site or you could be reading the copy on Medium.

The code I wrote is pretty similar to how I post notes to Twitter and photos to Flickr. In fact, posting to Medium is more straightforward: Flickr requires three bits of secret information; Twitter requires four.

What would make this cross-posting with Medium really interesting would be if it could work in both directions. Then I’d be able to use the (very nice) writing interface on Medium to publish on adactio.com.

That’s not so far-fetched. I’ve already got a micropub endpoint here on my site (here’s the code). That’s how I’m able to use Instagram to post photos to my own site (using OwnYourGram). I let Instagram keep a copy of my photo. I’d be happy to let Medium keep a copy of my post.
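
The heart of a micropub endpoint is pretty small. Here's a stripped-down sketch (not the code linked above): token verification and storage are simplified, and save_post is a made-up function standing in for the real storage code.

<?php
// A stripped-down micropub endpoint sketch.

// 1. Verify the bearer token with an IndieAuth token endpoint.
$auth = isset($_SERVER['HTTP_AUTHORIZATION']) ? $_SERVER['HTTP_AUTHORIZATION'] : '';
$ch = curl_init('https://tokens.indieauth.com/token');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Authorization: '.$auth,
    'Accept: application/json'
));
$tokeninfo = json_decode(curl_exec($ch), true);
curl_close($ch);

if (empty($tokeninfo['me']) || rtrim($tokeninfo['me'], '/') != 'https://adactio.com') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// 2. Accept a simple form-encoded "h=entry" post.
if (isset($_POST['h']) && $_POST['h'] == 'entry') {
    $content = isset($_POST['content']) ? $_POST['content'] : '';
    $url = save_post($content); // made-up function: store the post, return its permalink
    header('HTTP/1.1 201 Created');
    header('Location: '.$url);
    exit;
}

header('HTTP/1.1 400 Bad Request');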

We could make history:

We need to break out of the model where all these systems are monolithic and standalone. There’s art in each individual system, but there’s a much greater art in the union of all the systems we create.

AMPed up

Apple has Apple News. Facebook has Instant Articles. Now Google has AMP: Accelerated Mobile Pages.

The big players sure are going to a lot of effort to reinvent RSS.

That may sound like a flippant remark, but it’s not too far from the truth. In the case of Apple News, its current incarnation appears to be quite literally an RSS reader, at least until the unveiling of the forthcoming Apple News Format.

Google’s AMP project looks a little bit different to the offerings from Facebook and Apple. Rather than creating a proprietary format from scratch, it mandates a subset of HTML …with some proprietary elements thrown in (or, to use the more diplomatic parlance of the extensible web, custom elements).

The idea is that alongside the regular HTML version of your document, you provide a corresponding AMP HTML version. Because the AMP HTML version will be leaner and meaner, user agents can then grab the AMP HTML version and present that to the end user for a faster browsing experience.

So if an RSS feed is an alternate representation of a homepage or a listing of articles, then an AMP document is an alternate representation of a single article.

Now, my own personal take on providing alternate representations of documents is “Sure. Why not?” Here on adactio.com I provide RSS feeds. On The Session I provide RSS, JSON, and XML. And on Huffduffer I provide RSS, Atom, JSON, and XSPF, adding:

If you would like to see another format supported, share your idea.

Also, each individual item on Huffduffer has a corresponding oEmbed version (and, in theory, an RDF version)—an alternate representation of that item …in principle, not that different from AMP. The big difference with AMP is that it’s using HTML (of sorts) for its format.

All of this sounds pretty reasonable: provide an alternate representation of your canonical HTML pages so that user-agents (Twitter, Google, browsers) can render a faster-loading version …much like an RSS reader.

So should you start providing AMP versions of your pages? My initial reaction is “Sure. Why not?”

The AMP Project website comes with a list of frequently asked questions which, of course, nobody has asked. My own list of invented frequently asked questions might look a little different.

Will this kill advertising?

We live in hope.

Alas, AMP pages will still be able to carry advertising, but in a restricted form. No more scripts that track your movement across the web …unless the script is from an authorised provider, like, say, Google.

But it looks like the worst performance offenders won’t be able to get their grubby little scripts into AMP pages. This is a good thing.

Won’t this kill journalism?

Of all the horrid myths currently in circulation, the two that piss me off the most are:

  1. Journalism requires advertising to survive.
  2. Advertising requires invasive JavaScript.

Put the two together and you get the gist of most of the chicken-littling articles currently in circulation: “Journalism requires invasive JavaScript to survive.”

I could argue against the first claim, but let’s leave that for another day. Let’s suppose for now that, sure, journalism requires advertising to survive. Fine.

It’s that second point that is fundamentally wrong. The idea that the current state of advertising is the only way of advertising is incredibly short-sighted and misguided. Invasive JavaScript is not a requirement for showing me an ad. Setting a cookie is not a requirement for showing me an ad. Knowing where I live, who my friends are, what my income level is, and where I’ve been on the web …none of these are requirements for showing me an ad.

It is entirely possible to advertise to me and treat me with respect at the same time. The Deck already does this.

And you know what? Ad networks had their chance. They had their chance to treat us with respect with the Do Not Track initiative. We asked them to respect our wishes. They told us to get screwed.

Now those same ad providers are crying because we’re installing ad blockers. They can get screwed.


It is entirely possible to advertise within AMP pages …just not using blocking JavaScript.

For a nicely nuanced take on what AMP could mean for journalism, see Joshua Benton’s article on Nieman Lab—Get AMP’d: Here’s what publishers need to know about Google’s new plan to speed up your website.

Why not just make faster web pages?

Excellent question!

For a site like adactio.com, the difference between the regular HTML version of an article and the corresponding AMP version of the same article is pretty small. It’s a shame that I can’t just say “Hey, the current version of the article is the AMP version”, but that would require that I only use a subset of HTML and that I add some required guff to my page (including an unnecessary JavaScript file).

But for most of the news sites out there, the difference between their regular HTML pages and the corresponding AMP versions will be pretty significant. That’s because the regular HTML versions are bloated with third-party scripts, oversized assets, and cruft around the actual content.

Now it is in theory possible for these news sites to get rid of all those things, and I sincerely hope that they will. But that’s a big political struggle. I am rooting for developers—like the good folks at VOX—who have to battle against bosses who honestly think that journalism requires invasive JavaScript. Best of luck.

Along comes Google saying “If you want to play in our sandbox, you’re going to have to abide by our rules.” Those rules include performance best practices (for the most part—I take issue with some of the requirements, and I’ll go into that in more detail in a moment).

Now when the boss says “Slap a three megabyte JavaScript library on it so we can show a carousel”, the developers can only respond with “Google says No.”

When the boss says “Slap a ton of third-party trackers on it so we can monetise those eyeballs”, the developers can only respond with “Google says No.”

Google have used their influence like this before and it has brought them accusations of monopolistic abuse. Some people got very upset when they began labelling (and later ranking) mobile-friendly pages. Personally, I’ve got no issue with that.

In this particular case, Google aren’t mandating what you can and can’t do on your regular HTML pages; only what you can and can’t do on the corresponding AMP page.

Which brings up another question…

Will the AMP web kill the open web?

If we all start creating AMP versions of our pages, and those pages are faster than our regular HTML versions, won’t everyone just see the AMP versions without ever seeing the “full” versions?

Tim articulates a legitimate concern:

This promise of improved distribution for pages using AMP HTML shifts the incentive. AMP isn’t encouraging better performance on the web; AMP is encouraging the use of their specific tool to build a version of a web page. It doesn’t feel like something helping the open web so much as it feels like something bringing a little bit of the walled garden mentality of native development onto the web.

That troubles me. Using a very specific tool to build a tailored version of my page in order to “reach everyone” doesn’t fit any definition of the “open web” that I’ve ever heard.

Fair point. But I also remember that a lot of people were upset by RSS. They didn’t like that users could go for months at a time without visiting the actual website, and yet they were reading every article. They were reading every article in non-browser user agents in a format that wasn’t HTML. On paper that sounds like the antithesis of the open web, but in practice there was always something very webby about RSS, and RSS feed readers—it put the power back in the hands of the end users.

Some people chose not to play ball. They only put snippets in their RSS feeds, not the full articles. Maybe some publishers will do the same with the AMP versions of their articles: “To read more, click here…”

But I remember what generally tended to happen to the publishers who refused to put the full content in their RSS feeds. We unsubscribed.

Still, I share the concern that any one company—whether it’s Facebook, Apple, or Google—should wield so much power over how we publish on the web. I don’t think you have to be a conspiracy theorist to view the AMP project as an attempt to replace the existing web with an alternate web, more tightly controlled by Google (albeit a faster, more performant, tightly-controlled web).

My hope is that the current will flow in both directions. As well as publishers creating AMP versions of their pages in order to appease Google, perhaps they will start to ask “Why can’t our regular pages be this fast?” By showing that there is life beyond big bloated invasive web pages, perhaps the AMP project will work as a demo of what the whole web could be.

I’ve been playing around with the AMP HTML spec. It has some issues. The good news is that it’s open source and the project owners seem receptive to feedback.


No external JavaScript is allowed in an AMP HTML document. This covers third-party libraries, advertising and tracking scripts. This is A-okay with me.

The reasons given for this ban are related to performance and I agree with them completely. Big bloated JavaScript libraries are one of the biggest performance killers on the web. I’m happy to leave them at the door (although weirdly, web fonts—another big performance killer—are allowed in).

But then there’s a bit of an about-face. In order to have a valid AMP HTML page, you must include a piece of third-party JavaScript. In this case, the third party is Google and the JavaScript file is what handles the loading of assets.
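
In practice, that means every AMP HTML document carries a script element pointing at Google's CDN, along these lines:

<script async src="https://cdn.ampproject.org/v0.js"></script>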

This seems a bit strange to me; on the one hand claiming that third-party JavaScript is bad for performance and on the other, requiring some third-party JavaScript. As Justin says:

For me this is loading one thing too many… the AMP JS library. Surely the document itself is going to be faster than loading a library to try and make it load faster.

On the plus side, this third-party JavaScript is loaded asynchronously. It seems to mostly be there to handle the rendering of embedded content: images, videos, audio, etc.

Embedded content

If you want audio, video, or images on your page, you must use propriet… custom elements like amp-audio, amp-video, and amp-img. In the case of images, I can see how this is a way of getting around the browser’s lookahead pre-parser (although responsive images also solve this problem). In the case of audio and video, the standard audio and video elements already come with a way of specifying preloading behaviour using the preload attribute. Very odd.

Justin again:

I’m not sure if this is solving anything at the moment that we’re not already fixing with something like responsive images.

To use amp-img for images within the flow of a document, you’ll need to specify the dimensions of the image. This makes sense from a rendering point of view—knowing the width and height ahead of time avoids repaints and reflows. Alas, in many of the cases here on adactio.com, I don’t know the dimensions of the images I’m including. So any of my AMP HTML pages that include images will be invalid.
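
So where a regular page gets away with a plain img element, the AMP version needs something more like this (the file name and dimensions here are invented for illustration):

<amp-img src="/images/photo.jpg" width="800" height="600" alt="A photograph"></amp-img>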

Overall, the way that AMP HTML handles embedded content looks like a whole lot of wheel reinvention. I like the idea of providing custom elements as an option for authors. I hate the idea of making them a requirement.


If you want to provide metadata about your document, AMP HTML currently requires the use of Google’s Schema.org vocabulary. This has a big whiff of vendor lock-in to it. I’ve flagged this up as an issue and Aaron is pushing a change so hopefully this will be resolved soon.


In its initial release, the AMP HTML spec came with some nasty surprises for accessibility. The biggest is probably the requirement to switch off zooming in your viewport meta element.
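
Something along these lines (the exact attribute values in the spec may differ, but user-scalable=no is the nasty bit):

<meta name="viewport" content="width=device-width,minimum-scale=1,maximum-scale=1,user-scalable=no">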


Yowzers! That’s some slap in the face to decent web developers everywhere. Fortunately this has been flagged up and I’m hoping it will be fixed soon.

If it doesn’t get fixed, it’s quite a non-starter. It beggars belief that Google would require authors to make their pages inaccessible to pinch/zoom. I would hope that many developers would rebel against such a draconian injunction. If that happens, it’ll be interesting to see what becomes of those theoretically badly-formed AMP HTML documents. Technically, they will fail validation, but for very good reason. Will those accessible documents be rejected?

Please get involved on this issue if this is important to you (hint: this should be important to you).

There are a few smaller issues. Initially the :focus pseudo-class was disallowed in author CSS, but that’s being fixed.

Currently AMP HTML documents must have this line:

<style>body {opacity: 0}</style><noscript><style>body {opacity: 1}</style></noscript>


That’s a horrible conflation of JavaScript availability and CSS. It’s being fixed though, and soon all the opacity jiggery-pokery will only happen via JavaScript, which will be a big improvement: it should either all happen in CSS or all happen in JavaScript, but not the current mixture of the two.


The AMP HTML version of your page is not the canonical version. You can specify where the real HTML version of your document is by using rel="canonical". Great!

But how do you link from your canonical page out to the AMP HTML version? Currently you’re supposed to use rel="amphtml". No, they haven’t checked the registry. Again. I’ll go in and add it.
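
So, with made-up URLs, the two documents end up pointing at each other like this:

<!-- in the AMP HTML version -->
<link rel="canonical" href="https://adactio.com/journal/example">

<!-- in the regular HTML version -->
<link rel="amphtml" href="https://adactio.com/journal/example/amp">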

In the meantime, I’m also requesting that the amphtml value can be combined with the alternate value, seeing as rel values can be space separated:

rel="alternate amphtml" type="text/html"

See? Not that different to RSS:

rel="alternate" type="application/rss+xml"


When I publish something on adactio.com in HTML, it already gets syndicated to different places. This is the Indie Web idea of POSSE: Publish (on your) Own Site, Syndicate Elsewhere. As well as providing RSS feeds, I’ve also got Twitter bots that syndicate to Twitter. An If This, Then That script pushes posts to Facebook. And if I publish a photo, it goes to Flickr. Now that Medium is finally providing a publishing API, I’ll probably start syndicating articles there as well. The more, the merrier.

From that perspective, providing AMP HTML pages feels like just one more syndication option. If it were the only option, and I felt compelled to provide AMP versions of my content, I’d be very concerned. But for now, I’ll give it a whirl and see how it goes.

Here’s a bit of PHP I’m using to convert a regular piece of HTML into AMP HTML—it’s horrible code; it uses regular expressions on HTML which, as we all know, will summon the Elder Gods.
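
A rough sketch of that kind of conversion (not the linked code itself) looks something like this:

<?php
// A rough sketch of converting regular HTML into AMP HTML with regular expressions.
// (Yes, regular expressions on HTML. You have been warned.)
function amplify($html) {
    // External and inline JavaScript isn't allowed, so strip script elements.
    $html = preg_replace('/<script\b[^>]*>.*?<\/script>/is', '', $html);
    // Author CSS has to live in a single style element, so strip inline style attributes.
    $html = preg_replace('/\s+style="[^"]*"/i', '', $html);
    // Swap img elements for amp-img, adding dimensions where they can be found.
    $html = preg_replace_callback('/<img\b(.*?)\s*\/?>/i', function ($matches) {
        $attributes = $matches[1];
        if (!preg_match('/src="([^"]*)"/i', $attributes, $src)) {
            return $matches[0];
        }
        $size = @getimagesize($src[1]); // only works if the src is resolvable from here
        $dimensions = $size ? ' width="'.$size[0].'" height="'.$size[1].'"' : '';
        return '<amp-img'.$attributes.$dimensions.'></amp-img>';
    }, $html);
    return $html;
}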

Home-grown and Delicious

I’ve been using Delicious since 2005—back when it was del.icio.us. I have over 2,000 bookmarks stored there. I moved to Magnolia for a while but we all know how that ended.

Back then I wrote:

Really, I should be keeping my links here on adactio.com, maybe pinging Delicious or some other social bookmarking site as a back-up.

Recently Delicious updated its bookmarklet-conjured interface, not for the better. I thought that I could get used to the changes, but I found them getting more annoying over time. Once again, I began to toy with the idea of self-hosting my bookmarks. I even exported all my data into a big XML file.

The very next day, some of Yahoo’s shit hit the web’s fan. Delicious, it was revealed, was to be sunsetted. As someone who doesn’t randomly choose to use meteorological phenomena as verbs, I didn’t know what that meant, but it didn’t sound good.

As the twittersphere erupted in anger and indignation, I was able to share my recently-acquired knowledge:

curl https://{your username}:{your password}@api.del.icio.us/v1/posts/all to get an XML file of your Delicious bookmarks.

A lot of people immediately migrated to Pinboard, which looks like an excellent service (and happens to be the work of Maciej Ceglowski, one of the best bloggers ever to put pixels to screen).

After all that, it turns out that “sunsetting” doesn’t mean “shooting in the head”, it means something more like “flogging off”, as clarified on the Delicious blog. But the damage had been done and, anyway, I had already made up my mind to bring my bookmarks in-house, so I began a fun weekend of hacking.

Setting up a new section of the site for links and importing my Delicious bookmarks was pretty straightforward. Creating a bookmarklet was pretty easy too—I already had some experience of that with Huffduffer.

So now I’ll do my bookmarking right here on my own site. All’s well that ends well, right?

Well, not quite. Dom sounded a note of concern:

sigh. There goes the one thing I actually used delicious for, the social network. :(

Paul also pointed to the social aspect as the reason why he’s sticking with Delicious:

Personally, while I’ve always valued the site for its ability to store stuff, what’s always made Delicious most useful to me is its network pages in general, and mine in particular.

But it’s possible to have your Delicious cake and eat it at home. The Delicious API makes it quite easy to post links so I’ve added that into my own bookmarking code. Whenever I post a link here, it will also show up on my Delicious account. If you’re subscribed to my Delicious links, you should notice no change whatsoever.
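
Mirroring a link out to Delicious is a single request to the version 1 API. Here's a minimal sketch, with placeholder credentials and bookmark details:

<?php
// A minimal sketch of mirroring a bookmark out to Delicious via the v1 API.
$username = 'YOUR_DELICIOUS_USERNAME';
$password = 'YOUR_DELICIOUS_PASSWORD';

$parameters = http_build_query(array(
    'url'         => 'https://example.com/an-interesting-page',
    'description' => 'An interesting page',                     // the title, in Delicious parlance
    'extended'    => 'Notes about why I bookmarked this.',
    'tags'        => 'sci-fi frontend'
));

$ch = curl_init('https://api.del.icio.us/v1/posts/add?'.$parameters);
curl_setopt($ch, CURLOPT_USERPWD, $username.':'.$password);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
// The response is a small piece of XML along the lines of <result code="done"/> if all went well.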

This is exactly what Steven Pemberton was talking about when I liveblogged his XTech talk two years ago. Another Stephen, the good Mr. Hay, summed up the absurdity of the usual situation:

For a while we’ve posted our data all over the internet on all types of services. These services provide APIs so we can access the data we put into them, so that we can do things with that data. Read that again.

Now I’m hosting the canonical copies of my bookmarks, much like Tantek hosts the canonical copies of his tweets and syndicates them out to Twitter. Delicious gets to have my links as well, and I get to use Delicious as a tool for interacting with my data …only now I’m not limited to just what Delicious can offer me.

Once I had my new links section up and running, I started playing around with the Embedly API (I recently added the excellent oEmbed format to Huffduffer and I was impressed with its power). Whenever I bookmark a page with oEmbed support, I can pull content directly into my site. Take a look at the links I’ve tagged with “sci-fi” to see some examples of embedded Vimeo and Flickr content.
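
Fetching that embedded content is one request per bookmark. Here's a rough sketch, assuming Embedly's oEmbed endpoint and an API key:

<?php
// A rough sketch of fetching oEmbed data for a bookmarked URL via Embedly.
// The endpoint and key parameter are assumptions based on Embedly's oEmbed service.
$key = 'YOUR_EMBEDLY_API_KEY';
$bookmarked_url = 'https://vimeo.com/example';

$endpoint = 'https://api.embed.ly/1/oembed?'.http_build_query(array(
    'key' => $key,
    'url' => $bookmarked_url
));

$oembed = json_decode(file_get_contents($endpoint), true);

// For videos and other rich content, the response includes ready-made markup.
if (!empty($oembed['html'])) {
    echo $oembed['html'];
} elseif (!empty($oembed['url'])) {
    // For photos, build an img element from the returned URL and dimensions.
    echo '<img src="'.htmlspecialchars($oembed['url']).'" width="'.intval($oembed['width']).'" height="'.intval($oembed['height']).'" alt="">';
}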

I definitely prefer this self-hosting-with-syndication way of doing things. I can use a service like Delicious without worrying about it going tits-up and taking all my data with it. The real challenge is going to be figuring out a way of applying that model to Twitter and Flickr. I’m curious to see which milestone I’ll hit first: 10,000 tweets or 10,000 photos. Either way, that’s a lot of my content on somebody else’s servers.