Tags: berlin

Push without notifications

On the first day of Indie Web Camp Berlin, I led a session on going offline with service workers. This covered all the usual use-cases: pre-caching; custom offline pages; saving pages for offline reading.

But on the second day, Sebastiaan spent a fair bit of time investigating a more complex use of service workers with the Push API.

The Push API is what makes push notifications possible on the web. There are a lot of moving parts—browser, server, service worker—and, frankly, it’s way over my head. But I’m familiar with the general gist of how it works. Here’s a typical flow:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker creates a notification linking to the URL, interrupting the user, and generally adding to the weight of information overload.

Here’s what Sebastiaan wanted to investigate: what if that last step weren’t so intrusive? Here’s the alternate flow he wanted to test:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker fetches the contents of the URL provided in the push message and caches the page. Silently.

It worked.
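The silent step at the end can be sketched in a few lines of service worker code. This isn't Sebastiaan's actual code; the payload shape (a JSON object with a `url` property) and the cache name are assumptions for illustration:

```javascript
// Pull the URL out of the push message payload.
// Assumed payload shape: { "url": "https://example.com/new-post" }
function urlFromPushPayload(rawText) {
  return JSON.parse(rawText).url;
}

// Service worker wiring (only runs in a worker context with the Cache API):
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('push', function (event) {
    var url = urlFromPushPayload(event.data.text());
    // Fetch and cache the new page: no call to showNotification() anywhere.
    event.waitUntil(
      caches.open('pushed-pages').then(function (cache) {
        return cache.add(url);
      })
    );
  });
}
```

The interesting part is what's missing: there's no notification, just a quiet `cache.add()`.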

I think this could be a real game-changer. I don’t know about you, but I’m very, very wary of granting websites the ability to send me push notifications. In fact, I don’t think I’ve ever given a website permission to interrupt me with push notifications.

You’ve seen the annoying permission dialogues, right?

In Firefox, it looks like this:

Will you allow name-of-website to send notifications?

[Not Now] [Allow Notifications]

In Chrome, it’s:

name-of-website wants to

Show notifications

[Block] [Allow]

But in actual fact, these dialogues are asking for permission to do two things:

  1. Receive messages pushed from the server.
  2. Display notifications based on those messages.

There’s no way to ask for permission just to do the first part. That’s a shame. While I’m very unwilling to grant permission to be interrupted by intrusive notifications, I’d be more than willing to grant permission to allow a website to silently cache timely content in the background. It would be a more calm technology.
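The coupling shows up right in the code. When a site subscribes to push messages, Chrome will reject the subscription unless the site promises that every push will be made visible to the user. Here's a sketch; the VAPID key is a placeholder:

```javascript
// Build the options for PushManager.subscribe().
function subscribeOptions(vapidPublicKey) {
  return {
    userVisibleOnly: true, // Chrome refuses the subscription if this is false
    applicationServerKey: vapidPublicKey
  };
}

// Browser-only wiring:
if (typeof navigator !== 'undefined' && navigator.serviceWorker) {
  var vapidPublicKey = 'BASE64URL_ENCODED_PUBLIC_KEY'; // placeholder
  navigator.serviceWorker.ready.then(function (registration) {
    return registration.pushManager.subscribe(subscribeOptions(vapidPublicKey));
  });
}
```

That `userVisibleOnly: true` is exactly the promise I'd rather not have to make.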

Think of the use cases:

  • I grant push permission to a magazine. When the magazine publishes a new article, it’s cached on my device.
  • I grant push permission to a podcast. Whenever a new episode is published, it’s cached on my device.
  • I grant push permission to a blog. When there’s a new blog post, it’s cached on my device.

Then when I’m on a plane, or in the subway, or in any other situation without a network connection, I could still visit these websites and get content that’s fresh to me. It’s kind of like background sync in reverse.

There’s plenty of opportunity for abuse—the cache could get filled with content the user never asked for. But websites can already do that, and they don’t need to be granted any permissions to do so; just by visiting a website, it can add multiple files to a cache.

So it seems that the reason for the permissions dialogue is all about displaying notifications …not so much about receiving push messages from the server.

I wish there were a way to implement this background-caching pattern without requiring the user to respond to a permission dialogue that contains the word “notification.”

I wonder if the act of adding a site to the home screen could implicitly grant permission to allow use of the Push API without notifications?

In the meantime, the proposal for periodic synchronisation (using background sync) could achieve similar results, but in a less elegant way: periodically polling for new content instead of receiving a push message when new content is published. It also requires permission, but at least in that case the permission dialogue should be more specific, and wouldn’t include the word “notification” anywhere.
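For what it's worth, here's roughly how that polling alternative might look. The API shape follows Chrome's implementation of the proposal; the tag name and interval are arbitrary examples:

```javascript
// Build the options for periodicSync.register(): how often (at most)
// the browser should wake the service worker to poll for new content.
function periodicSyncOptions(hours) {
  return { minInterval: hours * 60 * 60 * 1000 }; // milliseconds between polls
}

// Browser-only wiring; 'periodicSync' only exists where the proposal is implemented:
if (typeof navigator !== 'undefined' && navigator.serviceWorker) {
  navigator.serviceWorker.ready.then(function (registration) {
    if ('periodicSync' in registration) {
      return registration.periodicSync.register('fresh-content', periodicSyncOptions(12));
    }
  });
}
```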

Webmentions at Indie Web Camp Berlin

I was in Berlin for most of last week, and every day was packed with activity:

By the time I got back to Brighton, my brain was full …just in time for FF Conf.

All of the events were very different, but equally enjoyable. It was also quite nice to just attend events without speaking at them.

Indie Web Camp Berlin was terrific. There was an excellent turnout, and once again, I found that the format was just right: a day of discussions (BarCamp style) followed by a day of doing (coding, designing, hacking). I got very inspired on the first day, so I was raring to go on the second.

What I like to do on the second day is try to complete two tasks; one that’s fairly straightforward, and one that’s a bit tougher. That way, when it comes time to demo at the end of the day, even if I haven’t managed to complete the tougher one, I’ll still be able to demo the simpler one.

In this case, the tougher one was also tricky to demo. It involved a lot of invisible behind-the-scenes plumbing. I was tweaking my webmention endpoint (stop sniggering—tweaking your endpoint is no laughing matter).

Up until now, I could handle straightforward webmentions, and I could handle updates (if I receive more than one webmention from the same link, I check it each time). But I needed to also handle deletions.

The spec is quite clear on this. A 404 isn’t enough to trigger a deletion—that might be a temporary state. But a status of 410 Gone indicates that a resource was once here but has since been deliberately removed. In that situation, any stored webmentions for that link should also be removed.
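The logic boils down to a tiny decision function that runs when a source URL is re-fetched. A sketch; the function and action names are mine, not the spec's:

```javascript
// Decide what to do with a stored webmention based on the HTTP status
// returned when re-fetching the source URL.
function actionForSourceStatus(status) {
  if (status === 410) return 'delete';                  // Gone: deliberately removed
  if (status >= 200 && status < 300) return 'reverify'; // page exists: re-check its content
  return 'keep';                                        // 404s etc. might be temporary
}
```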

Anyway, I think I got it working, but it’s tricky to test and even trickier to demo. “Not to worry”, I thought, “I’ve always got my simpler task.”

For that, I chose to add a little map to my homepage showing the last location I published something from. I’ve been geotagging all my content for years (journal entries, notes, links, articles), but not really doing anything with that data. This is a first step to doing something interesting with many years of location data.

I’ve got it working now, but the demo gods really weren’t with me at Indie Web Camp. Both of my demos failed. The webmention demo failed quite embarrassingly.

As well as handling deletions, I also wanted to handle updates where a URL that once linked to a post of mine no longer does. Just to be clear, the URL still exists—it’s not 404 or 410—but it has been updated to remove the original link back to one of my posts. I know this sounds like another very theoretical situation, but I’ve actually got an example of it on my very first webmention test post from five years ago. Believe it or not, there’s an escort agency in Nottingham that’s using webmention as a vector for spam. They post something that does link to my test post, send a webmention, and then remove the link to my test post. I almost admire their dedication.
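The check itself can be sketched naively like this; real code would parse the HTML properly rather than substring-matching:

```javascript
// Does the re-fetched source page still contain a link to the target URL?
// If not, the stored webmention should be removed.
function stillLinksTo(sourceHtml, targetUrl) {
  return sourceHtml.indexOf(targetUrl) !== -1;
}
```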

Still, I wanted to foil this particular situation so I thought I had updated my code to handle it. Alas, when it came time to demo this, I was using someone else’s computer, and in my attempt to right-click and copy the URL of the spam link …I accidentally triggered it. In front of a room full of people. It was mildly NSFW, but more worryingly, a potential Code Of Conduct violation. I’m very sorry about that.

Apart from the humiliating demo, I thoroughly enjoyed Indie Web Camp, and I’m going to keep adjusting my webmention endpoint. There was a terrific discussion around the ethical implications of storing webmentions, led by Sebastian, based on his epic post from earlier this year.

We established early in the discussion that we weren’t going to try to solve legal questions—like GDPR “compliance”, which varies depending on which lawyer you talk to—but rather try to figure out what the right thing to do is.

Earlier that day, during the introductions, I quite happily showed webmentions in action on my site. I pointed out that my last blog post had received a response from another site, and because that response was marked up as an h-entry, I displayed it in full on my site. I thought this was all hunky-dory, but now this discussion around privacy made me question some inferences I was making:

  1. By receiving a webmention in the first place, I was inferring a willingness for the link to be made public. That’s not necessarily true, as someone pointed out: a CMS could be automatically sending webmentions, which the author might be unaware of.
  2. If the linking post is marked up in h-entry, I was inferring a willingness for the content to be republished. Again, not necessarily true.

That second inference of mine—that publishing in a particular format somehow grants permissions—actually has an interesting precedent: Google AMP. Simply by including the Google AMP script on a web page, you are implicitly giving Google permission to store a complete copy of that page and serve it from their servers instead of sending people to your site. No terms and conditions. No checkbox ticked. No “I agree” button pressed.

Just sayin’.

Anyway, when it comes to my own processing of webmentions, I’m going to take some of the suggestions from the discussion on board. There are certain signals I could be looking for in the linking post:

  • Does it include a link to a licence?
  • Is there a restrictive robots.txt file?
  • Are there meta declarations that say noindex?

Each one of these could help me infer whether or not I should publish a webmention. I quickly realised that what we’re talking about here is an algorithm.

Despite its current usage to mean “magic”, an algorithm is a recipe. It’s a series of steps that contribute to a decision point. The problem is that, in the case of silos like Facebook or Instagram, the algorithms are secret (which probably contributes to their aura of magical thinking). If I’m going to write an algorithm that handles other people’s information, I don’t want to make that mistake. Whatever steps I end up codifying in my webmention endpoint, I’ll be sure to document them publicly.
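To make that concrete, here is one hypothetical codification of those signals. The names, the ordering, and the default are purely illustrative, not the algorithm I've settled on:

```javascript
// One possible recipe: opt-out signals veto publication,
// opt-in signals permit it, and the default is to hold off.
function shouldPublishWebmention(signals) {
  if (signals.noindex || signals.robotsDisallowed) return false; // author opted out
  return Boolean(signals.hasLicence || signals.isHEntry);        // positive signals
}
```

Whatever the exact steps turn out to be, the point is that they can be written down and published alongside the code that runs them.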

European tour

I’m recovering from an illness that laid me low a few weeks back. I had a nasty bout of man-flu which then led to a chest infection for added coughing action. I’m much better now, but alas, this illness meant I had to cancel my trip to Chicago for An Event Apart. I felt very bad about that. Not only was I reneging on a commitment, but I also missed out on an opportunity to revisit a beautiful city. But it was for the best. If I had gone, I would have spent nine hours in an airborne metal tube breathing recycled air, and then stayed in a hotel room with that special kind of air conditioning that hotels have that always seems to give me the sniffles.

Anyway, no point regretting a trip that didn’t happen—time to look forward to my next trip. I’m about to embark on a little mini tour of some lovely European cities:

  • Tomorrow I travel to Stockholm for Nordic.js. I’ve never been to Stockholm. In fact I’ve only set foot in Sweden on a day trip to Malmö to hang out with Emil. I’m looking forward to exploring all that Stockholm has to offer.
  • On Saturday I’ll go straight from Stockholm to Berlin for the View Source event organised by Mozilla. Looks like I’ll be staying in the east, which isn’t a part of the city I’m familiar with. Should be fun.
  • Alas, I’ll have to miss out on the final day of View Source, but with good reason. I’ll be heading from Berlin to Bologna for the excellent From The Front conference. Ah, I remember being at the very first one five years ago! I’ve made it back every second year since—I don’t need much of an excuse to go to Bologna, one of my favourite places …mostly because of the food.

The only downside to leaving town for this whirlwind tour is that there won’t be a Brighton Homebrew Website Club tomorrow. I feel bad about that—I had to cancel the one two weeks ago because I was too sick for it.

But on the plus side, when I get back, it won’t be long until Indie Web Camp Brighton on Saturday, September 24th and Sunday, September 25th. If you haven’t been to an Indie Web Camp before, you should really come along—it’s for anyone who has their own website, or wants to have their own website. If you have been to an Indie Web Camp before, you don’t need me to convince you to come along; you already know how good it is.

Sign up for Indie Web Camp Brighton here. It’s free and it’s a lot of fun.

The importance of owning your data is getting more awareness. To grow it and help people get started, we’re meeting for a bar-camp like collaboration in Brighton for two days of brainstorming, working, teaching, and helping.

Long day’s journey into Brighton

I spent what I thought would be my last few hours in Berlin wandering around with Jessica, walking in the footsteps of Leibniz. There was scant evidence of the master’s presence in the house of his student, , but the setting still lent itself to imagining him trying to build his , all the while hampered by the ongoing task of researching the family tree of the blue-blooded nitwits whose pictures still fill the walls of the palace.

After that we made our way to Schönefeld airport, accompanied by Stephanie. It was only once we got there that she realised she was at the wrong airport. Nothing a quick taxi ride couldn’t fix.

Jessica and myself were at the right airport but we clearly chose the wrong airline. Our EasyJet flight was delayed by five hours. But eventually we made it back to England and, after an expensive but comfortable taxi ride (because the train situation was hopeless) we arrived back in Brighton.

I enjoyed my time in Berlin although the Web 2.0 Expo was very much the mixed bag I thought it would be: some excellent presentations coupled with some dull keynotes. Still, it was a good opportunity to catch up with some good friends. I was keeping track of other good friends on Twitter: some of them were in Boston for the W3C Tech Plenary; more were in New York for the Future Of Web Design. It was a busy week for conferences. Even if I could master the art of , I’d still have a tough time deciding whether I’d want to be a fly on the wall at the CSS working group, listening to Malarkey interview Zeldman, or reading a story about Roy Orbison in clingfilm to a thousand puzzled Europeans.

Berlin, day 4

After a late night of German beer, I had my first non-early start since getting to Berlin. By the time I roused myself and made my way to the conference, I had missed most of the morning’s talks. I managed to catch Matt’s talk about the Olinda device. His presentation was excellent, as always.

I spent a little time in the corridors metaphorically picking fleas with my fellow geeks before they wandered off to hear the keynotes. Because I am neither a masochist nor lobotomised, I passed on the opportunity to hear the latest and greatest corporate product pitches.

Instead, I regrouped with Jessica and we headed to Potsdamer Platz for a spot of wursty lunch at the Christmas market there. We spent most of the subsequent afternoon exploring the film museum. Much as I enjoyed the paraphernalia from Metropolis and The Cabinet of Doctor Caligari, I was somewhat disappointed that the exhibits from the ’40s had nary a mention of my heroine, Hedy Lamarr (okay, technically she was Austrian so it’s understandable). Still, the opportunity to ogle large-sized projections of Louise Brooks compensated.

After a game of SMS tag with Stephanie, an evening of more metaphorical mutual grooming followed, culminating with cocktails in one of the few tiki bars in Berlin. They had sand on the floor and everything. Not exactly typisch Deutsch but a fun way to wrap up a conference.

Berlin, day 3

For the second morning in a row, I rose at an ungodly hour to make my way to the Web 2.0 Expo and clamber on stage. There wasn’t a huge crowd of people in the room but I was glad that anyone had made the effort to come along so early.

I greeted the attendees, “Guten Morgen, meine Dame und Herren.” That’s not a typo; I know that the plural is “Damen” but this isn’t a very diverse conference.

I proceeded to blather on about microformats and nanotechnology. People seemed to like it. Afterwards Matt told me that the whole buckyball building block analogy I was using reminded him of phenotropics, a subject he’s spoken on before. I need to investigate further… if nothing else, so that I can remedy the fact that the concept currently has no page on Wikipedia.

After my talk, I hung around just long enough to catch some of Steve Coast’s talk on OpenStreetMap and Mark’s talk on typography, both of which were excellent. I gave the keynotes a wide berth. Instead I hung out in the splendid food hall of the KaDeWe with Jessica and Natalie.

The evening was spent exercising my l33t dinner-organising skillz when, for the second night in a row, I was able to seat a gathering of geeks in the two digit figures. Berlin is a very accommodating city.

Berlin, day 2

Today the Web 2.0 Expo kicked off for real and I spent the day hanging out in the cavernous isolated venue. It’s a cold concrete brutalist building that makes me feel small and alienated. Actually, most of the time it feels like hanging out in a university, but that might just be all the bad coffee and cigarette smoke.

I started the day far too early by sitting on a panel. Just as happened at the Web 2.0 Expo in San Francisco, I somehow found myself on the opening panel of the design track. The subject matter was pretty similar too. Instead of being called The Hybrid Designer, this one was supposed to be Moving from 1.0 to 2.0 but Leisa and I decided that a better title would be Moving From Islands In The Stream to Super Best Friends’ Web (with the “islands in the stream” portion sung in our best Kenny Rogers and Dolly Parton voices). It was a fun panel to participate in; I’m not sure how much fun it was to listen to.

After that I listened in on David Recorden’s talk on Opening The Social Graph. Much as I dislike that term, the subject matter was great and David is an excellent presenter.

I skipped the next set of sessions to hang out with Carole before wandering into the expo hall to peruse the stands. There I found the people from Mister Wong giving away sandwiches. They reassured me that there were no hard feelings about that blog post.

Overall, the expo hall was pretty dull except for the presence of a radio-controlled blimp. Airships are inherently cool.

Then it was time for the keynotes. I had been dreading these. I would have just skipped them except, because I was going to be doing a two minute slot at the end of the keynotes, I had to be in the room sitting in the front row.

It was as arm-gnawingly bad as I expected during the product pitches from Microsoft, Netvibes and Amazon. The only thing that made it bearable was buzzword bingo. Quite a few people played along (it really does make the time pass faster) although nobody had the balls to stand up and shout “Bingo!”

The keynote segment was redeemed by the presence of Kathy Sierra. She gave a talk on Creating Passionate Users that was, as always, wonderful. She was a breath of fresh air in amongst all the self-congratulatory guff.

Then it was time for Ten Great Ideas In Twenty Minutes. Apparently the plan was for speakers to explain in two minutes why attendees should go to their talks. But I asked Brady beforehand if the idea could be a different one from my talk and he said Sure.

So I read a short story about a great idea: wrapping Roy Orbison in clingfilm. Despite my microphone cutting out halfway through (which was a technical hitch rather than censorship, I am assured), I managed to do it just about in time. I had been timing it the night before in my hotel room and a lot of the chapters from the Roy Orbison in Clingfilm novel can be read in under two minutes if you’re fast enough.

Perhaps I should explain myself…

I figured that everyone in the audience had a brochure that listed descriptions of each talk so I didn’t see the point in repeating easily-discoverable information. Given that people already knew the subject matter of the talks, the only reason for having the two minute blurbs must be to assess the speakers themselves; whether they will be entertaining and/or articulate. It’s the singer, not the song. So I figured that anybody who enjoyed hearing me read a story about Roy Orbison wrapped in clingfilm would probably get a kick out of my talk on The Beauty in Standards.

Anyway, isn’t Web 2.0 supposed to be all about social media and disruption? Frankly, I can’t think of a better definition of Web 2.0 than Roy Orbison in clingfilm.

After the two minute synopses, I went downstairs to deliver my talk. Not many people attended. Funny that.

Stephanie was there and, as usual, she did an excellent job of liveblogging the talk.

I heard later that none of the talks in that slot were very full except for the session on OpenSocial, which was rammed. I also heard it was quite lame—a repeat of the video that’s already online combined with plodding walkthroughs of demo apps.

I was planning to head straight back to my hotel after my talk but I got sucked in by Matt’s excellent talk on Coding on the Shoulders of Giants. Then I went back to my hotel before gathering together fifteen geeks and seeking out a good restaurant where we could fill our bellies with bodenständig German dishes. There was an official conference party happening as well but seeing as they couldn’t stretch to allowing non-attendees like Jessica in, I figured it probably wasn’t worth going to. Instead I argued with Tom and Cal about semantic markup and microformats over dinner.

Speaking of which, I’m talking first thing tomorrow on Microformats: the Nanotechnology of the Semantic Web so I’d better get my beauty sleep.

Berlin, day 1

Since arriving in Berlin this morning I have…

  1. eaten at a cute little imbiss,
  2. eaten a slice of with a cup of good coffee and
  3. eaten ludicrous amounts of stick-to-your-ribs gut bürgerliche Küche at a restaurant with some friends while sucking down .

I have yet to…

  1. figure out why I’ve agreed at the last minute to be on a panel at 9am tomorrow morning,
  2. go over my slides for my presentation tomorrow afternoon and
  3. figure out how I’m going to fill my two minutes in the “ten great ideas in twenty minutes” slot.

I’m thinking I could either…

  1. rant about portable social networks, the password anti-pattern and how Web standards and microformats can save us all or
  2. read out a short story from Roy Orbison in Clingfilm.

Berlin schedule

I’m off to Berlin tomorrow where I’ll spend the week immersed in the first European Web 2.0 Expo. I’m hoping that it won’t be the same mixed bag as the US counterpart: despite some good stuff, the lows were very low indeed.

I’ve been nominally serving on the board of advisors, helping to put together the design track. If nothing else, I passed along the names of Brian Suda, Mark Boulton and Jan Eric Hellbusch so the topics of microformats, typography and accessibility should be well covered. I’ll also be giving a couple of talks that I’ve already road-tested; Microformats: the Nanotechnology of the Semantic Web and The Beauty in Standards.

A full schedule is listed on the conference website but it’s marked up as a dead end. It always strikes me as a shame when someone goes to the bother of publishing event information without sprinkling the few extra class names needed to create an hCalendar. Here’s a hint to any conference organisers out there: Dmitry Baranovskiy’s conference schedule creator is rather excellent. Brian and myself used it to output a nice hCalendar version of the expo schedule.
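If you’ve never seen hCalendar, the whole format is just a handful of class names on ordinary markup. Something like this (the session details here are invented for illustration):

```html
<div class="vevent">
  <a class="url summary" href="http://example.com/talks/beauty-in-standards">The Beauty in Standards</a>:
  <abbr class="dtstart" title="2007-11-07T14:35+01:00">Wednesday, 2:35pm</abbr>,
  <span class="location">Berlin Congress Center</span>
</div>
```

A parser can lift events marked up like that straight into an iCalendar file, which is what makes a downloadable, subscribable schedule possible.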

I’ve added some CSS and put the markup online. If you’re in Berlin and you want a quick glance at what’s on, here’s a suitably short URL:

icanhaz.com/berlin

From there you can download the schedule or better yet, subscribe to the schedule. That way, if there are any changes to the line-up, I’ll edit the HTML and you’ll get those changes reflected in your calendar.

Bienen fliegen

My brief excursion to Berlin is at an end and I’m back in Brighton.

The prize-giving ceremony for the BIENE accessibility awards went well. It was a very professional affair in nifty surroundings. Champagne, canapés and short films from the folks at Ehrensenf made for a most pleasant awards ceremony. After the ceremony itself, a plentiful supply of food, beer and music ensured that the whole evening was enjoyable.

The highly valued prizes went to some very deserving websites. I can vouch for the fact that the jury was pretty strict in its judgement. Even as the prizes were being handed out on stage, the sweet taste of victory was tempered by some words advising where improvements could still be made.

Most of the winners sported valid markup; usually XHTML Transitional, sometimes even XHTML Strict. Quite a lot of the sites offered text-resizing facilities, though I wonder if that’s something best left to user agents. Joe will be pleased to note that many of the sites also offered zoom layouts.

The Pfizer website, winner of a golden Biene, includes a remarkable section that sets out to translate those bits of paper you get with your prescription into plain language… and sign language! The whole thing is done with Flash and it works wonderfully well with screenreaders. From a technical viewpoint, I’m really glad that I now have an example I can point to, should I ever find myself in one of those “Flash is inherently inaccessible” arguments.

I also felt that it was very important that the prize-winning websites should be well-crafted with strong visual design. The Barmer website is not only accessible, it looks good too. It’s extremely bulletproof with a semi-liquid layout. There’s more semi-liquid goodness to be had at the site of the Bundesrat—the federal council of Germany. I’m really impressed with the clarity and cleanliness of the design.

My personal favourite is the website of the Media Management department of the Wiesbaden Technical College. I like the nice clean design. They also offer material in plain language and sign language. It scales nicely, it’s usable and it’s accessible. But what impressed me most was the story behind the site.

The website was created by students. A small group put the whole thing together in three months. They did this as just 12.5% of their coursework, so there was a ton of other work they needed to attend to at the same time. Under the guidance of professor Stephan Schwarz, they learned about structuring documents with markup and styling with CSS. The end result is something that would put many “professional” agencies to shame. What a debut! An accessible, good-looking site from people who have learned Web design the right way, without ever having to nest a table.

I’m just blown away by their achievement. I requested, and was granted, the honour of awarding them their silver Biene on stage. That meant I had to speak German in front of a roomful of people (and television cameras) but I made it through without stumbling too much.

At South by Southwest earlier this year, Andys Budd and Clarke gave a talk on Web Design Superheroes. The students from Fachhochschule Wiesbaden are my heroes. If they represent the next generation of designers, the Web is in very good hands indeed.

Straight out of Wiesbaden

Hauptstadt

I lived in Germany for about five or six years in the nineties. In all that time, while I was ensconced in the beautiful Black Forest town of Freiburg, I never once made it to the capital. Now I’m finally here.

I was invited to come to Berlin to be part of the jury for the highly-prized Biene awards. This is quite an honour. In the Biene awards, the emphasis is on accessibility and the criteria are really quite strict. It’s no cliché to say that just being nominated is quite an achievement.

One of the restrictions on entries for the awards is that the site is primarily in German. I suspect that it’s my familiarity with the language that secured my place on the jury. The only problem is that I haven’t spoken German for six years.

Yesterday was judgment day. The jury gathered to debate and discuss the relative merits of the sites on offer. I had absolutely no problem understanding what everyone else was saying but as soon as I opened my mouth to add my opinion, I found that words and grammar were failing me at every turn. It was quite frustrating. I know if I was here for a few more days, it would all come back to me but having to dust down the German-speaking part of my brain after an interval of half a decade felt like quite a tough task.

I learned most of my German from sitting in pubs chatting with Germans, which is why I’m still fairly crap at reading and writing in the language. I usually find that my German improves greatly after one or two beers. Strangely though, after another three or four beers, I can’t understand a word anyone is saying. Komisch, nicht wahr?

The prize-giving ceremony will take place tonight. I can’t give away any of the results yet; that’s verboten. But I’ll definitely be blogging about some of the sites as soon as the pre-ceremony gag order is removed.

Until then, I have a few hours to explore Berlin. The good people from Aktion Mensch are putting me up at the ludicrously swanky Westin Grand, once the crown jewel of East Berlin. Its central location means that I’m just a short stroll away from the Brandenburg gate and plenty of other must-see attractions. Flickr demands pictorial evidence of such visual delights: I must obey.