Not The Post I Wanted To Be Writing… – Infrequently Noted
Phew! Alex seems to have calmed down. He’s responding to my concerns about exposing URLs in progressive web apps, but thankfully without the absolutist rhetoric or insults. Progress!
Y’know, I think PPK might be on to something here. It’s certainly true that developers have such an aversion to solving a problem twice that some users end up paying the cost (like in the examples of progressive enhancement here).
I will be pondering upon this.
I really, really like the approach that this JavaScript library is taking in treating Ajax as a progressive enhancement:
Turbolinks intercepts all clicks on <a href> links to the same domain. When you click an eligible link, Turbolinks prevents the browser from following it. Instead, Turbolinks changes the browser’s URL using the History API, requests the new page using XMLHttpRequest, and then renders the HTML response.
During rendering, Turbolinks replaces the current body element outright and merges the contents of the head element. The JavaScript window and document objects, and the HTML html element, persist from one rendering to the next.
Here’s the mustard it’s cutting:
It depends on the HTML5 History API and Window.requestAnimationFrame. In unsupported browsers, Turbolinks gracefully degrades to standard navigation.
This approach matches my own mental model for building on the web—I might try playing around with this on some of my projects.
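For anyone curious, the general shape of that approach looks something like the sketch below. This isn’t Turbolinks itself, just a rough illustration of the same idea: cut the mustard first, then enhance same-domain links with Ajax navigation, falling back to a normal page load otherwise.

// Not Turbolinks: a rough sketch of the same progressive-enhancement shape.
if (window.history && history.pushState && window.requestAnimationFrame && Element.prototype.closest) {
  document.addEventListener('click', function (event) {
    var link = event.target.closest('a[href]');
    // Anything that isn't a same-origin link gets handled by the browser as usual.
    if (!link || link.origin !== window.location.origin) {
      return;
    }
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', link.href);
    xhr.responseType = 'document'; // have the browser parse the response into a Document
    xhr.onload = function () {
      requestAnimationFrame(function () {
        document.title = xhr.response.title;
        document.body.innerHTML = xhr.response.body.innerHTML;
        history.pushState(null, '', link.href);
      });
    };
    xhr.onerror = function () {
      window.location.href = link.href; // network failure: fall back to a full page load
    };
    xhr.send();
  });
}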
So remember when I was talking about “the ends justify the means” being used for unwise short-term decisions? Here’s a classic example. Chris thinks that Progressive Web Apps should be made mobile-only (at least to start with …something something something the future):
For now, PWAs need to be the solution for the next mobile users.
End users deserve to have an amazing, form-factor specific experience.
I couldn’t disagree more. End users deserve to have an amazing experience no matter the form-factor of their device.
From the people who brought you…
The Command Line
…comes a new wave of typing out instructions:
⚡
C H A T B O T S
⚡ 🤖 ⚡
I highly recommend Remy’s State Of The Gap post—it’s ace. He summarises it like this:
I strongly believe in the concepts behind progressive web apps and even though native hacks (Flash, PhoneGap, etc) will always be ahead, the web always gets there. Now, today, is an incredibly exciting time to be building on the web.
I agree completely. That might sound odd after I wrote about Regressive Web Apps, but it’s precisely because I’m so excited by the technologies behind progressive web apps that I think it’s vital that we do them justice. As Remy says:
Without HTTPS and without service workers, you can’t add to homescreen. This is an intentionally high bar of entry with damn good reasons.
When the user installs a PWA, it has to work. It’s our job as web developers to provide the most excellent experience for our users.
It has to work.
That’s why I don’t agree with Dion’s metrics for what makes a progressive web app:
If you deliver an experience that only works on mobile is that a PWA? Yes.
I think it’s important to keep quality control high. Being responsive is literally the first item in the list of qualities that help define what a progressive web app is. That’s why I wrote about “regressive” web apps: sites that are supposed to showcase what we can do but instead take a step backwards into the bad old days of separate sites for separate device classes: washingtonpost.com/pwa, m.flipkart.com, lite.5milesapp.com, app.babe.co.id, m.aliexpress.com.
A lot of people on Twitter misinterpreted my post as saying “the current crop of progressive web apps are missing the mark, therefore progressive web apps suck”. What I was hoping to get across was “the current crop of progressive web apps are missing the mark, so let’s make better ones!”
Now, I totally understand that many of these examples are a first stab, a way of testing the waters. I absolutely want to encourage these first attempts and push them further. But I don’t think that waiving the qualifications for progressive web apps helps achieve that. As much as I want to acknowledge the hard work that people have done to create those device-specific examples, I don’t think we should settle for anything less than high-quality progressive web apps that are as much about the web as they are about apps.
Simply put, in this instance, I don’t think good intentions are enough.
Which brings me to the second part of Regressive Web Apps, the bit about Chrome refusing to show the “add to home screen” prompt for sites that want to have their URL still visible when launched from the home screen.
Alex was upset by what I wrote:
if you think the URL is going to get killed on my watch then you aren’t paying any attention whatsoever.
so, your choices are to think that I have a secret plan to kill URLs, or conclude I’m still Team Web.
I’m galled that anyone, particularly you @adactio, would think the former…but contrarianism uber alles?
I am very, very sorry that I upset Alex like this.
But I stand by my criticism of the actions of the Chrome team. Because good intentions are not enough.
I know that Alex is a huge fan of URLs, and of the web. Heck, just about everybody I know who works on Chrome in some capacity is working for the web first and foremost: Alex, Jake, various and sundry Pauls. But that doesn’t mean I’m going to stay quiet when I see the Chrome team do something I think is bad for the web. If anything, it’s precisely because I hold them to a high standard that I’m going to sound the alarm when I see what I consider to be missteps.
I think that good people can make bad decisions with the best of intentions. Usually it involves long-term thinking—something I think is very important. “The ends justify the means” is a way of thinking that can create a lot of immediate pain, even if it means a better future overall. Balancing those concerns is front and centre of the Chromium project:
As browser implementers, we find that there’s often tension between (a) moving the web forward and (b) preserving compatibility. On one hand, the web platform API surface must evolve to stay relevant. On the other hand, the web’s primary strength is its reach, which is largely a function of interoperability.
For example, when Alex talks of the Web Component era as though it were an inevitability, I get nervous. Not for myself, but for the millions of Opera Mini users out there. How do we get to a better future without leaving anyone behind? Or do we sacrifice those people for the greater good? Do the needs of the many outweigh the needs of the few? Do the ends justify the means?
Now, I know for a fact that the end-game that Alex is pursuing with web components—and the extensible web manifesto in general—is a more declarative web: solutions that first get tackled as web components end up landing in browsers. But to get there, the solutions are first created using modern JavaScript that simply doesn’t work everywhere. Is that the price we’re going to have to pay for a better web?
I hope not. I hope we can find ways to have our accessible cake and eat it too. But it will be really, really hard.
Returning to progressive web apps, I was genuinely shocked and appalled at the way that the Chrome team altered the criteria for the “add to home screen” prompt to discourage exposing URLs. I was also surprised at how badly the change was communicated—it was buried in a bug report that five people contributed to before pushing the change. I only found out about it through a conversation with Paul Kinlan. Paul encouraged me to give feedback, and that’s what I did on my website, just like Stuart did on his.
Of course the Chrome team are working on ways of exposing URLs within progressive web apps that are launched from the home screen. Opera are working on it too. But it’s a really tricky problem to solve. It’s not enough to say “we’ll figure it out”. It’s not enough to say “trust us.”
I do trust the people I know working on Chrome. I also trust the people I know at Mozilla, Opera and Microsoft. That doesn’t mean I’m going to let their actions go unquestioned. Good intentions are not enough.
As Alex readily acknowledges, the harder problem (figuring out how to expose URLs) should have been solved first—then the change to the “add to home screen” metrics would be uncontentious. Putting the cart before the horse, discouraging display: browser now, while saying “trust us, we’ll figure it out”, is another example of saying the ends justify the means.
But the stakes are too high here to let this pass. Good intentions are not enough. Knowing that the people working on Chrome (or Firefox, or Opera, or Edge) are good people is not reason enough to passively accept every decision they make.
Alex called me out for not getting in touch with him directly about the Chrome team’s future plans with URLs, but again, that kind of rough consensus to do something is trumped by running code. Also, I did talk to Chrome people—this all came out of a discussion with Paul Kinlan. I don’t know who’s who in the company’s political hierarchy and I don’t think I should need an org chart to give feedback to Google (or Mozilla, or Opera, or Microsoft).
You’ll notice that I didn’t include Apple there. I don’t hold them to the same high standard. As it turns out, I know some very good people at Apple working on WebKit and Safari. As individuals, they care about the web. But as a company, Apple has shown indifference towards web developers. As Remy put it:
Even getting the hint of interest from Apple is a process of dumpster-diving the mailing lists scanning for the smallest hint of interest.
With that in mind, I completely understand Alex’s frustration with my post on “regressive” web apps. Although I intended it as a push towards making better progressive web apps, I can see how it could be taken as confirmation by those who think that progressive web apps aren’t worth investing in. Apple, for example. As it is, they’ll have to be carried kicking and screaming into adding support for Service Workers, manifest files, and other building blocks. From the reaction to my post from at least one WebKit developer on Twitter, not only did I fail to get across just how important the technologies behind progressive web apps are, I may have done more harm than good, giving ammunition to sceptics.
Still, I hope that most people took my words in the right spirit, like Addy:
We should push them to do much better. I’ll file bugs. Per @adactio post, can’t forget the ‘Progressive’ part of PWAs
Seeing that reaction makes me feel good …but seeing Alex’s reaction makes me feel bad. Very bad. I’m genuinely sorry that I made Alex feel that way. It wasn’t my intention but, well …good intentions are not enough.
I’ve been looking back at what I wrote, trying to see it through Alex’s eyes, looking for the parts that could be taken as a personal attack:
Chrome developers have decided that displaying URLs is not “best practice” … To declare that all users of all websites will be confused by seeing a URL is so presumptuous and arrogant that it beggars belief. … Withholding the “add to home screen” prompt like that has a whiff of blackmail about it. … This isn’t the first time that Chrome developers have made a move against the address bar. It’s starting to grind me down.
Some pretty strong words there. I stand by them, but the tone is definitely strident.
When we criticise something—a piece of software, a book, a website, a film, a piece of music—it’s all too easy to forget that there are real people behind it. But that isn’t the case here. I know that there are real people working on Chrome, because I know quite a few of those people. I also know that their intentions are good. That’s not a reason for me to remain silent—that’s a reason for me to speak up.
If I had known that my post was going to upset Alex, would I have still written it? That’s a tough one. On the one hand, this is a topic I care passionately about. I think it’s vital that we don’t compromise on the very things that make the web great. On the other hand, who knows if what I wrote will make the slightest bit of difference? In which case, I got the catharsis of getting it off my chest but at the price of upsetting somebody I respect. That price feels too high.
I love the fact that I can publish whatever I want on my own website. It can be a place for me to be enthusiastic about things that excite me, and a place for me to rant about things that upset me. I estimate that the enthusiastic stuff outnumbers the ranty stuff by about ten to one, but negativity casts a disproportionately large shadow.
I need to get better at tempering my words. Not that I’m going to stop criticising bad decisions when I see them, but I need to make my intentions clearer …because just having good intentions is not enough. Throughout this post, I’ve mentioned repeatedly how much I respect the people I know working on the Chrome team. I should have said that in my original post.
Matthias Beitl takes a stab at trying to tackle the tricky UI problem of exposing the URLs of Progressive Web Apps. This stuff is hard.
Here’s a handy directory of scripts that set out to solve one problem without any dependencies. Useful for poking at, picking apart, and learning from.
I know exactly how Tim feels. It’s hard not to feel guilty when you’re reading something instead of spending the time doing “real work”, but it always ends up being time well spent:
Reading time can be hard to justify, even to oneself. There is no deadline. It’s not going to move any immediate projects forward (most likely). And it often feels like a waste of time, especially if your interests are diverse. But it’s important. Most great work is the product of collaborative thinking.
Ensure that your class names never go out of sync with your style declarations with this one simple trick:
Take any CSS rule you want to apply, replace : by -, and dots by -dot-, and you get the name of the corresponding universal css classname.
The only thing missing is immutability, so I would suggest also putting !important after each declaration in the CSS. Voila! No more specificity battles.
A nice little collection of interaction patterns with built-in accessibility and no dependencies.
Peppers.
The Working Draft podcast is usually in German, but this episode is in English. It was recorded in a casual way by a bunch of people soaking up the sun sitting outside the venue at Beyond Tellerrand. Initially that was PPK and Chris, but then I barged in half way through. Good fun …if you’re into nerdy discussions about browsers, standards, and the web. And the sound quality isn’t too bad, considering the circumstances under which this was recorded.
I’ve got a fairly simple posting interface for my notes. A small textarea, an optional file upload, some checkboxes for syndicating to Twitter and Flickr, and a submit button.
It works fine although sometimes the experience of uploading a file isn’t great, especially if I’m on a slow connection out and about. I’ve been meaning to add some kind of Ajax-y progress type thingy for the file upload, but never quite got around to it. To be honest, I thought it would be a pain.
But then, in his excellent State Of The Gap hit parade of web technologies, Remy included a simple file upload demo. Turns out that all the goodies that have been added to XMLHttpRequest have made this kind of thing pretty easy (and I’m guessing it’ll be easier still once we have fetch).
I’ve made a little script that adds a progress bar to any forms that are POSTing data.
Feel free to use it, adapt it, and improve it. It isn’t using any ES6iness so there are some obvious candidates for improvement there.
It’s working a treat on my little posting interface. Now I can stare at a slowly-growing progress bar when I’m out and about on a slow connection.
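If you’d rather roll your own, the gist of it looks something like this. It’s a bare-bones sketch rather than the actual script: the cut-the-mustard check, the generated progress element, and the redirect at the end are all illustrative choices.

// A rough sketch of the idea: show upload progress for any POST form via XMLHttpRequest.
(function () {
  // Cut the mustard: without these, the form just submits the old-fashioned way.
  if (!window.XMLHttpRequest || !window.FormData || !window.addEventListener) {
    return;
  }
  var forms = document.querySelectorAll('form[method="post"]');
  Array.prototype.forEach.call(forms, function (form) {
    form.addEventListener('submit', function (event) {
      event.preventDefault();
      var bar = document.createElement('progress');
      bar.max = 100;
      bar.value = 0;
      form.appendChild(bar);
      var xhr = new XMLHttpRequest();
      xhr.open('POST', form.action);
      xhr.upload.addEventListener('progress', function (e) {
        if (e.lengthComputable) {
          bar.value = Math.round((e.loaded / e.total) * 100);
        }
      });
      xhr.addEventListener('load', function () {
        // Once the POST is done, go wherever the server sent us (or back to the form's action).
        window.location.href = xhr.responseURL || form.action;
      });
      xhr.send(new FormData(form));
    });
  });
})();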
I’ve been updating my book sites over to HTTPS:
They’re all hosted on the same (virtual) box as adactio.com—Ubuntu 14.04 running Apache 2.4.7 on Digital Ocean. If you’ve got a similar configuration, this might be useful for you.
First off, I’m using Let’s Encrypt. Except I’m not. It’s called Certbot now (I’m not entirely sure why).
I installed the Let’s Encertbot client with this incantation (which, like everything else here, will need root-level access, so if none of these work, retry using sudo in front of the commands):
wget https://dl.eff.org/certbot-auto
chmod a+x certbot-auto
Seems like a good idea to put that certbot-auto thingy into a directory like /etc:
mv certbot-auto /etc
Rather than have Certbot generate conf files for me, I’m just going to have it generate the certificates. Here’s how I’d generate a certificate for yourdomain.com:
/etc/certbot-auto --apache certonly -d yourdomain.com
The first time you do this, it’ll need to fetch a bunch of dependencies and it’ll ask you for an email address for future reference (should anything ever go screwy). For subsequent domains, the process will be much quicker.
The result of this will be a bunch of generated certificates that live here:
/etc/letsencrypt/live/yourdomain.com/cert.pem
/etc/letsencrypt/live/yourdomain.com/chain.pem
/etc/letsencrypt/live/yourdomain.com/privkey.pem
/etc/letsencrypt/live/yourdomain.com/fullchain.pem
Now you’ll need to configure your Apache gubbins. Head on over to…
cd /etc/apache2/sites-available
If you only have one domain on your server, you can just edit default-ssl.conf. I prefer to have separate conf files for each domain.
Time to fire up an incomprehensible text editor.
nano yourdomain.com.conf
There’s a great SSL Configuration Generator from Mozilla to help you figure out what to put in this file. Following the suggested configuration for my server (assuming I want maximum backward-compatibility), here’s what I put in.
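Broadly speaking, the conf file ends up looking something like the sketch below. Treat it as a rough outline rather than gospel: the domain and document root are placeholders, and the protocol and cipher settings should come straight from the generator.

<VirtualHost *:443>
    ServerName yourdomain.com
    DocumentRoot /path/to/yourdomain.com

    SSLEngine on
    SSLCertificateFile      /etc/letsencrypt/live/yourdomain.com/cert.pem
    SSLCertificateKeyFile   /etc/letsencrypt/live/yourdomain.com/privkey.pem
    SSLCertificateChainFile /etc/letsencrypt/live/yourdomain.com/chain.pem

    # Protocols and ciphers: paste in whatever the Mozilla SSL Configuration
    # Generator recommends for your versions of Apache and OpenSSL.
</VirtualHost>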
Make sure you update the /path/to/yourdomain.com part—you probably want a directory somewhere in /var/www or wherever your website’s files are sitting.
To exit the infernal text editor, hit ctrl and o, press enter in response to the prompt, and then hit ctrl and x.
If the yourdomain.com.conf didn’t previously exist, you’ll need to enable the configuration by running:
a2ensite yourdomain.com
Time to restart Apache. Fingers crossed…
service apache2 restart
If that worked, you should be able to go to https://yourdomain.com and see a lovely shiny padlock in the address bar.
Assuming that worked, everything is awesome! …for 90 days. After that, your certificates will expire and you’ll be left with a broken website.
Not to worry. You can update your certificates at any time. Test for yourself by doing a dry run:
/etc/certbot-auto renew --dry-run
You should see a message saying:
Processing /etc/letsencrypt/renewal/yourdomain.com.conf
And then, after a while:
** DRY RUN: simulating 'certbot renew' close to cert expiry
** (The test certificates below have not been saved.)
Congratulations, all renewals succeeded.
You could set yourself a calendar reminder to do the renewal (without the --dry-run bit) every few months. Or you could tell your server’s computer to do it by using a cron job. It’s not nearly as rude as it sounds.
You can fire up and edit your list of cron tasks with this command:
crontab -e
This tells the machine to run the renewal task at quarter past six every evening and log any results:
15 18 * * * /etc/certbot-auto renew --quiet >> /var/log/certbot-renew.log
(Don’t worry: it won’t actually generate new certificates unless the current ones are getting close to expiration.) Leave the crontab editor by doing the ctrl o, enter, ctrl x dance.
Hopefully, there’s nothing more for you to do. I say “hopefully” because I won’t know for sure myself for another 90 days, at which point I’ll find out whether anything’s on fire.
If you have other domains you want to secure, repeat the process by running:
/etc/certbot-auto --apache certonly -d yourotherdomain.com
And then creating/editing /etc/apache2/sites-available/yourotherdomain.com.conf accordingly.
I found these useful when I was going through this process:
That last one is good if you like the warm glow of accomplishment that comes with getting a good grade:
For extra credit, you can run your site through securityheaders.io to harden your headers. Again, not as rude as it sounds.
You know, I probably should have said this at the start of this post, but I should clarify that any advice I’ve given here should be taken with a huge pinch of salt—I have little to no idea what I’m doing. I’m not responsible for any flame-bursting-into that may occur. It’s probably a good idea to back everything up before even starting to do this.
Yeah, I definitely should’ve mentioned that at the start.
Pixelated Star Wars characters almost, but not quite, in chronological order.
Short ribs, salad, and grilled veggies.
A feast o’ veggies.
Dave explains the thinking behind his responsive table pattern I linked to a while back. He’s at pains to point out that you should always make sure a pre-made pattern is right for you instead of just deploying it no-questions-asked:
Using prefabricated, road tested solutions from Apple’s Human Interface Guidelines, Google’s Material Design, Twitter’s Bootstrap, and Brad Frost’s Responsive Patterns is always a good place to start, but don’t settle there. My biggest advice would be to turn off the 27” display and use your sites and projects on your phone, there’s lots of low hanging fruit that could give way to new patterns, tailor-suited to your content.
A profile of Chesley Bonestell. It’s amazing to think how much of his work was produced before we had even left this planet.
Remy looks at the closing gap between native and web. Things are looking pretty damn good for the web, with certain caveats:
The web is the long game. It will always make progress. Free access to both consumers and producers is a core principle. Security is also a core principle, and sometimes at the costs of ease to the developer (but if it were easy it wouldn’t be fun, right?).
That’s why there’ll always be some other technology that’s ahead of the web in terms of features, but those features give the web something to aim for:
Flash was the plugin that was ahead of the web for a long time, it was the only way to play video for heavens sake!
Whereas before we needed polyfills like PhoneGap (whose very reason for existing is to make itself obsolete), now with progressive web apps, we’re proving the philosophy behind PhoneGap:
If the web doesn’t do something today it’s not because it can’t, or won’t, but rather it is because we haven’t gotten around to implementing that capability yet.
Sausage and egg.
The markup here (with proprietary inline attributes for styling) is a terrible idea but the demo that accompanies is great at showing how flexbox works …I just wish it didn’t try to abstract away the CSS. This is so close to being a really good learning tool for flexbox.
Chicken with chickpeas and chorizo.
Jason takes a good look at the browser support for autocomplete values and then makes a valiant attempt to make up for the complete lack of documentation for Safari’s credit card scanning.
Wishing I could be at @UpFrontConf right now to see @LotteJackson’s talk, From Pages To Patterns.
A really interesting proposal for more logic constructs in CSS: when/else conditions. At first glance, this looks like it would complicate the language (and one of the most powerful features of CSS is its simplicity), but when you dig a bit deeper you realise that there’s nothing new enabled by this extra syntax—it actually simplifies what’s already possible.
This looks like a really interesting server-side framework for Ruby developers. The documentation is nice and clear, and puts progressive enhancement at the heart of its approach.
Snook has been on a roll lately, sharing lots of great insights into front-end development. This is a particularly astute post about that perennial issue of naming things.
A heartfelt call to web developers to consider the needs of the many and varied people trying to use what we build.
None of this is about Javascript. None of this is about CSS transforms or WebGL. None of this is about technology at all.
It is about making products that serve all users equally. It is about putting ourselves in others’ shoes. It is about trying to imagine the frustration and difficulty of using our products when the conditions aren’t what we’re used to. It is about being human.
Got a sneak peek at @lottejackson’s talk for tomorrow’s @UpFrontConf—it’s going to be so good!
I love this illustration that Jess made of my Resilience talk at the Render conference.
Ariel and Lisa have redesigned the excellent Spacehack site and it’s looking stellar!
There were plenty of talks about building for the web at this year’s Google I/O event. That makes a nice change from previous years when the web barely got a look in and you’d be forgiven for thinking that Google I/O was an event for Android app developers.
This year’s event showed just how big Google is, and how it doesn’t have one party line when it comes to the web and native. At the same time as there were talks on Service Workers and performance for the web, there was also an unveiling of Android Instant Apps—a full-frontal assault on the web. If you thought it was annoying when websites door-slammed you with intrusive prompts to install their app, just wait until they don’t need to ask you anymore.
I've been "Maybe I'll go Android" for awhile but today's announcement of http:// links getting hijacked into apps got me all like Nope.
— Dave Rupert (@davatron5000) May 18, 2016
Peter has looked a bit closer at Android Instant Apps and I think he’s as puzzled as I am. Either they are sandboxed to have similar permission models to the web (in which case, why not just use the web?) or they allow more access to native APIs in which case they’re a security nightmare waiting to happen. I’m guessing it’s probably the former.
Meanwhile, a different part of Google is fighting the web’s corner. The buzzword du jour is Progressive Web Apps, originally defined by Alex as:
A lot of those points are shared by good native apps, but the first and last points in that list are key features of the web: being responsive and linkable.
Alas many of the current examples of so-called Progressive Web Apps are anything but. Flipkart and The Washington Post have made Progressive Web Apps that are getting lots of good press from Google, but are mobile-only.
Looking at most of the examples of Progressive Web Apps, there’s an even more worrying trend than the return to m-dot subdomains. It looks like most of them are concentrating so hard on the “app” part that they’re forgetting about the “web” bit. That means they’re assuming that modern JavaScript is available everywhere.
Alex pointed to shop.polymer-project.org as an example of a Progressive Web App that is responsive as well as being performant and resilient to network failures. It also requires JavaScript (specifically the Polymer polyfill for web components) to render some text and images in a browser. If you’re using the “wrong” browser—like, say, Opera Mini—you get nothing. That’s not progressive. That’s the opposite of progressive. The end result may feel very “app-like” if you’re using an approved browser, but throwing the users of other web browsers under the bus is the very antithesis of what makes the web great. What does it profit a website to gain app-like features if it loses its soul?
I’m getting very concerned that the success criterion for Progressive Web Apps is changing from “best practices on the web” to “feels like native.” That certainly seems to be how many of the current crop of Progressive Web Apps are approaching the architecture of their sites. I think that’s why the app-shell model is the one that so many people are settling on.
Personally, I’m not a fan of the app-shell model. I feel that it prioritises exactly the wrong stuff—the interface is rendered quickly while the content has to wait. It feels weirdly like a hangover from Appcache. I also notice it being used as a get-out-of-jail-free card, much like the ol’ “Single Page App” descriptor; “Ah, I can’t do progressive enhancement because I’m building an app shell/SPA, you see.”
But whatever. That’s just, like, my opinion, man. Other people can build their app-shelled SPAs and meanwhile I’m free to build websites that work everywhere, and still get to use all the great technologies that power Progressive Web Apps. That’s one of the reasons why I’ve been quite excited about them—all the technologies and methodologies they promote match perfectly with my progressive enhancement approach: responsive design, Service Workers, good performance, and all that good stuff.
I hope we’ll see more examples of Progressive Web Apps that don’t require JavaScript to render content, and don’t throw away responsiveness in favour of a return to device-specific silos. But I’m not holding my breath. People seem to be so caught up in the attempt to get native-like functionality that they’re willing to give up the very things that make the web great.
For example, I’ve seen people use a meta viewport declaration to disable pinch-zooming on their sites. As justification they point to the fact that you can’t pinch-zoom in most native apps, therefore this web-based app should also prohibit that action. The inability to pinch-zoom in native apps is a bug. By also removing that functionality from web products, people are reproducing unnecessary bugs. It feels like a cargo-cult approach to building for the web: slavishly copy whatever native is doing …because everyone knows that native apps are superior to websites, right?
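(For clarity, the kind of declaration I mean is a viewport meta element along these lines, where user-scalable=no is the part that switches off pinch-zooming.)

<meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no">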
Here’s another example of the cargo-cult imitation of native. In your manifest JSON file, you can declare a display property. You can set it to browser, standalone, or fullscreen. If you set it to standalone or fullscreen then, when the site is launched from the home screen, it won’t display the address bar. If you set the display property to browser, the address bar will be visible on launch. Now, personally I like to expose those kind of seams:
The idea of “seamlessness” as a desirable trait in what we design is one that bothers me. Technology has seams. By hiding those seams, we may think we are helping the end user, but we are also making a conscious choice to deceive them (or at least restrict what they can do).
Other people disagree. They think it makes more sense to hide the URL. They have a genuine concern that users will be confused by launching a website from the home screen in a browser (presumably because the user’s particular form of amnesia caused them to forget how that icon ended up on their home screen in the first place).
Fair enough. We’ll agree to differ. They can set their display property how they want, and I can set my display property how I want. It’s a big web after all. There’s no one right or wrong way to do this. That’s why there are multiple options for the values.
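For what it’s worth, the whole choice boils down to a single line in the manifest file. Here’s a bare-bones example (the name and start_url values are placeholders):

{
  "name": "Example Site",
  "start_url": "/",
  "display": "browser"
}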
Or, at least, that was the situation until recently…
Remember when I wrote about how Chrome on Android will show an “add to home screen” prompt if your Progressive Web App fulfils a few criteria?
Well, those goalposts have moved. There is now a new criterion:
The site must not use a display value of browser.
Chrome developers have decided that displaying URLs is not “best practice”. It was filed as a bug.
A bug.
Displaying URLs.
A bug.
I’m somewhat flabbergasted by this. The killer feature of the web—URLs—are being treated as something undesirable because they aren’t part of native apps. That’s not a failure of the web; that’s a failure of native apps.
Now, don’t get me wrong. I’m not saying that everyone should be setting their display property to browser. That would be far too prescriptive. I’m saying that it should be a choice. It should depend on the website. It should depend on the expectations of the users of that particular website. To declare that all users of all websites will be confused by seeing a URL is so presumptuous and arrogant that it beggars belief.
I wouldn’t even have noticed this change of policy if it weren’t for the newly-released Lighthouse tool for testing Progressive Web Apps. The Session gets a good score but under “Best Practices” there was a red mark against the site for having display: browser. Turns out that’s the official party line from Chrome.
Just to clarify: you can have a site that has literally no HTML or turns away entire classes of devices, yet officially follows “best practices” and gets rewarded with an “add to home screen” prompt. But if you have a blazingly fast responsive site that works offline, you get nothing simply because you don’t want to hide URLs from your users:
I want people to be able to copy URLs. I want people to be able to hack URLs. I’m not ashamed of my URLs …I’m downright proud.
Stuart argues that this is a paternal decision:
The app manifest declares properties of the app, but the display property isn’t about the app; it’s about how the app’s developer wants it to be shown. Do they want to proudly declare that this app is on the web and of the web? Then they’ll add the URL bar. Do they want to conceal that this is actually a web app in order to look more like “native” apps? Then they’ll hide the URL bar.
I think there’s something to that, but digging deeper, developers and designers don’t make decisions like that in isolation. They’re generally thinking about what’s best for users. So, yes, absolutely, different apps will have different display properties, but that shouldn’t be down to the belief system of the developer; it should be down to the needs of the users …the specific needs of the specific users of that specific app. For the Chrome team to come down on one side or the other and arbitrarily declare that one decision is “correct” for every single Progressive Web App that is ever going to be built …that’s a political decision. It kinda feels like an abuse of power to me. Withholding the “add to home screen” prompt like that has a whiff of blackmail about it.
The other factors that contribute to the “add to home screen” prompt (serving over HTTPS, registering a Service Worker, providing a manifest file) are pretty uncontroversial.
This isn’t the first time that Chrome developers have made a move against the address bar. It’s starting to grind me down.
Up until now I’ve been a big fan of Progressive Web Apps. I understood them to be combining the best of the web (responsiveness, linkability) with the best of native (installable, connectivity independent). Now I see that balance shifting towards the native end of the scale at the expense of the web’s best features. I’d love to see that balance restored with a little less emphasis on the “Apps” and a little more emphasis on the “Web.” Now that would be progressive.
Hmmm …I think Jeffrey might have just given me my new job title.
Happy birthday, @clagnut!
On the beach.
I am shocked and disgusted by this arbitrary decision by the Chrome team. If your Progressive Web App doesn’t set its manifest to obscure its URL, you get punished by missing out on the add to home screen prompt.
Strongly disagree with Lighthouse wanting “Manifest’s display property set to standalone/fullscreen to allow launching without address bar.”
Running https://thesession.org through https://github.com/GoogleChrome/lighthouse and getting a pretty good score.
A typically superb article by Aaron. Here, he breaks down a resilient approach to building for the web by examining the multiple ways you could add a button to a page. There’s a larger lesson here too:
We don’t control where our web-based products go or how our users access them. All we can do is imagine as many less-than-perfect scenarios as possible and do our best to ensure our creations will continue to do what they’re supposed to do. One of the easiest ways to do that is to be aware of and limit our dependencies.
Unfollowing people on Twitter who think their Game Of Thrones spoilers are oh-so-clever.
Here’s a fantastic and free little book by Adam Scott. It’s nice and short, covering progressive enhancement, universal JavaScript, accessibility, and inclusive forms.
Download it now and watch this space for more titles around building inclusive web apps, collaboration, and maintaining privacy and security.
Did I mention that it’s free?
I’ll be speaking to students in Vasilis’s class for Communication and Multimedia Design in Amsterdam right before CSS Day. There are (free) tickets available if you’re around. I’ll be talking about digital preservation and long-term thinking on the web.
This meetup is, like all other Icons Meetups, free to attend for everyone. For students and lecturers of CMD Amsterdam, of course. But also for all professional (digital) designers who want to be inspired.
A fascinating thought experiment from Ted Chiang:
So let’s imagine a world in which Chinese characters were never invented in the first place. Given such a void, the alphabet might have spread east from India in a way that it couldn’t in our history, but, to keep this from being an Indo-Eurocentric thought experiment, let’s suppose that the ancient Chinese invented their own phonetic system of writing, something like the modern Bopomofo, some thirty-two hundred years ago. What might the consequences be?
Bidding farewell to Boston after a thoroughly lovely week.
Short ribs.
Watching @beep grill up some asparagus and short ribs.
I’ve been poking around at Google’s information on “instant apps” since they announced it at Google I/O. My initial impressions mirror Peter’s.
Either they allow access to more device APIs (which could be a massive security hole) or else they’re more or less websites.
Bahstahn.
Turbo-charging the clam chowder.
Fried oysters.
Rorschach is making herself comfortable.
Yoghurt with walnuts, grapes, thyme and honey.
While the open web still exists, we really dropped the ball protecting and strengthening it. Fewer people’s first choice for publishing is to start a web site hosted at their own domain. Like the destruction of Pennsylvania Station, sometimes you only know in hindsight that you’ve made a mistake. We were so caught up in Twitter and Facebook that we let the open web crumble. I’m not giving up — I think we can get people excited about blogging and owning their own content again — but it would have been easier if we had realized what we lost earlier.
By publishing to my own web site first…
- I feel like I’m curating a library rather than throwing loose papers into a raging torrent.
- I have the ability to quickly move to another platform if I so wish
- I can choose how things look and feel
- I can track, or not track, any metric I’d like to
- I can publish several different types of media: photos, audio
- I can turn discussion on or off
Derponauts: The Movie
Sampling some Aeronaut beers.
Hanging out at Harvard: @wordridden, @beep and @drinkerthinker.
Clearleft sighting in Net magazine—two Bens and a Clare.
This stone.
Stone ligature.
Cape Cod clams.
Doing Cambridge with @beep and @drinkerthinker.
A good introduction to the Indie Web approach:
This post was primarily directed at friends and colleagues that already blog in other spaces, and wonder why/how they would re-post content to Medium or elsewhere.
When I wrote a few words about progressive enhancement recently, I linked to Karolina’s great article The Web Isn’t Uniform. I was a little reluctant to link to it, not because of the content—which is great—but because of its location on Ev’s blog. I much prefer to link directly to people’s own websites (I have a hunch that those resources tend to last longer too) but I understand that Medium offers a nice low barrier to publishing.
That low barrier comes at a price. It means you have to put up with anyone and everyone weighing in with their own hot takes. The way the site works is that anyone who writes a comment on your article is effectively writing their own article—you don’t get to have any editorial control over what kind of stuff appears together with your words. There is very little in the way of community management once a piece is published.
Karolina’s piece attracted some particularly unsavoury snark—tech bros disagreeing in their brash bullying way. I linked to a few comments, leaving out the worst of the snark, but I couldn’t resist editorialising:
Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.
I knew even when I was writing it that it was unproductive, itself a snarky remark. Two wrongs don’t make a right. But I wanted to acknowledge that not only was bad behaviour happening, but that I was seeing it, and I wasn’t ignoring it. I guess it was mostly intended for Karolina—I wanted to extend some kind of acknowledgment that the cumulative weight of those sneering drive-by reckons is a burden that no one should have to put up with.
Tempted to @-mention orgs who’s employees abuse me in comments under my posts. Then I remember about million more interesting things to do.
— fantastic ms. (@fox) April 29, 2016
“Everyday, a dude goes out of their way to tell you you’re wrong. Women’s life on the Internet.” A novel.
— fantastic ms. (@fox) April 29, 2016
I’m literally done reading the comments for my article. It saddens me that even high-profile Web folk fails to see what I meant…
— fantastic ms. (@fox) April 26, 2016
…and only wishes to argue about their favourite and beloved JavaScript. See here and my reply below: https://t.co/2pA3RZZHKk
— fantastic ms. (@fox) April 26, 2016
№1 rule of posting controversial content: NEVER read the comments*
— fantastic ms. (@fox) April 25, 2016
*of random dudes who misunderstood the point and are trying to mock you.
I literally wrote JS is great but the point is understanding who you build for and be empathetic. Still people call me a hater.
— fantastic ms. (@fox) April 24, 2016
Funny enough it was 98% men trying to tell me I don’t understand how the web works.
— fantastic ms. (@fox) April 24, 2016
Guess what? Stop reading in between the lines.
So many people decided to snarkily disagree with me without trying to grasp I’m advocating for the users, not saying JavaScript is bad.
— fantastic ms. (@fox) April 23, 2016
Probably going to have white male dudes tweeting at me how much they disagree for eternity.
— fantastic ms. (@fox) April 23, 2016
I knew that when I wrote about Medium being “where the opinions of self-entitled dudes flow like rain from the tech heavens” that I would (rightly) get pushback, and sure enough, I did …on Medium. Not on Twitter or anywhere else, just Medium.
I syndicate my posts to Ev’s blog, so the free-for-all approach to commenting doesn’t bother me that much. The canonical URL for my words remains on my site under my control. But for people posting directly to Medium and then having to put up with other people casually shitting all over their words, it must feel quite disempowering.
I have a similar feeling with Twitter. I syndicate my notes there and if the service disappeared tomorrow, I wouldn’t shed any tears. There’s something very comforting in knowing that any snarky nasty responses to my words are only being thrown at copies. I know a lot of my friends are disheartened about the way that Twitter has changed in recent years. I wish I could articulate how much better it feels to only use Twitter (or Medium or Facebook) as a syndication tool, like RSS.
There is an equal and opposite reaction too. I think it’s easier to fling off some thoughtless remarks when you’re doing it on someone else’s site. I bet you that the discourse on Ev’s blog would be of a much higher quality if you could only respond from your own site. I find I’m more careful with my words when I publish here on adactio.com. I’m taking ownership of what I say.
And when I do lapse and write snarky words like “Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.”, at least I’m owning my own snark. Still, I will endeavour to keep my snark levels down …but that doesn’t mean I’m going to turn a blind eye to bad behaviour.
The newest Kirby Ferguson video looks at remixing through the lens of the newest Star Wars film.
Nudibranchia or other opisthobranchia compared to the various looks of David Bowie.
I met Zero! Yay! Thanks, @wilto.
Here’s a nice little pattern from Dave—showing data tables one column at a time on smaller screens.
I was inspired by @misprintedtype and @fox — two very inspiring people — to write some rambling words…
Breakfast sandwich.
Here’s another version of my talk Resilience—the same one I gave at Beyond Tellerrand—this time from the Render conference in Oxford.
Maciej’s first report from Antarctica is here. Put the kettle on and settle in for a grand read.
Al runs through the process of updating GEL—the BBC’s Global Experience Language design system. I particularly like the thought that’s gone into naming type sizes.
I gave the closing talk at the Render conference in Oxford a few weeks back. It was a very smoothly-run event, the spiritual successor to jQuery UK.
In amongst the mix of talks there were a few emerging themes. Animation was covered from a few different angles by Val and Sara. Bruce, Jake, Ola, and I talked about Service Workers and offline functionality. But there were also some differences of opinion.
In her great talk—I’m Offline, Cool! Now What?—Ola outlined the many and varied offline use cases that drove the creation and philosophy of Hoodie. She described all the reasons why people need the web; for communication, for access to information, for empowerment, and for love. “Hell, yes!” I thought.
But then she said:
So since when is helping people to fulfil a basic need, progressive enhancement?
And even more forcefully:
This is why I think, putting offline first in the progressive enhancement slot is pure bullshit.
Strong words indeed! And I have to say I was a little puzzled by them.
Ola had demonstrated again and again just how fragile the network could be. That is absolutely correct. All too often, we make the assumption that people using our sites have a decent network connection. That’s not a safe assumption to make.
But the suggested solution—to rely on technologies like local storage, Service Workers, or other APIs—assumes a certain level of JavaScript capabilities in the devices and browsers out there. That’s an unsafe assumption to make.
I remember discussing this with Alex from Hoodie a while back. I was confused by the cognitive dissonance I was observing. It seems to me that, laudable as Hoodie’s offline-first goals are, they are swapping out one unstable dependency—the network—for a different unstable dependency—a set of JavaScript APIs.
(I remember Alex pointed out that Hoodie was intended primarily for web apps rather than web sites, and my response—predictably enough—was to say “Define web app”.)
I think I understand why Ola reacted so strongly to the suggestion that offline functionality should be added as an enhancement. I’ve seen the same reaction when I’ve said that beautiful typography on the web is an enhancement. I think that when I say something is an enhancement, what people hear is that something is just an enhancement. It sounds belittling. That’s not my intention, but I can understand how it could come across like that. Perhaps this is one reason why some people have a real issue with the term “progressive enhancement”.
I wish we could make offline functionality a requirement. But the reality is that not everyone is using a browser that supports the necessary technology. I wish we could make beautiful typography a requirement. But, again, the reality is that there will always be some browsers or devices that won’t be capable of executing that typography. Accepting these facets of reality might seem like admissions of defeat, but I actually find it quite liberating.
In her brilliant talk at Render, Ashley G. Williams channeled Carl Sagan, quoting from his book The Demon-Haunted World:
It is far better to grasp the universe as it really is than to persist in delusion, however satisfying and reassuring.
That’s how I feel we should approach building for the web. Let’s accept that network connections are unevenly distributed. Let’s also accept that browser features are unevenly distributed. Pretending that millions of Opera Mini users don’t exist isn’t a viable strategy. They too are people who want to communicate, to access information, to be empowered, and to love.
Pointing out that you can’t always rely on client-side JavaScript shouldn’t be taken as an admonishment. It’s an opportunity.
Karolina Szczur wrote a wonderful piece on Ev’s blog called The Web Isn’t Uniform. She noticed how many sites—Facebook, AirBnB, Basecamp—failed to even render some useful information if the JavaScript fails to load. It’s a situation that many of us—with our fast connections, capable browsers, and modern devices—might never even notice.
It’s a privilege to be able to use breaking edge technologies and devices, but let’s not forget basic accessibility and progressive enhancement. Ultimately, we’re building for the users, not for our own tastes or preferences.
Karolina asks that we, as makers of the web, have a little more empathy. If the comments on her article are anything to go by, that’s a tall order. All the usual tropes are rolled out—there’s the misunderstanding that progressive enhancement means making sure everything works without JavaScript (it doesn’t; it’s about the core functionality), and the evergreen argument that as soon as you’re building a web “app”, best practices, good engineering, and empathy can go out the window…
I strongly disagree that this has anything to do at all about empathy. Instead, it’s all about resources and priorities. Making a JS app is already hard enough, duplicating all that work so that it also works without JS is quite often just not practical.—Sacha Greif
But requiring that a site be functional when JavaScript is disabled, may not be a valid requirement anymore. HTML and CSS were originally created and designed for documents, not applications. Many websites these days should be considered apps rather then docs.—Dan Shappir
What you’re suggesting is that all these companies should write all their software twice, once in javascript and again in good ol’ html with forms, to cater to that point-whatever-percentage that has decided to break their own web browser by turning one of the three fundamental web technologies off. In what universe is this a reasonable request?—Erlend Halvorsen
JavaScript is as important as HTML. This is modern internet. If someone doesn’t have JavaScript, they should not be using the new applications that were possible because of JavaScript.—HarshaL
I am a web developer. I build web applications not web sites. What you say may be true for web sites with static pages displaying images and text.—R. Fancsiki
Ah, Medium! Where the opinions of self-entitled dudes flow like rain from the tech heavens.
While they were so busy defending the lack of basic functionality in all the examples that Karolina listed, they failed to notice the most important development:
Only couple hours later, thanks to @sstephenson, @basecamp is way more accessible without JS! That’s how you do it 👏 pic.twitter.com/cMsYGFhiKK
— fantastic ms. (@fox) April 23, 2016
Let’s build a web that works for everyone. That doesn’t mean everyone has to have the same experience. Let’s accept that there are all sorts of people out there accessing the web with all sorts of browsers on all sorts of devices.
What a fantastic opportunity!
Marvellous insights from Mark on how the robustness principle can and should be applied to styleguides and pattern libraries (’sfunny—I was talking about Postel’s Law just this morning at An Event Apart in Boston).
Being liberal in accepting things into the system, and being liberal about how you go about that, ensures you don’t police the system. You collaborate on it.
So, what about the output? Remember: be ’conservative in what you do’. For a design system, this means your output of the system – guidelines, principles, design patterns, code, etc etc. – needs to be clear, unambiguous, and understandable.
Shout-out to @LotteJackson’s ALA article from @beep #aeabos
http://alistapart.com/article/from-pages-to-patterns-an-exercise-for-everyone
A straightforward little pattern library. There’s also the story of making it a living style guide.
Second coffee of the morning.
Our Harry’s in the New York Times! Well, an article on dark patterns is in the New York Times, and Harry is Mr. Dark Patterns.
One man threw a ball, and another man hit it with a stick, and then another man caught it, and then we all cheered.
At the ball game.
The Fenway Photobomb.
Going to Fenway.
I have a new role at Clearleft. It’s not a full-time role. It’s in addition to my existing role of …um …whatever it is I do at Clearleft.
Anyway, my new part-time role is that of being a content buddy. Sounds a little dismissive when I put it like that. Let me put it in capitals…
My new part-time role is that of being a Content Buddy.
This is Ellen’s idea. She’s been recruiting Content Guardians and Content Buddies. The Guardians will be responsible for coaxing content out of people, encouraging them to write that blog post, article, or case study. The role of the Content Buddy is to help shepherd those pieces into the world.
I have let it be known throughout the office that I am available—day or night, rain or shine—for proof-reading, editing, and general brain-storming and rubber-ducking.
On my first official day as a Content Buddy on Friday I helped Ben polish off a really good blog post (watch this space), listened to a first run-through of Charlotte’s upcoming talk at the Up Front conference in Manchester (which is shaping up to be most excellent), and got together with Paul for a mutual brainstorming session for future conference talks. The fact that Paul is no longer a full-time employee at Clearleft is a mere technicality—Content Buddies for life!
Paul is preparing a talk on design systems for Smashing Conference in Freiburg in September. I’m preparing a talk on the A element for the HTML Special part of CSS Day in Amsterdam in just one month’s time (gulp!). We had both already done a bit of mind-mapping to get a jumble of ideas down on paper. We learned that from Ellen’s excellent workshop.
Then we started throwing ideas back and forth, offering suggestions, and spotting patterns. Once we had lots of discrete chunks of stuff outlined (but no idea how to piece them together), we did some short intense spurts of writing using the fiendish TheMostDangerousWritingApp.com. I looked at Paul’s mind map, chose a topic from it for him, and he had to write on that non-stop for three to five minutes. Meanwhile he picked a topic from my mind map and I had to do the same. It was exhausting but also exhilarating. Very quickly we had chunks of content that we could experiment with, putting them together in different ways to find different narrative threads. I might experiment with publishing them as short standalone blog posts.
The point was not to have polished, finished content but rather to get to the “shitty first draft” stage quickly. We were following Hemingway’s advice:
Write drunk, edit sober.
…but not literally. Mind you, I could certainly imagine combining beer o’clock on Fridays with Content Buddiness. That wasn’t an option on this particular Friday though, as I had to run off to band practice with Salter Cane. A very different, and altogether darker form of content creation.
Shipping up to Boston. brb
For your information, the Let’s Encrypt client is now called Certbot for some reason.
Carry on.
Looking good, giant dog mural, looking good.
Ah, how I wish that this were published at a long-lived URL:
The one part of the web that I believe is truly genius, and that keeps standing the test of time, is the URI. The Web gave us a way to point to anything, forever. Everything else about the web has changed and grown to encyclopedic lengths, but URIs have been killing it for decades.
And yet the numbers show we’re hell-bent on screwing all that up with link-shorteners, moving URIs without redirection, and so forth. As always happens in technology we’ve taken a simple idea and found expedient ways to add fragility and complexity to it.
Shane gave a talk recently where he outlined his reasons for publishing on the indie web:
Most people reading this will probably have an account at most or all of these sites: Facebook, Instagram, Twitter, YouTube, Vimeo, Tumblr, Wordpress. Many also had accounts at Friendster, Tribe, MySpace, Delicious, Magnolia, Gowalla, Geocities. But no one has an account at any of those (on the second list) anymore. And all of the content that we created on those sites is gone.
All of those super emo feelings you posted to MySpace, they’re all gone. Some of the great web designers of our generation got started on Geocities. That stuff is gone forever. And sure, it was sparkling animated GIFs and neon colors. But that’s important history. Yahoo bought it, left it alone for a while, and then decided one day to turn it off.
Prompted by the way Craig is handling the shutdown of hi.co, Glenn Fleishman takes a look at other digital preservation efforts and talks to Laura Welcher at the Long Now Foundation.
A time capsule is bottled optimism. It makes material the belief that human beings will survive long enough to retrieve and decode artifacts of the distant past.
Now this is how you shut down a service:
Web projects often lack hard edges. They begin with clarity but end without. We want to close Hi.co with clarity. To properly bookend the website.
And nary a trace of “We are excited to announce…” or “Thank you for joining us on our incredible journey…”
(Such a shame that the actual shut-down notice is only on Ev’s blog, but hopefully Craig will write something on his own site too.)
Remember: life is ten per cent what happens to you, ten per cent how you respond to it, and eighty per cent how good your reflexes are when the Tall Ones come at your throat with their pincers.
The history of Facebook’s attempt to steamroll over net neutrality in India …and how they failed in that attempt, thanks to a grassroots campaign.
Crucially, Facebook itself would decide which sites were included on the platform. The company had positioned Internet.org as a philanthropic endeavour — backed by Zuckerberg’s lofty pronouncements that “connectivity is a human right” — but retained total control of the platform.
There’s a lot I disagree with here. I don’t think this pattern library process is very elegant or scalable, and it certainly wouldn’t work for me.
But I’m still linking to it. Why? Because I think it’s absolutely wonderful that people share their processes like this. It doesn’t matter one whit whether or not it would work for me.
Frontend development may have gotten a lot more complicated, but the simple premise of sharing what you’ve learned hasn’t.
I couldn’t agree more!
Marco is spot on here. The New York Times article he’s responding to is filled with a weird Stockholm syndrome: the one bit of the web that’s still free of invasive tracking and surveillance is the very place they wish a centralised power (like Apple) would come in and lock down. Madness!
Data data data. Publishers crave data — but one of the things I love about podcasts is that the format blocks the collection of most data, because there is no code that gets executed. JavaScript has brought the web to the brink of ruin, but there’s no JavaScript in podcasting. Just an RSS feed and MP3 files.
An engaging look at the history of word processing, word processed by Josephine Livingstone.
Za’atar zucchini.
Cumin fish and rice.
Oh, how I wish I could make it to this event!
June 8th-9th at Internet Archive, featuring Vint Cerf, Brewster Kahle, and more.
We are bringing together a diverse group of Web architects, activists, engineers, archivists, scholars, journalists, and other stakeholders to explore the technology required to build a Decentralized Web and its impact.
Talk prep, phase 1: doodling.
If you don’t comment your CSS, you’ll confuse other people looking at your code, and, more embarrassingly, you’ll confuse future you. If you do comment CSS, everybody will be less confused, and things will be accidentally broken less often. You will be popular and generally well-liked, and people will remember to send you cards on your birthday. Comment more.
Some good advice here on how to write better comments in CSS.
Indie Web Camp Düsseldorf took place last weekend and it was—no surprise—really excellent.
It felt really good to have one in Germany again so soon after the last one in Nuremberg. Lots of familiar faces showed up as well as plenty of newcomers.
I’m blown away by how much gets done in two short days, especially from people who start the weekend without a personal website and end it with something to call their own. Like Julie’s new site for example (and once again she took loads of great photos).
My own bit of hacking was quite different to what I got up to in Nuremberg. At that event, I was concentrating on the interface, adding sparklines and a bio to my home page. This time round I concentrated more on the plumbing. I finally updated some of the code that handles webmentions. I first got it working a few years back at an Indie Web Camp here in Brighton, but I hadn’t really updated the code in a while. I’m much happier with the way it’s working now.
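(If you’re wondering what that plumbing involves, here’s a minimal sketch of a webmention endpoint in PHP. It isn’t the code running on this site, just an illustration of the receiving half of the protocol: accept a source URL and a target URL, then verify that the source really does link to the target.)

```php
<?php
// Minimal webmention receiver: an illustrative sketch, not production code.
// The protocol boils down to: take a source and a target, check both are URLs,
// then fetch the source and confirm it actually links to the target.

$source = isset($_POST['source']) ? $_POST['source'] : '';
$target = isset($_POST['target']) ? $_POST['target'] : '';

if (!filter_var($source, FILTER_VALIDATE_URL) || !filter_var($target, FILTER_VALIDATE_URL)) {
    http_response_code(400);
    exit('Webmentions need valid source and target URLs.');
}

// Fetch the source document and make sure it really mentions the target.
$html = @file_get_contents($source);
if ($html === false || strpos($html, $target) === false) {
    http_response_code(400);
    exit('Could not verify that the source links to the target.');
}

// Verified: store the mention somewhere (database, flat file, whatever) for display later.
http_response_code(202);
echo 'Webmention accepted.';
```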
I also updated the way I’m syndicating my notes to Twitter, specifically how I send photos. Previously I was using the API method statuses/update_with_media.
When I was at the Mobile @Scale event at Facebook’s London office a while back, Henna Kermani gave a talk about the new way that Twitter handles file uploads. There’s a whole new part of the API for handling that. When she got off stage, I mentioned to her that I was still using the old API method and asked how long it would be until it was switched off. She looked at me incredulously and said “It’s still working‽ I thought it had been turned off already!”
That’s why I spent most of my time at Indie Web Camp Düsseldorf updating my PHP. Switching over to the TwitterOAuth library made it a bit less painful—thanks to Bea for helping me out there.
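For anyone else making the same switch, the new flow looks roughly like this (a sketch using the TwitterOAuth library, with placeholder credentials and file paths): upload the photo first to get a media ID, then attach that ID to the status update.

```php
<?php
// A rough sketch of syndicating a note with a photo via the newer Twitter API,
// using the abraham/twitteroauth library. Credentials and paths are placeholders.

require 'vendor/autoload.php';

use Abraham\TwitterOAuth\TwitterOAuth;

$connection = new TwitterOAuth(
    'CONSUMER_KEY',
    'CONSUMER_SECRET',
    'ACCESS_TOKEN',
    'ACCESS_TOKEN_SECRET'
);

// Step one: upload the photo on its own and get back a media ID.
$media = $connection->upload('media/upload', ['media' => '/path/to/photo.jpg']);

// Step two: post the status, referencing the uploaded photo by its ID.
$connection->post('statuses/update', [
    'status'    => 'Syndicated from my own site',
    'media_ids' => $media->media_id_string,
]);
```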
When it came time to demo, I didn’t have much to show. On the surface, my site looked no different. But I feel pretty good about finally getting around to changing the wiring under the hood.
Besides, there were plenty of other great demos. There was even some more sparklining. Check out this fantastic visualisation of the Indie Web Camp IRC logs made by Kevin …who wasn’t even in Düsseldorf; he participated remotely.
If you get the chance to attend an Indie Web Camp I highly, highly recommend it. In the meantime you can start working on your personal site. Here’s a quick primer I wrote a while back on indie web building blocks. Have fun!
Lovely, lovely pictures from last weekend’s brilliant Indie Web Camp in Düsseldorf.
Here’s the video of the talk I just gave at the Beyond Tellerrand conference in Düsseldorf: Resilience.
Really glad I had the opportunity to meet @djrrb, @_lilchen, and @cattsmall during this year’s @btconf—jolly nice people.
Home-cooked food.
Waiting for an aeroplane to take me home.
Really enjoyed opening up day two of @BTconf in Düsseldorf to a very welcoming and appreciative audience.
Rachel and Drew have been beta-testing Mark’s Fractal project for organising a library of components for Perch’s interface. Sounds like it’s working out very, very well indeed!
Had a really great chat about open device labs with @RaquelPGodinho during @BTconf.
Full of ramen (again) after another lovely lunch in Takumi.
Tonkatsu ramen.
Kara-age.
Absolutely brilliant stuff from Mandy (again). A long hard look at today’s tech industry’s narrow approach to bots and artificial intelligence, compared to some far more interesting and imaginative approaches in fiction:
So in addition to frightening ramifications for privacy and information discovery, they also reinforce gendered stereotypes about women as servants. The neutral politeness that infects them all furthers that convention: women should be utilitarian, performing their duties on command without fuss or flourish. This is a vile, harmful, and dreadfully boring fantasy; not the least because there is so much extraordinary art around AI that both deconstructs and subverts these stereotypes. It takes a massive failure of imagination to commit yourself to building an artificial intelligence and then name it “Amy.”
I really enjoyed chatting to Ade on The Design Jones podcast. I rambled on about design, the web, and all that stuff.
It’s on Soundcloud and here’s the podcast feed.
This is so cool! The logs of the Indie Web Camp IRC channel visualised as a series of sparklines in the style of Joy Division/Jocelyn Bell Burnell.
Some smart thoughts on web fonts.
Demo time!
Hacking.
Indie Web Camping.
Spargelzeit!
Asparagus and pasta.
Post-IndieWebCamp beers in Düsseldorf.
Kicking off Indie Web Camp Düsseldorf with intros and site demos.
Live from Düsseldorf… http://indiewebcamp.com/live
Pre-IndieWebCamp beers in Düsseldorf.
Going to Düsseldorf. brb
I still haven’t seen any new Game Of Thrones episodes, but I did see King Lear so I’m all caught up on familial intrigue, murder and gore.
If you want to go to the Indie Web Summit on June 3rd to 5th (and you should), there’s a travel assistance fund:
If you are a member of a group that is typically underrepresented (e.g. if you are not straight, white, cis and male), and otherwise could not afford to travel to IndieWeb Summit on your own, an anonymous donor has established a $1000 fund to assist individuals from underrepresented backgrounds with travel and/or lodging costs for the Indieweb Summit in Portland.
Hidden little details that make a big difference for screen readers.
A website is only as beautiful as the underlying markup.
Ice cream!
Pork chop.
An unusual souvenir from last week’s Craft conference in Budapest.
“A single hypertext link could lead to an enormous, unbounded world.”
—@TimBerners_Lee
The Perch Control Panel is progressively enhanced. Almost all functionality of Perch is available even if you completely disable JavaScript, or if JavaScript fails to load.
Intermission.
Fish pie and a pint.
Excited about seeing King Lear at Brighton’s Theatre Royal tonight …featuring Moff Jerjerrod, commander of the second Death Star, as Lear.
Tomorrow evening it’s Homebrew Website Club in Brighton:
https://indiewebcamp.com/events/2016-05-04-homebrew-website-club
This is my kind of talk—John Snow’s cholera map, the Yucca Mountain think-tank, the Pioneer plaque, the Voyager record, the Drake equation, the Arecibo signal, and the love song of J. Alfred Prufrock.
♫ These are a few of my fav-our-ite things! ♫
A glanceable one-stop-shop for how today’s browsers are dealing with today’s accessibility features. Then you can dive deeper into each one.
In this English language alternative to latitude and longitude coordinates, the Clearleft office is located at:
cross.rooms.quick
Walking to work.
Mike’s blog is back on the Indie Web.
As someone who designs things for a living, there is a certain amount of professional pride in creating one’s own presence on the internet. It’s kind of like if an architect didn’t design their own house.
I’ve been on the web for most of my life, but, without a site to call home, I haven’t been of the web for far too long.
Buckminsterfullerene is such a beautiful piece of graphic design.
Every game of football is a little tribute to C60.
Just heard the sad news about Harry Kroto. Such a loss to science …and to our species.
Enjoying the all-time-low Europe-wide roaming charges that kicked in yesterday …until the UK decides to leave the EU, that is.