Making web apps? Care about SEO? Here’s Google’s advice:
Use feature detection & progressive enhancement techniques to make your content available to all users.
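As a minimal illustration of that advice (my own sketch, not Google's code): feature detection means probing for an API before relying on it, and keeping a working baseline experience when it's absent.

```javascript
// Illustrative feature-detection sketch: decide whether to layer on
// script-driven enhancements or stick with the baseline experience.
function chooseExperience(win) {
  // `win` stands in for the browser's `window` object
  if (win.history && typeof win.history.pushState === 'function') {
    return 'enhanced'; // safe to layer on script-driven navigation
  }
  return 'baseline'; // fall back to ordinary links and full page loads
}
```

Either way, the content is there; the enhancement is just a bonus for capable browsers.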
A single page showing all the weights available from Google fonts at a glance.
Paul gives the lowdown on the Google+ responsive relaunch. They set themselves this performance budget:
And this bit is crucial:
Ben and Erin are shipping experimental support for AMP in the latest version of Known, but Ben has some concerns about the balance of power tilting towards one major player, in this case Google:
But it’s Google’s whitelist of approved ad providers that’s most concerning:
We’ve shipped support for AMP because we see potential here, and recognize that something should be done to improve the experience of loading independently-published content on the web. But attempting to bake certain businesses into a web standard is a malformed idea that is doomed to fail. If this is not corrected in future versions of the specification, we will withdraw support.
But he shares my hope that AMP could serve as a demo of what the web could be if we developers had the will and political clout to see it through:
I wonder if what AMP really does is remind us how we’ve failed to build a performant web… we know how to, but all too often we just choose not to (or lose the argument) and fill our sites with cruft that kills performance, and with it our visitors’ experience.
It’s official: hash bang URLs are an anti-pattern, and if you want your content indexed by Google, use progressive enhancement:
Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement.
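That deprecated 2009 scheme worked by having Googlebot rewrite hash-bang URLs into `_escaped_fragment_` query strings before fetching them. Here's my own rough reconstruction of that documented mapping (not Google's code):

```javascript
// Sketch of the now-abandoned AJAX-crawling rewrite: the fragment after
// "#!" becomes an "_escaped_fragment_" query-string parameter.
function escapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hash-bang: nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?'; // append to any existing query string
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

The server was then expected to return a static snapshot for the rewritten URL — exactly the kind of tight coupling that progressive enhancement avoids.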
Tom’s thoughts on what AMP means for developers and publishers. He was initially sceptical but now he’s cautiously optimistic. Like me, he’s hoping that AMP can force the hand of those third-party advertisers to get their act together.
Publishers’ development teams are very capable of creating fast experiences for mobile users, but they don’t have the clout to coordinate all the additional cruft that is added to the page. However, if all the different publishers’ dev teams got together and put their weight behind a single implementation, then we can force third parties to change their habits.
An alternate version of AMP HTML that works in more parsers and user agents.
The AMP project has “a new approach to web performance”: making your website dependent on Google. The Be Nice AMP project follows the old approach: make your site fast by following best-practice guidelines, and stay independent of Google.
This doorslam turned out to be bad for business.
Google Fonts aren’t renowned for their quality but this is a beautiful demonstration of what you can accomplish with them.
Did you know Google runs a free and open image resizing service?
I did not! This could be quite useful. Seeing as it’s an https endpoint, it could be especially useful on https sites that pull images from http domains (and avoid those mixed-content warnings).
The key change in all of this, I think, is that Google has gone from a world of almost perfect clarity - a text search box, a web-link index, a middle-class family’s home - to one of perfect complexity - every possible kind of user, device, access and data type. It’s gone from a firehose to a rain storm. But on the other hand, no-one knows water like Google. No-one else has the same lead in building understanding of how to deal with this. Hence, I think, one should think of every app, service, drive and platform from Google not so much as channels that might conflict but as varying end-points to a unified underlying strategy, which one might characterize as ‘know a lot about how to know a lot’.
It looks like Google is going to start explicitly labelling slow sites as such in their search results (much like they recently started explicitly labelling mobile-friendly sites). So far it’s limited to Google’s own properties but it could be expanded.
Personally, I think this is a fair move. If the speed of a site were used to rank sites differently, I think that might be going too far. But giving the user advanced knowledge and leaving the final decision up to them …that feels good.
There are some good points here comparing HTTP2 and SPDY, but I’m mostly linking to this because of the three wonderful opening paragraphs:
A very long time ago—in 1989—Ronald Reagan was president, albeit only for the final 19½ days of his term. And before 1989 was over Taylor Swift had been born, and Andrei Sakharov and Samuel Beckett had died.
In the long run, the most memorable event of 1989 will probably be that Tim Berners-Lee hacked up the HTTP protocol and named the result the “World Wide Web.” (One remarkable property of this name is that the abbreviation “WWW” has twice as many syllables and takes longer to pronounce.)
Tim’s HTTP protocol ran on 10Mbit/s Ethernet and coax cables, and his computer was a NeXT Cube with a 25-MHz clock frequency. Twenty-six years later, my laptop CPU is a hundred times faster and has a thousand times as much RAM as Tim’s machine had, but the HTTP protocol is still the same.
I completely share Bruce’s concern about the year-zero thinking that’s accompanying a lot of the web components marketing:
Snarking aside, why do so few people talk about extending existing HTML elements with web components? Why’s all the talk about brand new custom elements? I don’t know.
I’m a fan of web components. But I’m increasingly worried about the messaging surrounding them.
Many of the free fonts available from Google are pretty bad, but this site showcases how some of them can be used to great effect.
I don’t tend to be a “magic pill” kind of believer, but I can honestly say that embracing progressive enhancement can radically change your business for the better. And I’m glad to see Google agrees with me.
Google has updated its advice to people making websites, who might want to have those sites indexed by Google. There are two simple bits of advice: optimise for performance, and use progressive enhancement.
Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported.
Stuart has written some wise words about making privacy the differentiator that can take on Facebook and Google.
He also talks about Aral’s ind.ie project; all the things they’re doing right, and all the things they could do better:
The ind.ie project is to open source as Brewdog are to CAMRA.
This is what Scott Jenson has been working on—a first stab at just-in-time interactions by having physical devices broadcasting URLs.
Walk up and use anything
Kubrickian pictures taken by the Google robot wherein it captures its own reflection.
An early look at the just-in-time interactions that Scott has been working on:
Nearby works like this. An enabled object broadcasts a short description of itself and a URL to devices nearby listening. Those URLs are grabbed and listed by the app, and tapping on one brings you to the object’s webpage, where you can interact with it—say, tell it to perform a task.
A nice stroll around Marseilles at night without any of the traditional danger.
I despair sometimes.
Here’s a ridiculously convoluted, Heath Robinson-esque way of getting the mighty all-powerful Googlebot to read the web thangs you’ve built using the new shiny client-side frameworks like Angular, Ember, Backbone…
Here’s another idea: output your HTML in HTML.
You might want to untick the checkbox at the bottom of this screen:
Based upon my activity, Google may show my name and profile photo in shared endorsements that appear in ads.
I have a lot of admiration for Reverend Dan Catt.
I don’t want to be in a position where I say “Hey, I’m working at Google, no no, don’t worry, the good bit of Google”, because goodness knows I did enough of that at Yahoo.
We have lost an ally in the fight to maintain net neutrality. I wonder how Vint Cerf feels about his employer’s backtracking.
The specific issue here is with using a home computer as a server. It’s common for ISPs to ban this activity, but that doesn’t change the fact that it flies in the face of the fundamental nature of the internet as a dumb network.
I think the natural end point to owning your own data is serving your own data—something that Steven Pemberton talked about in his fateful talk.
We must fight these attempts to turn the internet into a controlled system of producers and consumers.
Stuart nails it: the real problem with delegating identity is not what some new app will do with your identity details, it’s what the identity provider—Twitter, Google, Facebook—will do with the knowledge that you’re now using some new app.
This is why I want to use my own website as my identity provider.
Looks like Google are offering responsive (or at least adaptive) ad sizes.
A superb piece by Marco Arment prompted by the closing of Google Reader. He nails the power of RSS:
RSS represents the antithesis of this new world: it’s completely open, decentralized, and owned by nobody, just like the web itself. It allows anyone, large or small, to build something new and disrupt anyone else they’d like because nobody has to fly six salespeople out first to work out a partnership with anyone else’s salespeople.
And he’s absolutely on the money when he describes what changed:
RSS, semantic markup, microformats, and open APIs all enable interoperability, but the big players don’t want that — they want to lock you in, shut out competitors, and make a service so proprietary that even if you could get your data out, it would be either useless (no alternatives to import into) or cripplingly lonely (empty social networks).
I share his anger.
Well, fuck them, and fuck that.
Google’s track record is not looking good. There seems to be a modus operandi of bait-and-switch: start with open technologies (XMPP, CalDAV, RSS) and then once they’ve amassed a big enough user base, ditch the standards.
Google’s plan to bring internet connectivity to remote areas by using balloons wafting in the stratosphere.
Considering that Google seems to put as much time and effort into its April Fool’s jokes as it does into its real projects, you’d be forgiven for assuming this was a spoof.
Good news from Google: it’s going to start actively penalising sites for perpetrating the worst practices for mobile, e.g. redirecting a specific “desktop” URL to the homepage of the mobile site, or for shoving a doorslam “download our app” message at users.
I wish that we could convince people not to do that crap on the basis of it being, well, crap. But when all else fails, saying “Google says so” carries a lot of weight (see also: semantics, accessibility, yadda, yadda, yadda).
The litany of open standards that Google has been abandoning: RSS, XMPP, WebDAV…
The accidental beauty in Google’s autosuggest algorithm.
A good history lesson in rendering engines: KHTML, WebKit, and now, Blink.
Charles Arthur analyses the data from Google’s woeful history of shutting down its services.
So if you want to know when Google Keep, opened for business on 21 March 2013, will probably shut - again, assuming Google decides it’s just not working - then, the mean suggests the answer is: 18 March 2017. That’s about long enough for you to cram lots of information that you might rely on into it; and also long enough for Google to discover that, well, people aren’t using it to the extent that it hoped.
Prepare to lose yourself for hours as you keep hitting “take me somewhere else” through these most bizarre and wonderful Google street view locations.
Related to my rant on links that aren’t actually links: buttons that aren’t actually buttons.
Communal satellite eyes. A Mac screensaver is also available.
I’ve been thinking about getting a birdhouse.
A fascinating piece by James on trap streets, those fictitious places on maps that have no corresponding territory.
Beautiful thoughtful work from the BERGians.
In the hippest areas for Street Art, life-sized pictures of people found on Google’s Street View are printed and posted without authorization at the same spot where they were taken.
Google’s datadump makes for a fascinating—and worrying—bit of data dumpster diving.
Robin Sloan compares Facebook and Google in an interesting way:
Really, Facebook is the world’s largest photo sharing site—that also happens to be a social network and a login system.
Google is getting good, really good, at building things that see the world around them and actually understand what they’re seeing.
Advice on creating responsive designs from Google. It’s not exactly the best tutorial out there (confusing breakpoints with device widths) but it’s great to see the big guns getting involved.
Glenn gives a rational thoughtful explanation of why he’s as pissed off as I am about Google’s destruction of the Social Graph API.
An in-depth look at where Google is going wrong.
Jason’s rip-roaring presentation from Defcon last year.
Google are shutting down the Social Graph API. Twunts.
As if you needed another reason why QR codes are shit …are you certain you’ve proofed it?
2951 images at 12 frames per second. Each image is the “related image” of the image before according to Google image search. The first image is simply a transparent PNG.
Stef does some data-sleuthing and uncovers some shocking behaviour on the part of Google in Kenya.
What would Google+, YouTube and Facebook have looked like in 1997?
Everyone has their bullshit. You can simply decide whose you’re willing to tolerate.
This move by Google to start executing some POST requests makes me very uneasy: the web is agreement and part of that agreement is that POST requests are initiated by the user.
Aral takes the words right out of my mouth. This is pretty much exactly how I feel about Dart.
An excellent article that examines the supposed benefits of publishing through someone else’s app store instead of the web.
John pushes back against the idea that browser innovation is moving too slow.
Performance shit just got real.
You can now sign up with Google to have your site pass every request through them and get your documents served up optimised.
Great news! Google Analytics now tracks page load times.
The threat to Google Videos shows businesses are not suitable cultural custodians — they can’t be held accountable to the public.
A supremely useful tool from Google for measuring performance.
A nice overview of the increasing importance of UX on the web, written by Bobbie with soundbites from Andy.
Yeah, it’s an April Fool’s video (lamest day on the internet) but this is amusing.
How cool is this‽ You can create your own custom “huffduff it” link for items in Google Reader.
The Google voicemail transcript, which begins at 11 minutes in, cracked me up.
Some of the more unusual moments in time that have been captured by Google Street View. There’s something very Gibsonian about this.
Tim Bray calmly explains why hash-bang URLs are a very bad idea.
This is what we call “tight coupling” and I thought that anyone with a Computer Science degree ought to have been taught to avoid it.
So why use a hash-bang if it’s an artificial URL, and a URL that needs to be reformatted before it points to a proper URL that actually returns content?
Out of all the reasons, the strongest one is “Because it’s cool”. I said strongest not strong.
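The progressive-enhancement alternative is real URLs with `history.pushState` layered on top where it’s supported. A hedged sketch of the pattern (the `loadPage` callback here is a hypothetical stand-in for whatever fetches and renders the content):

```javascript
// Enhance a link in place: real server-rendered URLs stay the baseline,
// and pushState navigation is layered on only when the browser supports it.
function enhanceLink(link, loadPage, win) {
  // Without pushState support, do nothing: the plain link already works.
  if (!(win.history && typeof win.history.pushState === 'function')) return;
  link.addEventListener('click', function (event) {
    event.preventDefault(); // cancel the full page load
    win.history.pushState(null, '', link.href); // real URL, no hash-bang
    loadPage(link.href); // fetch and render the new content in place
  });
}
```

Remove the script and every link still points at a proper URL that returns content — no artificial reformatting required.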
An interesting, if necessarily somewhat complicated-looking, API from Google: analyse your users' past behaviour to predict future outcomes.
If you aren't already marking up addresses in hCard, you really, really, really should start.
Google reaffirms its commitment to net neutrality ...except when it comes to wireless broadband, of course, because that's *totally* different, right? This disgusts me.
Well, this is an odd one: the entire duration of the Trans-Siberian railway on video, with a simultaneous map.
A new HTML5 resource from Paul Irish and other Googlers.
Steve Faulkner has created a petition to let Google know what screenreader users think of Chrome's appalling lack of basic accessibility hooks.
Google-hosted free-as-in-beer webfonts.
Mozilla, Opera and Google are collaborating on an open format for audio and video for the web (a container wrapping Vorbis audio and VP8 video).
An excellent way to do geolocation even in browsers that don't support it natively.
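The usual shape of such a fallback: try `navigator.geolocation` first, and only hand off to a server-side IP lookup when it's missing. A sketch under my own assumptions — `ipLookup` is a hypothetical stand-in for whatever IP-geolocation service you'd call:

```javascript
// Geolocation with a graceful fallback: native API when present,
// otherwise defer to an IP-based lookup (ipLookup is hypothetical).
function getPosition(nav, ipLookup, callback) {
  if (nav && nav.geolocation) {
    nav.geolocation.getCurrentPosition(function (pos) {
      callback(pos.coords.latitude, pos.coords.longitude);
    });
  } else {
    ipLookup(callback); // e.g. an XHR to your own /geoip endpoint
  }
}
```

Same callback either way, so the rest of the code doesn't care which path supplied the coordinates.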
A lesson from Google Buzz: a large sampling isn't always a representative sampling.
Before we point the finger and laugh at the Facebook users leaving confused comments on Read Write Web, we should look to our own experiences with Google Buzz.
Erin explains exactly how badly Google have messed up privacy concerns with Buzz.
A frightening tale of just how badly Google messed up with the lack of privacy controls on Buzz.
Best. Bug report. Ever.
Using Google Chrome Frame in IE will give users of assistive technology the same shitty to non-existent experience they would get in the actual Google Chrome browser.
A tool from Google to help you see how your microformatted content is showing up.
Foreheadslappingly stupid behaviour from the Associated Press.
A Quicksilver rival from Google.
Standalone embeddable widgets from Google that you can drop into any web page. The maps widget finally frees the maps API from the tyranny of coupling a domain with an API key.
A superb call to arms on the importance of "fat pipe, always on, get out of my way."
Douglas is featured in The New York Times (and look: there's Dustin behind him).
A nice overview of Glenn's XFN Firefox plug-in.
A person-specific portal generated using Google's Social Graph API. And it's less than 5K!
Douglas explains why he's leaving Google. "I won’t miss a design philosophy that lives or dies strictly by the sword of data."