Cargo cultism is not a strategy:
Apple and Google get it wrong just as often as the rest of us.
Like Brad, I switched to Firefox for web browsing and Duck Duck Go for searching quite a while back. I highly recommend it.
I can’t decide if this is industrial sabotage or political protest. Either way, I like it.
99 second-hand smartphones are transported in a handcart to generate a virtual traffic jam in Google Maps. Through this activity, it is possible to turn a green street red, which has an impact in the physical world by navigating cars onto another route to avoid being stuck in traffic.
Dan responds to an extremely worrying sentiment from Alex:
The sentiment about “engine diversity” points to a growing mindset among (primarily) Google employees who are involved with the Chromium project that puts an emphasis on getting new features into Chromium as a much higher priority than working with other implementations.
Needless to say, I agree with this:
Proponents of a “move fast and break things” approach to the web tend to defend their approach as defending the web from the dominance of native applications. I absolutely think that situation would be worse right now if it weren’t for the pressure for wide review that multiple implementations have put on the web.
The web’s key differentiator is that it is a part of the commons and that it is multi-stakeholder in nature.
Can you believe we used to willingly tell Google about every single visitor to basecamp.com by way of Google Analytics? Letting them collect every last byte of information possible through the spying eye of their tracking pixel. Ugh.
In this new world, it feels like an obligation to make sure we’re not aiding and abetting those who seek to exploit our data. Those who hoard every little clue in order to piece together a puzzle that’ll ultimately reveal all our weakest points and moments, then sell that picture to the highest bidder.
We have to stop confusing the excesses of capitalism with the hallmarks of quality. Sometimes Google aren’t better, they’re just more pervasive.
cough AMP cough
Every day, millions of people rely on independent websites that are mostly created by regular people, weren’t designed as mobile apps, connect deeply to culture, and aren’t run by the giant tech companies. These are a vision of not just what the web once was, but what it can be again.
This really hits home for me. Anil could be describing The Session here:
They often start as a labor of love from one person, or one small, tightly-knit community. The knowledge or information set that they record is considered obscure or even worthless to outsiders, until it becomes so comprehensive that its collective worth is undeniable.
This is a very important message:
Taken together, these sites are as valuable as any of the giant platforms run by the tech titans.
The dominant narrative for the growth of the World Wide Web, the graphical, user-friendly version of the internet created by Tim Berners-Lee in 1989, is that its success has been propelled by Silicon Valley venture capitalism at its most rapacious. The idea that currently prevails is that the internet is best built by venture-backed startups competing to offer services globally through category monopolies: Amazon for shopping, Google for search, Facebook for social media. These companies have generated enormous profits for their creators and early investors, but their “surveillance capitalism” business model has brought unanticipated harms.
It doesn’t have to be this way, says Ethan Zuckerman:
A public service Web invites us to imagine services that don’t exist now, because they are not commercially viable, but perhaps should exist for our benefit, for the benefit of citizens in a democracy. We’ve seen a wave of innovation around tools that entertain us and capture our attention for resale to advertisers, but much less innovation around tools that educate us and challenge us to broaden our sphere of exposure, or that amplify marginalized voices. Digital public service media would fill a black hole of misinformation with educational material and legitimate news.
On the internet, everything can appear equally legitimate. Breitbart resembles the BBC. The fictitious Protocols of the Elders of Zion look as valid as an ADL report. And the rantings of a lunatic seem as credible as the findings of a Nobel Prize winner. We have lost, it seems, a shared sense of the basic facts upon which democracy depends.
Amnesty International have released a PDF report on the out-of-control surveillance perpetrated by Google and Facebook:
Google and Facebook’s platforms come at a systemic cost. The companies’ surveillance-based business model forces people to make a Faustian bargain, whereby they are only able to enjoy their human rights online by submitting to a system predicated on human rights abuse. Firstly, an assault on the right to privacy on an unprecedented scale, and then a series of knock-on effects that pose a serious risk to a range of other rights, from freedom of expression and opinion, to freedom of thought and the right to non-discrimination.
This page on the Amnesty International website has six tracking scripts. Also, consent to accept tracking cookies is assumed (check dev tools). It looks like you can reject marketing cookies, but I tried that without any success.
The stone PDF has been thrown from a very badly-performing glass house.
A good overview of the unfair playing field of web browsers, dominated by the monopolistic practices of Google and Apple.
Mozilla is no longer fighting for market share of its browser: it is fighting for the future of the web.
It’s nice to see that the Chrome browser will add interface enhancements to show whether you can expect a site to load fast or slowly.
Just a shame that the Google search team aren’t doing this kind of badging …unless you’ve given up on your website and decided to use Google AMP instead.
Maybe the Chrome team can figure out what the AMP team are doing to get such preferential treatment from the search team.
Google’s pissing over HTML again, but for once, it’s not by making up new elements. This time it’s by co-opting data-* attributes:
A new way to help limit which part of a page is eligible to be shown as a snippet is the “data-nosnippet” HTML attribute on span, div and section elements.
This is a direct contradiction of how data-* attributes are intended to be used:
…these attributes are intended for use by the site’s own scripts, and are not a generic extension mechanism for publicly-usable metadata.
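To illustrate the clash, here’s a hypothetical pair of examples (the markup is mine, not from either document). The spec intends data-* attributes as private hooks for a site’s own scripts, whereas Google’s attribute is metadata consumed by a third-party crawler:

<!-- Intended use: a private hook for the site’s own JavaScript -->
<li data-playlist-position="3">Track three</li>

<!-- Google’s use: publicly-consumed metadata aimed at a third-party crawler -->
<span data-nosnippet>This sentence won’t appear in search snippets.</span>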
What over a decade of number-crunching analytics has taught me is that spending an hour writing, sharing, or helping someone is infinitely more valuable than spending that hour swimming through numbers. Moreover, trying to juice the numbers almost invariably divorces you from thinking about customers and understanding people. On the surface, it seems like a convenient proxy, but it’s not. They’re just numbers. If you’re searching for business insights, talking to real people beats raw data any day. It’s not as convenient, but when is anything worth doing convenient?
This is good news. I have third-party cookies disabled in my browser, and I’m very happy that it will become the default.
It’s hard to believe that we ever allowed third-party cookies and scripts in the first place. Between them, they’re responsible for the worst ills of the World Wide Web.
Ignore the clickbaity headline and have a read of Whitney Kimball’s obituaries of Friendster, MySpace, Bebo, OpenSocial, ConnectU, Tribe.net, Path, Yik Yak, Ello, Orkut, Google+, and Vine.
I’m sure your content on Facebook, Twitter, and Instagram is perfectly safe.
The ellipsis is the new hamburger.
It’s disappointing that Apple, supposedly a leader in interface design, has resorted to such uninspiring and, dare I say, lazy design in its icons. I don’t claim to be a usability expert, but it seems to me that icons should represent a clear intention, followed by a consistent action.
I have a proposal that I think might alleviate some of the animosity around Google AMP. You can jump straight to the proposal or get some of the back story first…
But I cannot get behind AMP.
Instead of competing on its own merits, AMP is unfairly propped up by the search engine of its parent company, Google. That makes it very hard to evaluate whether AMP is being used on its own merits. Instead, the evidence suggests that most publishers of AMP pages are doing so because they feel they have to, rather than because they want to. That’s a real shame, because as a library of web components, AMP seems pretty good. But there’s just no way to evaluate AMP-the-format without taking into account AMP-the-ecosystem.
Google AMP ostensibly exists to make the web faster. Initially the focus was specifically on mobile performance, but that distinction has since fallen by the wayside. The idea is that by using AMP’s web components, your pages will be speedy. Though, as Andy Davies points out, this isn’t always the case:
This is where I get confused… https://independent.co.uk only have an AMP site yet its performance is awful from a user perspective - isn’t AMP supposed to prevent this?
According to Google’s own Page Speed Insights audit (which Google recommends for checking performance), the AMP version of articles got an average performance score of 87. The non-AMP versions? 95.
Publishers who already have fast web pages—like The Guardian—are still compelled to make AMP versions of their stories because of the search benefits reserved for AMP. As Terence Eden reported from a meeting of the AMP advisory committee:
We heard, several times, that publishers don’t like AMP. They feel forced to use it because otherwise they don’t get into Google’s news carousel — right at the top of the search results.
Some people felt aggrieved that all the hard work they’d done to speed up their sites was for nothing.
The Google AMP team are at pains to point out that AMP is not a ranking factor in search. That’s true. But it is unfairly privileged in other ways. Only AMP pages can appear in the Top Stories carousel …which appears above any other search results. As I’ve said before:
Now, if you were to ask any right-thinking person whether they think having their page appear right at the top of a list of search results would be considered preferential treatment, I think they would say hell, yes! This is the only reason why The Guardian, for instance, even have AMP versions of their content—it’s not for the performance benefits (their non-AMP pages are faster); it’s for that prime real estate in the carousel.
Content that “opts in” to AMP and the associated hosting within Google’s domain is granted preferential search promotion, including (for news articles) a position above all other results.
That’s not the only way that AMP pages get preferential treatment. It turns out that the secret to the speed of AMP pages isn’t the web components. It’s the prerendering.
If you’ve ever seen an AMP page in a list of search results, you’ll have noticed the little lightning icon. If you’ve ever tapped on that search result, you’ll have noticed that the page loads blazingly fast!
That’s not down to AMP-the-format, alas. That’s down to the fact that the page has been prerendered by Google before you even went to it. If any page were prerendered that way, it would load blazingly fast. But currently, this privilege is reserved for AMP pages only.
If, after tapping through to that AMP page, you looked at the address bar of your browser, you might have noticed something odd. Even though you might have thought you were visiting The Washington Post, or The New York Times, the URL of the (blazingly fast) page you’re looking at is still under Google’s domain. That’s because Google hosts any AMP pages that it prerenders.
Google calls this “the AMP cache”, but it would be better described as “AMP hosting”. The web page sent down the wire is hosted on Google’s domain.
Here’s that AMP letter again:
When a user navigates from Google to a piece of content Google has recommended, they are, unwittingly, remaining within Google’s ecosystem.
Through gritted teeth, I will refer to this as “the AMP cache”, because that’s what everyone else calls it. But make no mistake, Google is hosting—not caching—these pages.
But why host the pages on a Google domain? Why not prerender the original URLs?
The pitch I think site owners are hearing is: let us host your pages on our domain and we’ll promote them in search results AND preload them so they feel “instant.” To opt-in, build pages using this component syntax.
But perhaps we could de-couple the AMP format from the AMP cache.
That’s what Terence suggests:
My recommendation is that Google stop requiring that organisations use Google’s proprietary mark-up in order to benefit from Google’s promotion.
Instead of granting premium placement in search results only to AMP, provide the same perks to all pages that meet an objective, neutral performance criterion such as Speed Index.
It’s been said before but it would be so good for the web if pages with a Lighthouse score over say, 90 could get into that top search result area, even if they’re not built using Google’s AMP framework. Feels wrong to have to rebuild/reproduce an already-fast site just for SEO.
Here’s the problem…
Let’s say Google do indeed prerender already-fast pages when they’re listed in search results. You, a search user, type something into Google. A list of results comes back. Google begins prerendering some of them. But you don’t end up clicking through to those pages. Nonetheless, the servers those pages are hosted on have received a GET request coming from a Google search. Those publishers now know that a particular (cookied?) user could have clicked through to their site. That’s very different from knowing when someone has actually arrived at a particular site.
And that’s why Google host all the AMP pages that they prerender. Given the privacy implications of prerendering non-Google URLs, I must admit that I see their point.
Still, it’s a real shame to miss out on the speed benefit of prerendering:
Prerendering AMP documents leads to substantial improvements in page load times. Page load time can be measured in different ways, but they consistently show that prerendering lets users see the content they want faster. For now, only AMP can provide the privacy preserving prerendering needed for this speed benefit.
Why is Google’s AMP cache just for AMP pages? (Y’know, apart from the obvious answer that it’s in the name.)
What if Google were allowed to host non-AMP pages? Google search could then prerender those pages just like it currently does for AMP pages. There would be no privacy leaks; everything would happen on the same domain—google.com or ampproject.org or whatever—just as currently happens with AMP pages.
Don’t get me wrong: I’m not suggesting that Google should make a 1:1 model of the web just to prerender search results. I think that the implementation would need to have two important requirements: the hosting would have to be opt-in, and only fast pages would qualify.
The opt-in could be a meta element. Maybe something like:
<meta name="caches-allowed" content="google">
This would have the nice benefit of allowing comma-separated values:
<meta name="caches-allowed" content="google, yandex">
(The name is just a strawman, by the way—I’m not suggesting that this is what the final implementation would actually look like.)
If not a meta element, then perhaps this could be part of robots.txt? Although my feeling is that this needs to happen on a document-by-document basis rather than site-wide.
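To continue the strawman, a robots.txt version might look something like this (purely hypothetical syntax; no such directive exists):

User-agent: *
Caches-allowed: google, yandex

But robots.txt is site-wide by nature, which is exactly why a per-document mechanism like a meta element seems like a better fit.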
Many people will, quite rightly, never want Google—or anyone else—to host and serve up their content. That’s why it’s so important that this behaviour needs to be opt-in. It’s kind of appalling that the current hosting of AMP pages is opt-in-by-proxy-sort-of.
Which pages should be blessed with hosting and prerendering? The fast ones. That’s sorta the whole point of AMP. But right now, there’s a lot of resentment by people with already-fast websites who quite rightly feel they shouldn’t have to use the AMP format to benefit from the AMP ecosystem.
Page speed is already a ranking factor. It doesn’t seem like too much of a stretch to extend its benefits to hosting and prerendering. As mentioned above, there are already a few possible metrics to use: Page Speed Insights scores, Speed Index, or Lighthouse scores.
Ah, but what if a page has a good score when it’s indexed, but then gets worse afterwards? Not a problem! The version of the page that’s measured is the same version of the page that gets hosted and prerendered. Google can confidently say “This page is fast!” After all, they’re the ones serving up the page.
That does raise the question of how often Google should check back with the original URL to see if it has changed/worsened/improved. The answer to that question is however long it currently takes to check back in on AMP pages:
Each time a user accesses AMP content from the cache, the content is automatically updated, and the updated version is served to the next user once the content has been cached.
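That update model is essentially HTTP’s stale-while-revalidate pattern: serve the stored copy immediately, then fetch a fresh one in the background for the next visitor. In standard HTTP caching terms (RFC 5861), a rough equivalent would be a response header like:

Cache-Control: max-age=0, stale-while-revalidate=86400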
This proposal does not solve the problem with the address bar. You’d still find yourself looking at a page from The Washington Post or The New York Times (or adactio.com) but seeing a completely different URL in your browser. That’s not good, for all the reasons outlined in the AMP letter.
In fact, this proposal could potentially make the situation worse. It would allow even more sites to be impersonated by Google’s URLs. Where currently only AMP pages are bad actors in terms of URL confusion, opening up the AMP cache would allow equal opportunity URL confusion.
What I’m suggesting is definitely not a long-term solution. The long-term solutions currently being investigated are technically tricky and will take quite a while to come to fruition—web packages and signed exchanges. In the meantime, what I’m proposing is a stopgap solution that’s technically a lot simpler. But it won’t solve all the problems with AMP.
This proposal solves one problem—AMP pages being unfairly privileged in search results—but does nothing to solve the other, perhaps more serious problem: the erosion of site identity.
Currently, Google can assess whether a page should be hosted and prerendered by checking to see if it’s a valid AMP page. That test would need to be widened to include a different measurement of performance, but those measurements already exist.
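For context, checking AMP validity is cheap: an AMP page declares itself in its root element and can be validated statically, without loading it in a browser at all:

<html ⚡ lang="en">

Measuring performance, by contrast, means actually loading the page and timing it.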
I can see how this assessment might not be as quick as checking for AMP validity. That might affect whether non-AMP pages could be measured quickly enough to end up in the Top Stories carousel, which is, by its nature, time-sensitive. But search results are not necessarily as time-sensitive. Let’s start there.
Currently, AMP pages can be prerendered without fetching anything other than the markup of the AMP page itself. All the CSS is inline. There are no initial requests for other kinds of content like images. That’s because there are no img elements on the page: authors must use amp-img instead. The image itself isn’t loaded until the user is on the page.
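For instance, where a regular page would use an img element, an AMP page uses the amp-img component. The explicit dimensions let the AMP runtime reserve layout space without requesting the image itself (the filename here is illustrative):

<!-- Regular HTML: the browser fetches the image as soon as it parses this -->
<img src="photo.jpg" alt="A photo" width="600" height="400">

<!-- AMP: layout space is reserved, but the image is only fetched once the user is actually on the page -->
<amp-img src="photo.jpg" alt="A photo" width="600" height="400" layout="responsive"></amp-img>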
If the AMP cache were to be opened up to non-AMP pages, then any content required for prerendering would also need to be hosted on that same domain. Otherwise, there’s privacy leakage.
This definitely introduces an extra level of complexity. Paths to assets within the markup might need to be re-written to point to the Google-hosted equivalents. There would almost certainly need to be a limit on the number of assets allowed. Though, for performance, that’s no bad thing.
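As an illustration of the kind of rewriting involved, here’s what it might look like for a non-AMP page. The rewritten URL is modelled on what the AMP cache already does for assets, simplified here:

<!-- Original markup points at the publisher’s server -->
<img src="https://example.com/assets/photo.jpg">

<!-- Rewritten to a Google-hosted copy, so prerendering never touches the publisher’s server -->
<img src="https://example-com.cdn.ampproject.org/i/s/example.com/assets/photo.jpg">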
Make no mistake, figuring out what to do about assets—style sheets, scripts, and images—is very challenging indeed. Luckily, there are very smart people on the Google AMP team. If that brainpower were to focus on this problem, I am confident they could solve it.
There will be technical challenges, but hopefully nothing insurmountable.
I honestly can’t see what Google have to lose here. If their goal is genuinely to reward fast pages, then opening up their AMP cache to fast non-AMP pages will actively encourage people to make fast web pages (without having to switch over to the AMP format).
I’ve deliberately kept the details vague—what the opt-in should look like; what the speed measurement should be; how to handle assets—I’m sure smarter folks than me can figure that stuff out.
I would really like to know what other people think about this proposal. Obviously, I’d love to hear from members of the Google AMP team. But I’d also love to hear from publishers. And I’d very much like to know what people in the web performance community think about this. (Write a blog post and send me a webmention.)
What am I missing here? What haven’t I thought of? What are the potential pitfalls (and are they any worse than the current acrimonious situation with Google AMP)?
I would really love it if someone with a fast website were in a position to say, “Hey Google, I’m giving you permission to host this page so that it can be prerendered.”
I would really love it if someone with a slow website could say, “Oh, shit! We’d better make our existing website faster or Google won’t host our pages for prerendering.”
And I would dearly love to finally be able to embrace AMP-the-format with a clear conscience. But as long as prerendering is joined at the hip to the AMP format, the injustice of the situation only harms the AMP project.
Google, open up the AMP cache.
Reinventing the web the long way around, in a way that gives Google even more control of it. No thanks.