Tags: seo


Sunday, October 29th, 2017

The meaning of AMP

Ethan quite rightly points out some semantic sleight of hand by Google’s AMP team:

But when I hear AMP described as an open, community-led project, it strikes me as incredibly problematic, and more than a little troubling. AMP is, I think, best described as nominally open-source. It’s a corporate-led product initiative built with, and distributed on, open web technologies.

But so what, right? Tom-ay-to, tom-a-to. Well, here’s a pernicious example of where it matters: in a recent announcement of their intent to ship a new addition to HTML, the Google Chrome team cited the mood of the web development community thusly:

Web developers: Positive (AMP team indicated desire to start using the attribute)

If AMP were actually the product of working web developers, this justification would make sense. As it is, we’ve got one team at Google citing the preference of another team at Google but representing it as the will of the people.

This is just one example of AMP’s sneaky marketing where some finely-shaved semantics allows them to appear far more reasonable than they actually are.

At AMP Conf, the Google Search team were at pains to repeat over and over that AMP pages wouldn’t get any preferential treatment in search results …but they appear in a carousel above the search results. Now, if you were to ask any right-thinking person whether they think having their page appear right at the top of a list of search results would be considered preferential treatment, I think they would say hell, yes! This is the only reason why The Guardian, for instance, even have AMP versions of their content—it’s not for the performance benefits (their non-AMP pages are faster); it’s for that prime real estate in the carousel.

The same semantic nit-picking can be found in their defence of caching. See, they’ve even got me calling it caching! It’s hosting. If I click on a search result, and I am taken to a page that has a URL beginning with https://www.google.com/amp/s/... then that page is being hosted on the domain google.com. That is literally what hosting means. Now, you might argue that the original version was hosted on a different domain, but the version that the user gets sent to is the Google copy. You can call it caching if you like, but you can’t tell me that Google aren’t hosting AMP pages.

That’s a particularly low blow, because it’s such a bait’n’switch. One of the reasons why AMP first appeared to be different to Facebook Instant Articles or Apple News was the promise that you could host your AMP pages yourself. That’s the very reason I first got interested in AMP. But if you actually want the benefits of AMP—appearing in the not-search-results carousel, pre-rendered performance, etc.—then your pages must be hosted by Google.

So, to summarise, here are three statements that Google’s AMP team are currently peddling as being true:

  1. AMP is a community project, not a Google project.
  2. AMP pages don’t receive preferential treatment in search results.
  3. AMP pages are hosted on your own domain.

I don’t think those statements are even truthy, much less true. In fact, if I were looking for the right term to semantically describe any one of those statements, the closest in meaning would be this:

A statement used intentionally for the purpose of deception.

That is the dictionary definition of a lie.

Update: That last part was a bit much. Sorry about that. I know it’s a bit much because The Register got all gloaty about it.

I don’t think the developers working on the AMP format are intentionally deceptive (although they are engaging in some impressive cognitive gymnastics). The AMP ecosystem, on the other hand, that’s another story—the preferential treatment of Google-hosted AMP pages in the carousel and in search results; that’s messed up.

Still, I would do well to remember that there are well-meaning people working on even the fishiest of projects.

Except for the people working at the shitrag that is The Register.

(The other strong signal that I overstepped the bounds of decency was that this post attracted the pond scum of Hacker News. That’s another place where the “well-meaning people work on even the fishiest of projects” rule definitely doesn’t apply.)

Tuesday, September 5th, 2017

AMPersand. — Ethan Marcotte

I’ve had a few conversations with members of the Google AMP team, and I do believe they care about making the web better. But given how AMP pages are privileged in Google’s search results, the net effect of the team’s hard, earnest work comes across as a corporate-backed attempt to rewrite HTML in Google’s image. Now, I don’t know if these new permutations of AMP will gain traction among publishers. But I do know that no single company should be able to exert this much influence over the direction of the web.

Thursday, March 23rd, 2017

Need to Catch Up on the AMP Debate? | CSS-Tricks

Funnily enough, I led a brown bag lunch discussion about AMP at work just the other day. A lot of it mirrored Chris’s thoughts here. It’s a complicated situation that has lots of people worried.

Tuesday, January 3rd, 2017

Does Google execute JavaScript? | Stephan Boyer

Google may or may not decide to run your JavaScript, and you don’t want your business to depend on its particular inclination of the day. Do server-side/universal/isomorphic rendering just to be safe.

Wednesday, August 24th, 2016

Official Google Webmaster Central Blog: Helping users easily access content on mobile

Two pieces of good news from Google:

  1. 85% of websites qualify as mobile-friendly, so there’s no longer a need to explicitly label them as such in search results.
  2. Google will down-rank sites that have annoying pop-overs demanding you download an app or sign up to an email newsletter when you’re trying to read the damn page.

Wednesday, March 9th, 2016

An update (March 2016) on the current state & recommendations for JavaScript …

Making web apps? Care about SEO? Here’s Google’s advice:

Use feature detection & progressive enhancement techniques to make your content available to all users.

Tuesday, March 1st, 2016

Spamduffing

Running The Session and Huffduffer is immensely rewarding …most of the time. There are occasions when the actions of a few bad apples make it a real pain in the bum.

Yes, I’m talking about SEO spammers.

Huffduffer tends to get it worse than The Session, but even then it’s fairly manageable—just a sign-up or two here or there. This weekend though, there was a veritable spam tsunami. I was up late on Friday night playing a constant game of whack-a-mole with thousands of spam postings by newly-created accounts. (I’m afraid I may have inadvertently deleted some genuine new accounts in the trawl; if you signed up for Huffduffer last Friday and can’t access your account now, I’m really, really sorry.)

Normally these spam SEO accounts would have some pattern to them—either they’d be from the same block of IP addresses or they’d have similar emails. But these all looked different enough to thwart any quick fixes. I knew I’d be spending my Saturday writing some spam-blocking code.

Most “social” websites have a similar sign-up flow: you fill in a form with your details (including your email address), and then you have to go to your email client to click a link to verify that you are indeed who you claim to be. The cynical side of me thinks that this is mostly to verify that you’re providing a genuine email address so that the site can send you marketing crap.

Neither Huffduffer nor The Session includes that second step of confirming your email address. The only reason for providing your email address is so that you can reset your password if you ever forget it.

I’ve always felt that making a new user break out of the sign-up flow to go check their email was a bit shit. It also strikes me as following the same logic as CAPTCHAs (which I hate): “Because of the bad actions of a minority, we’re going to punish the majority by making them prove to us that they’re human.” It’s such a machine-centric way of thinking.

But after the splurge of spam on Huffduffer, I figured I’d have no choice but to introduce that extra step. Just as I was about to start coding, I thought to myself “No, this is wrong. There must be another way.”

I thought a bit more about the problem. The issue wasn’t so much about spam sign-ups per se. Like I said, there’s always been a steady trickle and it isn’t too onerous to find them and delete them. The problem was the sheer volume of spam posts in a short space of time.

I ended up writing some different code with this logic (sketched below):

  1. When someone posts to Huffduffer, check to see if they’ve posted at least ten items in the past;
  2. If they have, grab the timestamps for the last ten posts;
  3. Calculate the cumulative elapsed time between those ten posts;
  4. If it’s less than 100 seconds (i.e. an average of one post every ten seconds), delete the user …and delete everything they’ve ever posted.
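
Off the top of my head, that logic might look something like this (a minimal sketch in Python, not Huffduffer’s actual code; the names and the data structure are mine):

    from datetime import timedelta

    POST_THRESHOLD = 10                  # only judge accounts with at least ten posts
    TIME_LIMIT = timedelta(seconds=100)  # ten posts inside 100 seconds: one every ten seconds

    def is_spam_account(timestamps):
        """timestamps: the datetimes of a user's posts, newest first."""
        if len(timestamps) < POST_THRESHOLD:
            return False  # not enough history to judge yet
        last_ten = timestamps[:POST_THRESHOLD]
        elapsed = last_ten[0] - last_ten[-1]  # cumulative time across those ten posts
        return elapsed < TIME_LIMIT

    # Run this check on every new post: if it returns True, delete the
    # user ...and everything they've ever posted.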

It worked. I watched as new spam sign-ups began to hammer the site with spam postings …only to self-destruct when they hit the critical mass of posts over time.

I’m still getting SEO spammers signing up but now they’re back to manageable levels. I’m glad that I didn’t end up having to punish genuine new users of Huffduffer for the actions of a few SEO marketing bottom-feeders.

Wednesday, January 13th, 2016

Delicious Changes | The Official Delicious Blog

The first big change you’ll notice is our transition from the javascript front-end framework that has been powering the content at https://www.delicious.com. The engineers who crafted this version of the site are incredibly talented, and their code is amazing. It’s beautiful and powerful, but it has posed several significant challenges for us. For example, the search engines have a real problem reading our content, hindering users’ efforts to use Google or Bing to find what they’re looking for on Delicious.

Saturday, October 17th, 2015

Official Google Webmaster Central Blog: Deprecating our AJAX crawling scheme

It’s official: hash bang URLs are an anti-pattern, and if you want your content indexed by Google, use progressive enhancement:

Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement.
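
For reference, that 2009 proposal had crawlers rewrite hash bang URLs into an “_escaped_fragment_” query string, which the server then had to answer with rendered HTML. A rough sketch of the URL mapping (my own illustration, not Google’s code):

    def escaped_fragment_url(url):
        # Under the deprecated scheme, http://example.com/#!/about
        # was fetched by the crawler as
        # http://example.com/?_escaped_fragment_=/about
        base, _, fragment = url.partition("#!")
        if not fragment:
            return url  # no hash bang: crawl the URL as-is
        return base + "?_escaped_fragment_=" + fragment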

Friday, December 19th, 2014

Implement Server-Side Rendering for SEO · Issue #9938 · emberjs/ember.js

The motivation seems entirely misplaced to me (SEO? Really?) but never mind: the end result could be the holy grail of JavaScript MVC frameworks — code that runs on the server and the client. That would get you the reach and initial rendering speed of progressive enhancement, combined with the power of client-side application logic once the page has loaded.

Watch this space.

Monday, October 27th, 2014

Official Google Webmaster Central Blog: Updating our technical Webmaster Guidelines

Google has updated its advice to people making websites, who might want to have those sites indexed by Google. There are two simple bits of advice: optimise for performance, and use progressive enhancement.

Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported.

Thursday, November 7th, 2013

Prerender - AngularJS SEO, BackboneJS SEO, or EmberJS SEO

I despair sometimes.

Here’s a ridiculous Heath-Robinsonesque convoluted way of getting the mighty all-powerful Googlebot to read the web thangs you’ve built using the new shiny client-side frameworks like Angular, Ember, Backbone…

Here’s another idea: output your HTML in HTML.

That solution works for machines and humans. As a bonus, outputting your HTML in HTML avoids turning JavaScript into a single point of failure.
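
To illustrate, here’s about the smallest server-rendered page I can think of (a sketch in Python using only the standard library; the content is made up):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    THANGS = ["one", "two", "three"]  # stand-in for real content

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The markup is assembled on the server, so every visitor,
            # Googlebot included, receives finished HTML; no client-side
            # JavaScript is needed to see the content.
            items = "".join("<li>%s</li>" % thang for thang in THANGS)
            html = "<!DOCTYPE html><title>Thangs</title><ul>%s</ul>" % items
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.end_headers()
            self.wfile.write(html.encode("utf-8"))

    HTTPServer(("", 8000), Handler).serve_forever()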

Wednesday, June 12th, 2013

Common mistakes in smartphone sites

Good news from Google: it’s going to start actively penalising sites for perpetrating the worst practices for mobile, e.g. redirecting a specific “desktop” URL to the homepage of the mobile site, or shoving a doorslam “download our app” message at users.

I wish that we could convince people not to do that crap on the basis of it being, well, crap. But when all else fails, saying “Google says so” carries a lot of weight (see also: semantics, accessibility, yadda, yadda, yadda).

Monday, October 18th, 2010

End hover abuse now : Cennydd Bowles on user experience

An excellent little rant by Cennydd that I agree with 100%: hovering does not demonstrate user intent.

Thursday, July 8th, 2010

Non Hover | Trent Walton

A timely reminder: don't hide information behind mouseover events.

Tuesday, November 3rd, 2009

Perfect Pitch

We were having a chat in the Clearleft office today about site stats and their relative uselessness; numbers about bounce rates are like eyetracking data—without knowing the context, they’re not going to tell you anything.

Anyway, I was reminded that I have an account over at Google Webmaster Tools set up for three of my sites: adactio.com, huffduffer.com and thesession.org. I logged in today for the first time in ages and started poking around.

I noticed that I had some unread messages. Who knew that Google Webmaster Tools has a messaging system? I guess all software really does evolve until it can send email.

One of the messages had the subject line Blocked URLs:

For legal reasons, we’ve excluded from our search results content located at or under the following URL/directory:

http://www.thesession.org/discussions/display/21250

This content has been removed from all Google search results.

Cause: Someone has filed a DMCA complaint against your site.

What now?

I visited the URL and found a fairly tame discussion about Perfect Pitch. Here’s the only part of the discussion that references an external resource in a non-flattering light:

I think that is referring to www.PerfectPitch.com. I’m not saying anything about such commercially-oriented courses because I don’t know them, but I think we’d all be wise to bear in mind the general comments voiced in the first two posts on this thread.

That single reference to a third-party site is, apparently, enough to trigger a DMCA complaint.

Google link to the complaint on Chilling Effects but that just says The cease-and-desist or legal threat you requested is not yet available. It does, however, list the party who sent the complaint: Boucherle.

By a staggering coincidence, Gary Boucherle of American Educational Music, Inc. is registered as the owner of perfectpitch.com.

So let’s get this straight. In a discussion about perfect pitch, someone mentions the website perfectpitch.com. They don’t repost any materials from the site. They don’t even link to the site. They don’t really say anything particularly disparaging. But all it takes is for the owner of perfectpitch.com to abuse the Digital Millennium Copyright Act with a spurious complaint and, just like that, Google removes the discussion from its search index.

To be fair, Google also explain how to file a counter-complaint. However, the part about agreeing to potentially show up in a court in California is somewhat off-putting for those of us, like me, who live outside the United States of America.

There is another possible explanation for this insane over-reaction; one that would explain why the offended party sent the complaint to Google rather than going down the more traditional route of threatening the ISP.

The Session has pretty good Google juice. The markup is pretty lean, the content is semantically structured, and there are plenty of inbound links. Could it be that the owner of perfectpitch.com sent a DMCA complaint to Google simply because another site was getting higher rankings for the phrase “perfect pitch”? If so, then that’s a whole new level of SEO snake-oilery.

Hmmm… that gives me an idea.

If you have a blog or other personal publishing platform, perhaps you would like to write a post titled Perfect Pitch? Feel free to republish anything from this post, which is also coincidentally titled Perfect Pitch. And feel free to republish the contents of the original discussion on The Session titled, you guessed it: Perfect Pitch.

Update: Thanks for the inbound links, everyone. The matter is now being resolved. I have received an apology from Gary Boucherle, who was being more stupid than evil.

Wednesday, October 14th, 2009

Optimisation

Derek Powazek gave up smoking recently so any outward signs of irritability should be forgiven. That said, the anger in two of his recent posts is completely understandable: Spammers, Evildoers, and Opportunists and the follow-up, SEO FAQ.

His basic premise is that money spent on hiring someone who labels themselves as an SEO expert would be better spent on producing well marked-up relevant content. I think he’s right. In the comments, the more reasonable remarks are based on semantics. Good SEO, they argue, is all about producing well marked-up relevant content.

Fair enough. But does it really need its own separate label? Personally, I would suggest hiring a good content strategist or copywriter over an SEO consultant any day. Here’s why:

Google—or at least the search arm of the company—is dedicated to a simple goal: giving people the most relevant content for their search. Google search is facilitated by ‘bots and algorithms, but it is fundamentally very human-centric.

Search Engine Optimisation is an industry based around optimising for the ‘bots and algorithms at Google.

But if those searchbots are dedicated to finding the best content for humans, why not cut out the middleman and go straight to optimising for humans?

If you optimise for people, which usually involves producing well marked-up relevant content, then you will get the approval of the ‘bots and algorithms by default …because that’s exactly the kind of content that they are trying to find and rank. This is the approach taken by Aarron Walter in his excellent book Building Findable Websites.

On Twitter, Mike Migurski said:

I think SEO is just user-centered design for robots.

…which would make it robot-centred design. But that’s only half the story. SEO is really robot-centred design for robots that are practising user-centred design.

Ask yourself this: do you think Wikipedia ever hired an SEO consultant in order to get its high rankings on Google?

Monday, May 4th, 2009

Dyson ball

When I was in Japan last year, I noticed that most advertisements don’t mention URLs. Instead, they simply show what to search for. The practice seems to be gaining ground over here too. Advertising for the government’s Act on CO2 campaign didn’t include a URL—just an entreaty to search for the phrase.

The current television advertising for the latest Dyson vacuum cleaner finishes with the message to search for “dyson ball.” Sure enough, the number one search result goes straight to the Dyson website …for now. That might change if Google were to implement any kind of smart synonym swapping. There would be quite a difference in scale if the word “ball” were interchangeable with the word “sphere.”

Thursday, January 31st, 2008

Waxy.org: Daily Log: The Times (UK) Spamming Social Media Sites

Andy Baio does a nice bit of investigative journalism in exposing the social network spammer hired by The Times. The internet treats crass marketing as damage and routes around it.

Tuesday, October 16th, 2007

Google

What would happen if Google tried to apply SEO techniques to itself?