I’ve been laid low by a cold, all sniffles and a blocked nose. That’s why I wasn’t able to make it to last night’s Skillswap talk by Rosie Freshwater on Search Engine Optimisation.
There was a great crossover in our objectives: I wanted to make well-structured, semantically accurate documents free of code-bloat and she wanted to have well-written search-engine friendly pages. A combination of XHTML and CSS served us both well. As an added bonus, the site also fares well in the accessibility stakes.
In fact, this is the tack I take when I’m trying to convince people (clients, fellow developers) that websites should be accessible. Appeals on behalf of visitors using screen readers or text-only browsers rarely get much traction on their own.

But once I explain that the Googlebot is just such a device, effectively a blind user browsing with a text-only client, I find that people are suddenly much more interested in this whole accessibility/standards compliance thing.
"Although I could list about a thousand reasons that sites should be built with XHTML/CSS, the main motivation, and reason the client would accept a higher browser requirement (5.0 and up) was search engine rankings. I’m no search engine expert myself, but I believe with less source code for spiders to look at, semantic hints such as <h> tags and page content near the top of the file, we should do well."
It’s turning out to be a great resource for all things related to design, web standards and accessibility: a resource that I can then plunder for myself.