Jonathan Hassell is going to talk about user-generated content. He’s from the BBC. Aren’t we all, dear, aren’t we all? Specifically he’s with the User Experience and Design department.
Now onto user-generated content. He reiterates what I was saying about the importance of open content. Content must be accessible even before you put an interface on it. All the interface layers need to work together: web page, web browser and operating system. But we’re here to talk about content, not interface.
Blogs, Bebo and YouTube contain user-generated content but even basic accessibility hooks like alt text are missing. Whose job is it? There are two things: the tools and the site. Again, he reiterates a point from my keynote: ATAG is more relevant than WCAG. Yes, it is up to the site owners to provide the ability to make accessible content. But is it their responsibility to actually add the accessibility hooks to the user’s content? The DDA is very unclear on this point. It’s like the argument about whether ISPs are responsible for customers accessing illegal content.
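One way a site owner could meet the ATAG side of that bargain is to prompt for a description at upload time rather than trying to retrofit it later. A minimal sketch of what that check might look like (the interface and function names are my own illustration, not anything shown in the talk):

```typescript
// Hypothetical upload record: what the user submits with an image.
interface ImageUpload {
  filename: string;
  altText?: string;
}

// Return a list of accessibility problems to show the user before
// accepting the upload; an empty list means the upload can proceed.
function validateUpload(upload: ImageUpload): string[] {
  const problems: string[] = [];
  if (!upload.altText || upload.altText.trim() === "") {
    problems.push(
      "Please describe this image for people who can't see it."
    );
  }
  return problems;
}
```

The point is that the tool nudges the author at the moment of creation, which is exactly the division of labour ATAG describes: the site provides the mechanism, the user provides the description.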
Jonathan is posing a lot of questions here today. He wants to know if disabled users will be left behind by Web 2.0. Talking about people with literacy difficulties, he points to the lack of spell-checking in textareas on social networking sites like Facebook (hmmm… I think this is an OS issue myself).
Here’s an interesting twist: BSL users are putting videos on YouTube. Who provides the text transcription or voiceover?
Jonathan thinks that the Assistive Technology chain has broken down. Modern AT can’t handle non-text content like video and games well.
But it isn’t all bad news. Remember, the opportunities offered by rich media like video are a boon to people with learning disabilities. And video offers BSL users the opportunity to get their language out there on the Web for the first time.
Let’s ditch the phrase “it isn’t accessible.” Nothing is accessible to everyone. Instead, let’s say “it isn’t usable by someone with this particular disability.”
It’s hard enough for organisations to provide transcripts and captioning; what about when it’s user-generated content? You can engage the community but even then, it will always be behind the original rich media.
Now Jonathan steps beyond inclusion and looks to the future. He shows some games that have been created for deaf children. He demos a game that is accessible to children learning BSL. You construct sentences with nouns, verbs and tenses; then click a button to see that sentence signed by a cartoon character. (This is pretty cool. Frankly, I could imagine using this myself just for the fun of it.)
One last demo. It’s a science game for blind children who have never used a computer. The game must explain the grammar of 30 years of computer games while explaining scientific concepts like force and inertia. The visual elements exist purely for anyone accompanying the blind user. This is a fully-fledged game with mechanics, physics and feedback… all using stereo sound. Tones, words and direction are used to create an interactive environment. Done well, sound can be layered to provide a lot of information. Just imagine how this could be applied to virtual worlds like Second Life.
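The direction cue in a game like that can be surprisingly simple: map an object’s horizontal position to stereo pan. A toy sketch of that mapping (my own guess at the technique, not the BBC’s code; in a browser the resulting value could feed something like the Web Audio API’s StereoPannerNode, whose pan parameter runs from -1 to 1):

```typescript
// Map an object's horizontal position on screen to a stereo pan value.
// x is a position in [0, width]; returns -1 (hard left) to 1 (hard right),
// so a blind player can locate the object by ear alone.
function panForPosition(x: number, width: number): number {
  if (width <= 0) {
    throw new Error("width must be positive");
  }
  // Clamp so off-screen objects sit at the nearest edge of the stereo field.
  const clamped = Math.min(Math.max(x, 0), width);
  return (clamped / width) * 2 - 1;
}
```

Layering pitch (for distance, say) and spoken words on top of that one panned tone is how a purely auditory scene can carry as much game state as a visual one.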
That was an inspiring way to end!