Tags: sonification, sparkline

Sonic sparklines

I’ve seen some lovely examples of the Web Audio API recently.

At the Material conference, Halldór Eldjárn demoed his Poco Apollo project. It generates music on the fly in the browser to match a random image from NASA’s Apollo archive on Flickr. Brian Eno, eat your heart out!

At Codebar Brighton a little while back, local developer Luke Twyman demoed some of his audio-visual work, including the gorgeous Solarbeat—an audio orrery.

The latest issue of the Clearleft newsletter also has some links on sound design in interfaces.

I saw Ruth give a fantastic talk on the Web Audio API at CSS Day this year. It had just the right mixture of code and inspiration. I decided there and then that I’d have to find some opportunity to play around with web audio.

As ever, my own website is the perfect playground. I added an audio Easter egg to adactio.com a while back, and so far, no one has noticed. That’s good. It’s a very, very silly use of sound.

In her talk, Ruth emphasised that the Web Audio API is basically just about dealing with numbers. Lots of the examples of nice usage are the audio equivalent of data visualisation. Data sonification, if you will.

I’ve got little bits of dataviz on my website: sparklines. Each one is a self-contained SVG file. I added a script element to the SVG with a little bit of JavaScript that converts numbers into sound. (I kind of wish that the script were scoped to the containing SVG, but that’s not the way JavaScript in SVG works—it’s no different to putting a script element directly in the body.) Clicking on the sparkline triggers the sound-playing function.
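If you’re curious what that kind of script looks like, here’s a minimal sketch of the general technique—not my actual code, and the data array, frequency range, and note length are all assumptions for illustration. It maps each data point onto a pitch and steps an oscillator through them:

```javascript
// Map a value in [min, max] linearly onto a frequency range.
// The 220–880 Hz range (A3 to A5) is an arbitrary choice.
function valueToFrequency(value, min, max, lowHz = 220, highHz = 880) {
  if (max === min) return lowHz; // flat data: just play the low note
  return lowHz + ((value - min) / (max - min)) * (highHz - lowHz);
}

// Play each data point in turn by stepping an oscillator's frequency.
// Browser-only: relies on the Web Audio API's AudioContext.
function playSparkline(data, noteLength = 0.1) {
  const context = new AudioContext();
  const oscillator = context.createOscillator();
  const min = Math.min(...data);
  const max = Math.max(...data);
  oscillator.type = 'sine';
  data.forEach((value, i) => {
    oscillator.frequency.setValueAtTime(
      valueToFrequency(value, min, max),
      context.currentTime + i * noteLength
    );
  });
  oscillator.connect(context.destination);
  oscillator.start();
  oscillator.stop(context.currentTime + data.length * noteLength);
}
```

Wire `playSparkline` up to a click handler on the SVG and you get the theremin-with-hiccups effect: each month’s count becomes a brief pitch. Swapping `setValueAtTime` for `linearRampToValueAtTime` would glide between values instead of stepping.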

It sounds terrible. It’s like a theremin with hiccups.

Still, I kind of like it. I mean, I wish it sounded nicer (and I’m open to suggestions on how to achieve that—feel free to fork the code), but there’s something endearing about hearing a month’s worth of activity turned into a wobbling wave of sound. And it’s kind of fun to hear how a particular tag is used more frequently over time.

Anyway, it’s just a silly little thing, but anywhere you spot a sparkline on my site, you can tap it to hear it translated into sound.

Connections: Weak Signals

Tuesday evening saw the inaugural Connections event at 68 Middle Street, home to Clearleft. It was a rousing success—much fun was had by all.

There was a great turnout. Normally I’d expect a fairly significant no-show rate for a free event (they’re often oversubscribed to account for this very reason), but I was amazed how many people braved the dreadful weather to come along. We greeted them all with free beer, courtesy of Clearleft.

The talks had a nice yin and yang quality to them. Honor talked about darkness. Justin talked about light. More specifically, Honor talked about dark matter and Justin talked about Solarpunk.

Honor made plentiful use of sound during her presentation. Or rather, plentiful use of electromagnetic signals converted into sound: asteroseismology from the sun; transient luminous events in the Earth’s upper atmosphere; the hailstorm as Cassini pirouettes through Saturn’s rings; subatomic particle collisions sonified. They all combined to eerie effect.

Justin’s talk was more down to Earth, despite sounding like a near-future science-fiction scenario: individuals and communities harnessing the power of the photovoltaic solar panel to achieve energy-independence.

There was a beer break between the talks and we had a joint discussion afterwards, with questions from the audience. I was leading the discussion, and to a certain extent, I played devil’s advocate to Justin’s ideas, countering his solar energy enthusiasm with nuclear energy enthusiasm—I’m on Team Thorium. (Actually, I wasn’t really playing devil’s advocate. I genuinely believe that nuclear energy is the cleanest, safest source of energy available to us and that an anti-nuclear environmentalist is a contradiction in terms—but that’s a discussion for another day.)

There was a bittersweet tinge to the evening. The first Connections event was also Honor’s last public speaking engagement in Brighton for a while. She is bidding farewell to Lighthouse Arts and winging her way to a new life in Singapore. We wish her well. We will miss her.

The evening finished with a facetious rhetorical question from the audience for Honor. It was related to the sonification of particle collisions like the ones that produced evidence for “the God particle”, the Higgs boson. “Given that the music produced is so unmusical”, went the question, “does that mean it’s proof that God doesn’t exist?”

We all had a laugh and then we all went to the pub. But I’ve been thinking about that question, and while I don’t have an answer, I do have a connection to make between both of the talks and algorithmically-generated music. Here goes…

Justin talked about the photovoltaic work done at Bell Labs. An uncle of Ray Kurzweil worked at Bell Labs and taught the young Kurzweil the basics of computer science. Soon after, Ray Kurzweil wrote his first computer program, one that analysed works of classical music and then generated its own music. Here it is.