
The Man in Blue

Distilling fine web design since 1863


Deep + Vocal + Bliss DJ Mix

It's been a while since I did a house music DJ mix, so I thought I'd put together some tunes that I'm feeling at the start of 2013. This one starts off deep, goes vocal and ends on a blissful high, so take a listen, or download all 50 minutes as an MP3.


  1. Per Byhring - Ettertid (Russ Chimes Remix)
  2. Duke Dumont - Need U 100%
  3. Martyn - We Are You In The Future
  4. Bob Sinclar - I Feel for You (Ben Delay Dub Mix)
  5. Julio Bashmore - Au Seve
  6. Candi Staton - Hallelujah Anyway (Larse Vocal Mix)
  7. Hot Chip - How Do You Do (Todd Terje Remix)
  8. Max Lyazgin & John Martin - Good Morning
  9. The Other Tribe - Sing With Your Feet (Instrumental)
  10. M83 - Midnight City (Sharam Jey Remix)
  11. Tensnake - Coma Cat (Stanton Warriors Re Bump)
  12. The Presets - Promises (Lifelike Remix)


For a new project I'm working on I'd like to create an audio-reactive visualisation using analysis of the music that's playing. The new Web Audio API lets you do this pretty easily in the browser (Webkit-only at the moment).

I'm posting my documented code here with an explanation for anyone else who would like an easy snippet to get them started down the music visualisation path. You can view a demo here and download the source code here.

The main concept in the Web Audio API is that you wire up a bunch of modules together, the output from one module going into the input of another. Here we take the output of an HTML5 audio element, feed it through an analyser node, and then connect that on to the context's destination (your speakers).

The code for all that looks something like this:

  var audio = document.getElementById('music');
  var audioContext = new webkitAudioContext();
  var analyser = audioContext.createAnalyser();
  var source = audioContext.createMediaElementSource(audio);
  source.connect(analyser);
  analyser.connect(audioContext.destination);

Once you've done that, the analyser is listening for any output from the audio element.

Once the audio starts playing you can query the analyser at any time and check what the current frequency data of the sound is. If we put that analysis into a simple draw() loop then we can graph out what the sound looks like:

  function draw() {
    // Set up the next frame of the drawing
    webkitRequestAnimationFrame(draw);

    // Create a new array that we can copy the frequency data into
    var freqByteData = new Uint8Array(analyser.frequencyBinCount);
    // Copy the frequency data into our new array
    analyser.getByteFrequencyData(freqByteData);

    // Clear the drawing display
    canvasContext.clearRect(0, 0, canvas.width, canvas.height);

    // For each "bucket" in the frequency data, draw a line corresponding to its magnitude
    for (var i = 0; i < freqByteData.length; i++) {
      canvasContext.fillRect(i, canvas.height - freqByteData[i], 1, canvas.height);
    }
  }

You can see a demo of this running in my experiments section. I've just done a very simple line rendering of the frequency data but your imagination is the limit. Download the source code and see what you can do with it!
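If you want a single number to drive other animation parameters (brightness, scale and so on), one easy trick is to average the frequency bins you just copied out of the analyser. This is a minimal sketch of my own -- the averageLevel helper isn't from the original code:

```javascript
// Compute the mean magnitude (0-255) of one frame of frequency data.
// In the draw() loop above, you'd pass in the freqByteData array that
// analyser.getByteFrequencyData() has just filled.
function averageLevel(freqByteData) {
  var sum = 0;
  for (var i = 0; i < freqByteData.length; i++) {
    sum += freqByteData[i];
  }
  return freqByteData.length ? sum / freqByteData.length : 0;
}

// e.g. scale a shape by the current loudness:
// var scale = 1 + averageLevel(freqByteData) / 255;
```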

Stay tuned for the prettiness that I've actually got planned for this code.

Perfect Summer DJ Mix

Download the Perfect Summer DJ Mix (175MB)

It finally seems like Sydney's entered Summer. To commemorate the good weather and sunshine vibes, I thought I'd upload an old mix. Perfect Summer is designed for putting on as you drive to the beach or relax on the sand with a cool pina colada.

It's probably one of the last mixes that I did on turntables with all original vinyl, back in 2007. You can even hear the crackle of dust under the needle in the opening refrains of Solar Stone's Seven Cities.

Anyway, download Perfect Summer (or stream it) and get out in the sun while it's still here!


  1. Solar Stone - Seven Cities (Solaris Heights Mix)
  2. Kings Of Tomorrow - Finally (Rulers Of The Deep Mix)
  3. Orinoko - Island (Alternative Dub Mix)
  4. Graham & Blades - Funky Summa
  5. Plan B - #2
  6. Nalin & Kane - Beach Ball (DJ Icey's "The Sea" Mix)
  7. Nalin & Kane - Beach Ball (South Beach Vacation Mix)
  8. Orbital - Frenetic (12" Mix)
  9. Punks - Break Me With You
  10. Golan Globus - Blazer (Version 2)
  11. Maurice & Noble - Hoochie Coochie (Arcaic Mix)
  12. Chris Lake - Changes
  13. Underworld - Two Months Off
  14. Black Rock - Tiger

Opening Titles for Web Directions South 2011

See the in-browser opening titles

Another year and another great Web Directions. Of course, for me, Web Directions isn't complete without frantic late-night coding sessions in aid of completing a madly inspired conference-related project.

Last year I created the Web Directions 2010 opening titles in 3 days from start to finish. This year I gave myself a little more time but made it no less ambitious by expanding the production onto 2 screens; thereby requiring it to run on two laptops simultaneously. (And stay in sync!)

With the number of things that could fall over -- browsers crashing, projections being out of sync, people hating it -- my nerves were ringing right up until I hit "play". Luckily it came off without a hitch, and you can see the results in the video of the performance below. (Or if your computer's feeling adventurous you can check it out in your browser.)


SlashGlobe: A 3D CSS experiment

See the neon-spinny-3D-globe experiment

The third dimension isn't my strong suit; I'm fine with pixels but not with voxels. To start combating this I thought I'd have a play around with the 3D CSS transforms that are available in the latest Webkit browsers (Chrome & Safari; sorry Firefox!).

What I set out to achieve was a neon-spinny-globe thing and I pretty much got there. I even managed to throw some Max Headroom aesthetic into the mix.

This experiment comes as-is, with all sorts of browser exclusions (and is best viewed in Chrome), so don't bug me if it doesn't work for you. Otherwise, enjoy the neon spinnyness.

Desktop Wallpaper: Space Frames

I'm a big fan of a black desktop. It's easy on the eyes and icons are highly visible. But you have to have a dash of colour to add personality, so it's a fine balance between style and stark.

A few years ago, when I was heavily into Processing, I wrote a program that dynamically generated spacey-looking spectrum wallpapers, with the aim of offering it to the public so that 6 billion people could each have their own unique desktop wallpaper. Didn't work out. (But I used one of its outputs as my own wallpaper for the last 4 years.)

A bout of recent procrastination made me realise that I hadn't changed my wallpaper in far too long, and this time, instead of breaking open some code, I headed to Illustrator to make some jaunty little geometrics.

Now, I've got a new wallpaper and you can have one too! I'm calling it Space Frames and you can download it by clicking on the image below. Bon appetit!


Visualising drawings in 3D with DrawPad

Last February at Ignite Sydney we thought we'd try something a little different to get the crowd involved.

While most people were downing a few drinks, a bunch of lovely lads and ladesses with iPads were circulating through the audience asking people to draw (with their fingers) what inspires them. On the iPads was a drawing application that recorded the time and position that each stroke of their fingers made and that data was used to create a 3D timelapse visualisation of their drawing on the big screen behind the stage. I've put together a little video of the end result:


It was actually quite exciting to see what people came up with. And equally exciting was seeing the looks on their faces as they interacted with the iPads and then waited with anticipation to see how their drawings would be interpreted by the foreign shapes appearing on the screen.

To pull off this stunt we used all open web technologies: a webpage running my "DrawPad" Canvas application that allowed people to draw and captured the movements of their fingers; storage of the stroke data in JSON on the backend (thanks to Tim Lucas); and visualisation of those strokes in 3D using WebGL, via Mr. Doob's wonderful three.js library.

To get people drawing, I took a look at 37signals' Chalk, but it lacked one important feature -- multi-touch drawing -- so I decided to write my own drawing app.

If you've never done multi-touch event handling it can be a mysterious process (it certainly was to me), but once you get your head around the notion of an event object that contains multiple points of interaction, it's actually quite fun.
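The core of that idea is small enough to sketch. This isn't the DrawPad source -- recordTouches is a hypothetical helper of my own -- but it shows how each entry in a touch event's list of touches carries its own identifier, so two fingers build two independent strokes:

```javascript
// Keep one stroke per finger, keyed by the touch's identifier.
// Each point records where and when the finger was -- exactly the
// data needed to replay the drawing as a 3D timelapse later.
function recordTouches(strokes, touches, timestamp) {
  for (var i = 0; i < touches.length; i++) {
    var touch = touches[i];
    if (!strokes[touch.identifier]) {
      strokes[touch.identifier] = [];
    }
    strokes[touch.identifier].push({
      x: touch.pageX,
      y: touch.pageY,
      t: timestamp
    });
  }
  return strokes;
}

// In the browser you'd call this from a touchmove handler:
// canvas.addEventListener('touchmove', function (e) {
//   e.preventDefault();
//   recordTouches(strokes, e.touches, Date.now());
// });
```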

I've uploaded the source code for the drawing app (and the 3D visualiser) into a DrawPad Github project if you want to take a closer look (or improve it in the countless ways that it could be improved upon). Certainly the 3D visualiser is a result of cramped deadlines. I really would have loved to create the stroke paths as true 3D meshes, but had to settle for a series of spheres that follow the path of the stroke instead. (Kind of like voxels.)

But at the end of the day the technology didn't matter. What mattered was the outcome: an easy-to-use way for people to draw what they wanted, and a pretty-as-a-picture translation of what they drew. The fact that it was done in a browser made no difference at all.

Anatomy of a Mashup: Definitive Daft Punk visualised

See "Definitive Daft Punk" visualised in realtime »

I've always believed in the strong connection between sound and vision. Music videos are like little slices of synchronous art, designed to please all of your senses. (Go ahead, lick your TV next time "Poker Face" comes on!)

Every so often I delve into music making, but aside from the cover art for those releases my music has remained very separate from my visual design work. Now and into the future I plan to rectify this, and first cab off the ranks is a data visualisation I've had in my head for a while.

The art of the mashup has come to the fore in pop culture in recent years, but beyond Biggie Smalls crooning over Elton's keys, I feel that the general public understands little of the nuance that goes into constructing a complex mashup from tiny pieces of songs.

In order to explain the layering and interplay that goes into something like a Girl Talk album or The 139 Mix Tape I decided to take my own mashup of Daft Punk's discography -- Definitive Daft Punk -- and reveal its entire structure: the cutting, layering, levels and equalisation of 23 different songs. By dividing up the sound data for each song and computing its appearance in realtime, the resulting visualisation gives you an understanding of the unique anatomy of this particular mashup.

The entire piece is composed from the latest HTML5 and CSS3 technology (canvas, audio, transforms & transitions) so you'll need a newer browser to view it in. I recommend Chrome because it pulls off the best performance with my mangled code. All of the waveform and spectrum visualisation is performed in realtime, so your browser is rendering a music video on the fly!

Hopefully it gives you a new insight into the artform of the mashup, otherwise you can just stare at the pretty shapes.

Source Code: Not a movie about programming

One of the advantages of having 2 billion people on the Internet is that every so often one of them sends you something for free. In this case, the kind people at Hopscotch sent me a couple of passes to a preview screening of Source Code. (And only briefly mentioned, in a passing, fleeting manner, that I might -- if the right mood struck me -- want to blog about it.)

When I sat down in the cinema last week to watch the movie, I was immediately on the back foot when trying to objectively assess it. For the past 6 months I've pretty much compared every movie I've seen to Inception and found them all wanting. It's certainly one of my favourite movies of the last 5 years. The reason it so affected my objectivity in this case is because Source Code has been labelled as "Inception, but better". Now that's pretty much sacrilege in my books, so I naturally wanted it to suck.

Although the two movies have some similar themes -- alternate realities, immersion in technology, subjective perceptions -- they are quite different films. For me, Inception felt like a brilliant idea that was taken to its furthest extreme, each scene taking you further down the rabbit hole. In Source Code I feel like a brilliant idea has been treated in a shallow manner. It's like Inception was a cult film that had been carefully crafted for mainstream appeal; whereas Source Code felt like a cult film that had been compromised for mainstream appeal.

Other comparisons that sprang to mind as I was watching Source Code included Groundhog Day and Quantum Leap (which got a cheeky nod in the movie via the inclusion of Scott Bakula as a voice actor), and there's definitely elements of Cyberpunk in there -- the digital environments of Neuromancer and Snowcrash spring to mind.

Aside from my bias towards Inception, there's another admission I'll make that will get 50% of you readers offside. It is this: I didn't really like Moon (director Duncan Jones' previous film). The problem I have with Moon is the same one I have with Source Code, namely that it peaks too early. Key plot points are revealed to you halfway through the movie, and from there on out it feels like you're just watching a tired story play out to its inevitable conclusion. Great idea. Poor structure.

My final conclusion on this film (based on my own experience and from observing others) is that if you liked Inception you'll probably like this less, and if you didn't like Inception then you'll probably like this more. Either way, it's still worth seeing, but depending on which camp you sit in you might want to save it for the couch.

Realtime visualisation of @replies

See @replies visualised in realtime

Back on September 15, 2010, it was "R U OK?" Day. This is a national day of action which is designed to raise awareness of suicide by encouraging people to reach out and make contact with others by asking "Are you OK?"

We were commissioned by the R U OK? organisation to create a visualisation that highlighted the connections people were making throughout the day, and after a number of rounds of brainstorming (and budget cuts) we chose to highlight the connections that were being made over Twitter.

Because of fears about the subject matter we weren't allowed to highlight content contained in any tweets, so to gather our data we perform a search for all tweets in Australia that are directed at someone (@replies), geocode both ends of the conversation, record the timestamp of the message, and map this connection in time and space. For each location we also perform frequency analysis on the tweets and provide a tag cloud of the most used words (available if you hover over a city).
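The frequency analysis step is conceptually simple. This is a sketch rather than the code we actually ran -- topWords is a hypothetical helper, and a real version would also strip out @names, URLs and common stop words:

```javascript
// Count word usage across a batch of tweet texts and return the
// top N words, most frequent first. This naive version just
// lowercases and splits on anything that isn't a letter.
function topWords(tweets, n) {
  var counts = {};
  tweets.forEach(function (text) {
    text.toLowerCase().split(/[^a-z']+/).forEach(function (word) {
      if (word) {
        counts[word] = (counts[word] || 0) + 1;
      }
    });
  });
  return Object.keys(counts)
    .sort(function (a, b) { return counts[b] - counts[a]; })
    .slice(0, n);
}
```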

All of this is done in realtime, so you can see actual conversations as they are formed. Beyond the realtime aspect, we also give you two weeks worth of historical data so that you can see the rise and fall of activity throughout the day and across weeks. Each of these hourly periods also has a unique pattern of replies that crisscross the country, showing up when people from other cities strike up conversation with one another.

It's interesting to see the times of day when people are on Twitter the most, and also see the days of the week when activity is high. On the actual R U OK? Day we saw a dramatic (~2x) spike in @reply activity which we could hopefully attribute to the spread of R U OK? Day.

The Achilles heel of this visualisation, however, is the amount of processing that it has to do. Firstly, it fetches a search from the Twitter API several times a minute (you have to be careful to stay within the rate limits), then for each tweet it has to check whether the sender and receiver have location data, and if they do, geocode both locations.

Geocoding is an expensive operation, and because I'm geocoding roughly 200 points a minute, we quickly fall afoul of hourly geocoding rate limits. To counteract this, I have it set up to try geocoding via Google Maps, and then fail over to Yahoo! once we hit the limit. Somewhere along the way I implemented caching of geocoding results, and we now have a pretty handy database of geocoded strings.
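The failover-plus-cache logic looks something like this. (A sketch only: it's synchronous for clarity where the real thing is asynchronous, makeGeocoder is a name I've made up, and the provider functions stand in for the real Google Maps and Yahoo! calls.)

```javascript
// Build a geocoder that tries a list of provider functions in order,
// falling back to the next one when a provider fails (e.g. because
// it's hit its hourly rate limit), and caches successful results so
// the same place name is never looked up twice.
function makeGeocoder(providers) {
  var cache = {};
  return function geocode(place) {
    if (cache[place]) {
      return cache[place];
    }
    for (var i = 0; i < providers.length; i++) {
      try {
        var result = providers[i](place);
        if (result) {
          cache[place] = result;
          return result;
        }
      } catch (e) {
        // Rate limited or errored; fall through to the next provider
      }
    }
    return null; // every provider failed
  };
}
```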

The price for daisy chaining all these APIs is fragility. It's a lot of work to keep this system up, and as a result I'll be taking the whole thing offline in about a week. (Also, it's costing me a fair bit in App Engine hosting charges.)

Still, if you want to check it out, you can for now. Thereafter, I'll replace it with a video of what it once was.

Update 2011-04-19: I've replaced it with a video of what it once was.