A recitative summer: part one

And so, dear readers, spring has sprung. And, as surely as eggs follow eggs, a young man's thoughts turn towards his MMus recital. Having watched the RWCMD undergraduates do their inspirational thing last week, it's clear that I'm going to have to pull something not inconsiderable out of the bag this September if I'm going to leave college with my head held anywhere much above my knocking knees.

So. As I spend the summer developing my recital ideas, I have decided to blog about what I, as an artist of international renown, like to call my 'creative process'. This will serve two purposes: firstly, to educate the masses as to what happens inside the minds of us enlightened creative types. Secondly - and perhaps most importantly - if I do this then hopefully I'll remember all of the decisions I make over the next couple of months, and so I'll have something to talk about in my viva.

All of which is a ridiculously long preamble to announce the triumvirate of projects that I am planning to put on for my MMus final recital. The first of these is a classic: an electroacoustic piece for three flutes and three speakers, which will probably be about four minutes long. The second is an audio-visual piece for four screens and four speakers, which will run for around 10-15 minutes. More on those another time, however. My third, and final, idea concerns audience participation and interactivity. I'm hoping to create some kind of multimedia shebang whereby audience members can manipulate, and even part-compose, a piece of sonic art that fills the space they're moving around in.

At this early stage, I'm planning to use PureData (what else?) for the musical aspect of the piece, setting off pre-composed cells of sound in a semi-generative manner. I'm also thinking of creating some relatively abstract visuals using Unity3D, in order to provide some visual feedback on what's going on sonically.
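
In the interests of having something concrete to wave at my examiners, here's roughly the sort of glue I'm imagining between Unity and Pd. A minimal sketch only: it assumes a Pd patch listening on UDP with a [netreceive 3000 1] object and routing plain-text messages like 'cell 2;', and the port, the message format and the four-cell count are all placeholders I've invented for illustration.

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// A minimal sketch of the "semi-generative cell" idea: every few seconds,
// pick one of several pre-composed cells (weighted away from ones we've
// heard recently) and tell Pd to set it off. Assumes a Pd patch listening
// with [netreceive 3000 1] (UDP) and a [route cell] downstream of it.
// Port, message format and cell count are all placeholders, not a spec.
public class CellTrigger : MonoBehaviour
{
    UdpClient udp;
    float[] weights;            // one weight per pre-composed cell
    float nextTriggerTime;

    void Start()
    {
        udp = new UdpClient();
        udp.Connect("127.0.0.1", 3000);   // Pd running on the same machine
        weights = new float[] { 1f, 1f, 1f, 1f };
        nextTriggerTime = Time.time + 2f;
    }

    void Update()
    {
        if (Time.time < nextTriggerTime) return;

        int cell = WeightedPick();
        // FUDI is just text terminated by a semicolon, so this is enough
        // for Pd's netreceive to pass "cell <n>" on to a [route cell].
        byte[] msg = Encoding.ASCII.GetBytes("cell " + cell + ";\n");
        udp.Send(msg, msg.Length);

        // Damp the chosen cell's weight and let the others recover, so the
        // piece wanders through the material rather than looping one idea.
        weights[cell] *= 0.25f;
        for (int i = 0; i < weights.Length; i++) weights[i] += 0.2f;

        nextTriggerTime = Time.time + Random.Range(2f, 6f);
    }

    int WeightedPick()
    {
        float total = 0f;
        foreach (float w in weights) total += w;
        float r = Random.Range(0f, total);
        for (int i = 0; i < weights.Length; i++)
        {
            if (r < weights[i]) return i;
            r -= weights[i];
        }
        return weights.Length - 1;
    }
}
```

The weight-damping is the 'semi-generative' bit: the machine does the choosing, but it's biased away from whatever it chose most recently, so the piece drifts through the cells rather than fixating on one of them.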

I got on board the train of interactivity thanks, in part, to Kilshaw's short course on Xcode for the iPhone, for which I got this far, but no further:

[Embedded media: the iPhone toy]

To be honest, I'm not a massive fan of the iPhone, and have no real desire to develop anything for it, so I doubt I'll ever make anything beyond the little toy you see above. However, it got me thinking: if I could achieve similar results in some engine that ran on an actual-factual, real-life computer, then I'd be getting somewhere. All of a sudden, a world full of different control and diffusion options would be open to me. So I downloaded the free version of Unity3D, messed about with it, and in a couple of afternoons managed to knock up the test that you see below:

[Embedded media: Unity3D planets test]

I know, I know, it's nothing special. But imagine if, instead of a single MIDI note, each planet was responsible for setting off a different compositional cell. And imagine that the planets were distinguishable from each other. And that the graphics changed every time you did something. And that you had a Wiimote in your hands, and that it was all projected onto the pristine white wall of an art gallery. And that you could use the Wiimote to control different sonic effects and variables. And other cool stuff that I haven't come up with yet...
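
For what it's worth, the Unity end of that fantasy needn't be much scarier than the following sketch, which reuses the invented 'cell <n>;' message format from above. One caveat: OnMouseDown only fires if the planet has a collider attached, and 'click' will eventually need to become 'point the Wiimote', so none of this is final.

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// One of these on each planet, with a different cellNumber set in the
// Inspector. Clicking the planet (via Unity's built-in OnMouseDown, which
// requires a collider on the object) fires that planet's cell in Pd, using
// the same throwaway "cell <n>;" message format as the sketch above.
public class PlanetCell : MonoBehaviour
{
    public int cellNumber = 0;          // which pre-composed cell this planet owns
    static UdpClient udp;               // one shared connection to Pd

    void Start()
    {
        if (udp == null)
        {
            udp = new UdpClient();
            udp.Connect("127.0.0.1", 3000);
        }
    }

    void OnMouseDown()
    {
        byte[] msg = Encoding.ASCII.GetBytes("cell " + cellNumber + ";\n");
        udp.Send(msg, msg.Length);

        // Crude visual feedback so the planets feel distinguishable:
        // nudge this planet's colour every time its cell is set off.
        GetComponent<Renderer>().material.color =
            new Color(Random.value, Random.value, Random.value);
    }
}
```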

This is the road I'm going to go down, anyway. The first step will be to get the Wiimote talking to Unity. I'm going to start by following Jeff Winder's helpful YouTube tutorials on how to do this, which can be found here. As I'm doing this, I'll think more about my concept, and about the grand installation that I hope these nascent ideas will metamorphose into by the end of the summer.
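
Until I've actually worked through those tutorials, I don't want to guess at any particular Wiimote library's API. So, as a stopgap, here's the shape of thing I have in mind: some external bridge reads the controller and forwards its values to Unity as plain text over UDP. The listening port and the 'roll 0.42' message format are inventions for the sketch, nothing more.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Placeholder for the Wiimote step: rather than guess at a specific
// Wiimote library's API, this just listens for plain-text messages like
// "roll 0.42" on a UDP port, assuming some external bridge reads the
// controller and forwards its values. Port and format are made up.
public class WiimoteReceiver : MonoBehaviour
{
    UdpClient udp;
    IPEndPoint from;
    public float roll;                  // latest value, for other scripts to read

    void Start()
    {
        udp = new UdpClient(4000);      // made-up listening port
        from = new IPEndPoint(IPAddress.Any, 0);
    }

    void Update()
    {
        // Drain anything that's arrived since last frame (non-blocking,
        // because we only call Receive when data is already waiting).
        while (udp.Available > 0)
        {
            string text = Encoding.ASCII.GetString(udp.Receive(ref from));
            string[] parts = text.Trim().Split(' ');
            if (parts.Length == 2 && parts[0] == "roll")
                float.TryParse(parts[1], out roll);
        }

        // For now, just tilt the camera as proof of life; later this value
        // could drive sonic effects and variables in Pd instead.
        Camera.main.transform.rotation = Quaternion.Euler(0f, 0f, roll * 45f);
    }
}
```

If the tutorials offer a proper in-engine binding then this receiver gets binned; the point is just that once a value is arriving every frame, mapping it onto the visuals - or onto sonic variables in Pd - becomes straightforward.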

Anyway, that's idea one. Stay tuned for more unfathomable waffle, both on this and on my other two pieces! Stay safe, everyone x

Composing your future


Thank you, Royal Welsh College of Music and Drama Prospectus 2010!