Lately I have been loving a few truly innovative audio apps for the iPhone, none having to do with it being an iPod.
I had always thought of mobile audio creation software as a frivolous party trick. Hey, look at me, I can play Baby Got Back on my 3″-wide keyboard! But that’s changed.
A while back I wrote about an idea for including audio processing code in the header of MP3 files. The premise was that, in addition to creating a music track, the artist would provide parameters for real-time playback modification based on user input, randomness, or anything else. The song would never (or at least wouldn’t ever have to) be the same.
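To make the idea concrete, here’s a toy sketch in Python: a hypothetical parameter block, the kind of thing that could ride along in an ID3 comment frame, drives a tremolo over the decoded audio. Every field name and value is invented for illustration; no such MP3 extension actually exists.

```python
import json
import math

# Hypothetical parameter block an artist might embed in an MP3 header
# (say, in an ID3 comment frame). The schema is invented for this sketch.
params = json.loads("""{
    "effect": "tremolo",
    "rate_hz": 4.0,
    "depth": 0.5
}""")

def render(samples, params, sample_rate=44100):
    """Apply the embedded tremolo parameters to decoded samples."""
    rate, depth = params["rate_hz"], params["depth"]
    out = []
    for n, s in enumerate(samples):
        # A low-frequency oscillator modulates the gain per sample,
        # so the same track plays back differently for each parameter set.
        lfo = 1.0 - depth * (0.5 + 0.5 * math.sin(2 * math.pi * rate * n / sample_rate))
        out.append(s * lfo)
    return out

# A one-second 440 Hz test tone stands in for the decoded track.
tone = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
remixed = render(tone, params)
```

Swap in user input or randomness for the fixed `depth` and the song would, as promised, never have to be the same twice.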
The team at RJDJ has taken this idea to the extreme. The free and paid RJDJ apps in the iTunes Store both provide “scenes”, akin to music tracks, complete with artwork. These scenes are nothing but audio processing algorithms.
All input happens via the lavalier-style microphone on the iPhone earbuds. Basically the scenes take the ambient noise surrounding you and remix it. Some of the scenes do this subtly, some are more musical, but all of them make you the focal point of the remix: not so much a musician as a conductor. I’ve listened to the noise of the L train, the sounds of walking down the street, and the cacophony of three kids at dinner time. It is completely entrancing. Location-based remixing.
So, to our list of traditional musical interfaces — stick hitting animal skin, horse hair pulled across wire — we add one’s physical movement through life’s soundscape.
Here’s a more musical scene based on my eastward walk through the city a few days ago*. Note especially the interpolation, at 1:16, of my near miss with a cab crossing Michigan Ave. (red marker on map). The horn makes the piece, in my opinion, but the beauty of this particular scene is how the bleeps and bloops are modulated by the ambient street noise.
Of course this map isn’t connected in any way to playback control, but with the iPhone’s GPS it seems like an obvious evolution of the RJDJ app. The possibilities are many. How about a View in Google Maps button in iTunes? Or a site that aggregates user-created tracks and plots them over one another on a map, a personal-social musical-spatial mashup. Dan Hill’s city of sound, indeed.
There are some other apps of note too.
Bloom is a generative music app from none other than Brian Eno, working with Peter Chilvers. You initiate notes of music by touching the screen. Each note plays and interacts with other notes in expanding concentric circles, like dropping pebbles in a pond. As with scenes in RJDJ, the parameters of note interaction are constrained by “moods”. These are the algorithms that govern the evolution of the sounds you start off. Spore for music. (Not a coincidence that Eno did the music for Spore, of course.)
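For flavor, here’s a toy Python sketch in the spirit of that: each “touch” seeds a note that repeats on a cycle and fades as it expands, with the mood reduced to a scale constraint. This is not Bloom’s actual algorithm, just an invented illustration of that kind of generative rule set.

```python
# Toy generative loop: every seeded note re-sounds on a fixed cycle and
# decays, so early notes keep ringing under later ones. The "mood" here
# is simply a pentatonic scale; all values are invented for illustration.
MOOD_SCALE = [0, 2, 4, 7, 9]  # scale degrees, in semitones above the root

def schedule(touches, cycle=8, steps=32, decay=0.8):
    """touches: list of (start_step, degree) pairs from screen taps.
    Returns, per time step, a list of (semitone, amplitude) events
    as each note repeats and fades away."""
    timeline = [[] for _ in range(steps)]
    for start, degree in touches:
        amp = 1.0
        for t in range(start, steps, cycle):
            timeline[t].append((MOOD_SCALE[degree % len(MOOD_SCALE)], round(amp, 3)))
            amp *= decay  # each repetition is quieter, like a fading ripple
    return timeline
```

Feed the timeline to a synth and you get the pebbles-in-a-pond effect: a few taps, then an evolving piece you only partly authored.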
Ocarina is one of those apps that makes you love the creators for thinking of it. Basically Ocarina turns your iPhone into a high-tech flute. OK, you say, I can see touching the screen like you cover the holes of a woodwind, but where do you blow? Why, the microphone of course! They’ve turned the lack of a wind guard on the iPhone mic into a feature! Light exhalation registers quietly on the mic and produces a softer version of the current note combination; blow harder and the note swells. It’s brilliant, really.
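The breath mapping is easy to imagine in code. A minimal sketch, assuming the app reads buffers of mic samples and maps their RMS level onto note loudness; the thresholds below are made up for illustration, not Smule’s actual values.

```python
import math

def rms(frame):
    """Root-mean-square level of one buffer of mic samples."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def breath_to_gain(frame, noise_floor=0.01, full_breath=0.3):
    """Map breath noise on the mic to a note gain in [0, 1].
    Below the noise floor the note is silent; at full_breath and
    above it plays at full intensity. Thresholds are hypothetical;
    a real app would calibrate them to the device's microphone."""
    level = rms(frame)
    gain = (level - noise_floor) / (full_breath - noise_floor)
    return min(1.0, max(0.0, gain))
```

The same no-wind-guard “flaw” that ruins phone calls on a breezy day becomes the expressive control.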
* There’s no easy way to export audio from RJDJ, but this handy tool allows you to parse the backup file that the iPhone generates on your machine. You can pluck out the .wav files right from the RJDJ folder.
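If you’d rather script that last step, a few lines of Python will sweep the extracted backup for audio. The path below is a placeholder for wherever your extraction tool puts things; adjust it for your machine.

```python
import os

# Placeholder path to the extracted iPhone backup -- adjust as needed.
BACKUP_DIR = os.path.expanduser("~/rjdj_backup")

def find_wavs(root):
    """Collect every .wav file under the extracted backup folder."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".wav"):
                hits.append(os.path.join(dirpath, name))
    return hits
```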