[Back to RoadMusic|http://roadmusic.fr]
[http://nujus.net/~petesinc/roadmusic_autosync/media/wallpaper.jpg]

!!!How it works
RoadMusic uses Pure Data (Pd-extended) with the GridFlow video-processing library, running on Ubuntu 10.04.
Several sonification strategies are combined to create a rich musical proposition. No recorded sound is used anywhere in the process.

!!!The Data is the Waveform.
[http://nujus.net/~petesinc/roadmusic_autosync/media/pics-RM-0.3/onde-200.jpg]
Data from the accelerometers is written into lookup tables, then read back as audio. So while melody is generated algorithmically, the sound varies in timbre in response to the road surface and to larger movements of the car. Although this audification operates at a microscopic level, as barely perceptible variations, it means that the sound, and therefore the listening experience, is never quite the same.
These wavetable oscillators are implemented as fifteen instruments producing a wide variety of sounds.
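As a rough illustration, here is a minimal Python sketch of this audification (the actual patch is built in Pure Data; the sample rate, table size, and all names below are assumptions, not RoadMusic's):
{{{
import numpy as np

SR = 44100          # audio sample rate (assumed)
TABLE_SIZE = 512    # lookup-table length (assumed)

def fill_table(accel_samples):
    """Write a window of accelerometer data into a wavetable.

    The raw data is resampled to the table length and normalised,
    so the road surface literally becomes the oscillator's waveform."""
    idx = np.linspace(0, len(accel_samples) - 1, TABLE_SIZE)
    table = np.interp(idx, np.arange(len(accel_samples)), accel_samples)
    table -= table.mean()                       # remove DC offset
    peak = np.abs(table).max()
    return table / peak if peak > 0 else table

def oscillator(table, freq, dur):
    """Read the wavetable at freq Hz for dur seconds."""
    n = int(SR * dur)
    phase = (np.arange(n) * freq * TABLE_SIZE / SR) % TABLE_SIZE
    return table[phase.astype(int)]             # nearest-neighbour lookup

# A bumpier road gives a noisier table, hence a rougher timbre:
accel = np.cumsum(np.random.randn(1000) * 0.1)  # stand-in accelerometer stream
note = oscillator(fill_table(accel), freq=220.0, dur=0.5)
}}}
Because the table is continually refilled from live sensor data, the same melodic line is replayed with a timbre that follows the road.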

!!!Data from the accelerometers is “cooked”:
[http://nujus.net/~petesinc/roadmusic_autosync/media/pics-RM-0.3/flux-200.jpg] [http://nujus.net/~petesinc/roadmusic_autosync/media/pics-RM-0.3/events-100.jpg]
#Each data stream is scaled and can be used as a controller for any parameter of any instrument. For example, acceleration (g-force) can be mapped to amplitude, pitch, delay speed, tempo…
#Events are detected within these streams: a bump, an acceleration... These events are used to trigger sounds, to improvise on melodies, etc.
#Events are counted to calculate statistics (bumpiness, bendiness, etc.): streams representing variation on a macroscopic scale.
#Threshold values applied to these statistics produce a new set of events used to orchestrate the ensemble, switching different instruments on and off according to abstracted characteristics of the road (see the sketch after this list).
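The four stages can be sketched in a few lines of Python (the real processing happens in Pd; the thresholds, window size, and names here are invented for illustration):
{{{
from collections import deque

def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Stage 1: map a raw sensor value onto a parameter range."""
    x = min(max(x, in_lo), in_hi)
    return out_lo + (x - in_lo) / (in_hi - in_lo) * (out_hi - out_lo)

class BumpDetector:
    """Stage 2: emit an event when the sample-to-sample change
    exceeds a threshold."""
    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.prev = 0.0

    def __call__(self, x):
        bump = abs(x - self.prev) > self.threshold
        self.prev = x
        return bump

class Bumpiness:
    """Stage 3: count events over a sliding window, giving a slow
    'macroscopic' stream."""
    def __init__(self, window=256):
        self.events = deque(maxlen=window)

    def update(self, event):
        self.events.append(1 if event else 0)
        return sum(self.events) / len(self.events)

# Stage 4: a threshold on the statistic orchestrates the ensemble.
detector, stat = BumpDetector(), Bumpiness()
for sample in [0.0, 0.1, 1.5, 0.2, 0.1, 1.4, 0.0]:    # stand-in stream
    amplitude = scale(sample, 0.0, 2.0, 0.0, 1.0)     # stage 1 mapping
    bumpiness = stat.update(detector(sample))         # stages 2 + 3
    drums_on = bumpiness > 0.25                       # stage 4 switch
}}}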
!!!The landscape is analyzed in two ways:
[http://nujus.net/~petesinc/roadmusic_autosync/media/pics-RM-0.3/blob2.png]
#Blob-tracking captures large moving objects, mostly cars in the opposite lane, extracting their x, y, z coordinates. Psycho-acoustic cues (panning, Doppler shift) create the impression that a sound follows an object outside the car.
#Average RGB levels are extracted from the image and used as controllers. An event is detected when the dominant color changes (see the sketch after this list).
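A minimal Python sketch of both analyses, operating on a frame as a plain RGB array (GridFlow's actual operators are not modelled here; the brightness threshold and all names are assumptions):
{{{
import numpy as np

def average_rgb(frame):
    """Mean R, G, B levels, each usable as a controller signal."""
    return frame.reshape(-1, 3).mean(axis=0)

class DominantColor:
    """Event when the dominant channel (R, G or B) changes."""
    def __init__(self):
        self.last = None

    def __call__(self, frame):
        dominant = int(np.argmax(average_rgb(frame)))
        changed = self.last is not None and dominant != self.last
        self.last = dominant
        return changed

def blob_pan(frame):
    """Crude blob tracking: centroid of bright pixels, mapped to a
    stereo pan position in [-1, 1] (psycho-acoustic cue)."""
    mask = frame.mean(axis=2) > 0.5 * frame.max()   # assumed threshold
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return 0.0                       # nothing tracked: centre
    cx = xs.mean() / frame.shape[1]      # normalised x of the blob
    return 2.0 * cx - 1.0                # left edge -> -1, right -> +1

frame = np.random.rand(120, 160, 3)      # stand-in for a camera frame
pan = blob_pan(frame)
left, right = (1 - pan) / 2, (1 + pan) / 2   # simple stereo gains
}}}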
!!!Composing
[http://nujus.net/~petesinc/roadmusic_autosync/media/pics-RM-0.3/router-400.jpg]

Data-to-instrument-parameter routing is defined via a matrix. Different versions can be saved and switched while driving.
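One possible reading of such a matrix, sketched in Python (the stream, parameter, and preset names are invented for illustration; the real routing lives in the Pd patch):
{{{
# Rows are data streams, columns are instrument parameters, and a
# dict of saved matrices gives presets that can be switched live.
streams = ["accel_x", "accel_y", "bumpiness", "red_level"]
params  = ["osc1.pitch", "osc1.amp", "delay.time", "tempo"]

presets = {
    "motorway": {("accel_x", "osc1.pitch"): 1.0,
                 ("bumpiness", "tempo"): 0.5},
    "village":  {("red_level", "osc1.amp"): 1.0,
                 ("accel_y", "delay.time"): 0.8},
}
active = presets["motorway"]             # switchable while driving

def route(values):
    """Apply the active matrix: each parameter receives the weighted
    sum of the streams patched into it."""
    out = {p: 0.0 for p in params}
    for (stream, param), weight in active.items():
        out[param] += weight * values[stream]
    return out

controls = route({"accel_x": 0.3, "accel_y": 0.1,
                  "bumpiness": 0.6, "red_level": 0.4})
}}}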

[Back to RoadMusic|http://roadmusic.fr]