These images are the test renders for the second of my forthcoming works, entitled In Perpetuity: The Linden Trees. The composition is the second in a two-part series exploring the concept of time and scale in both artistic motivation and compositional practice. As with the previous work, the visuals are generated through gradual evolutions in particle behaviour derived from UK population data for the last century. Particle structures are generated through both flocking and orbital simulation, the parameters of which are directly controlled by the input data. The audio component of the work will incorporate granular material generated through a range of mapped and unmapped processes to facilitate the formation and dissolution of audiovisual congruence.
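As a rough illustration of this data-driven approach (and not the composition's actual code), the Processing-style sketch below steps through a placeholder population series and maps each normalised value onto hypothetical flocking and orbital parameters; all figures and parameter names are illustrative.

float[] population = { 38.2, 40.1, 44.0, 46.0, 50.2, 52.4, 55.6, 56.3, 58.9, 62.3 }; // placeholder values, millions
int index = 0;

void setup() {
  size(640, 360);
  frameRate(1);                               // step through one data point per second
  noFill();
  stroke(255);
}

void draw() {
  background(0);
  float norm = map(population[index], 38.2, 62.3, 0, 1);
  float orbitRadius = lerp(40, 150, norm);    // wider orbit as the value rises
  float cohesion = lerp(0.2, 1.0, norm);      // would weight a flocking rule in a fuller system
  ellipse(width/2, height/2, orbitRadius * 2, orbitRadius * 2);
  index = (index + 1) % population.length;
}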


Image: The Early Lilacs

April 4, 2011


These images form the test renders for my forthcoming work, entitled In Perpetuity: The Early Lilacs. The composition explores the concept of time and scale in both artistic motivation and compositional practice. The visuals for the work are generated from gradual evolutions in particle behaviour derived from UK population data for the last century. As with previous compositions, the audio component will incorporate granular material generated through a range of mapped and unmapped processes to permit the formation and dissolution of perceptual congruence between medium morphologies.


Code: PMLib 0.2

March 10, 2011


The PMLib library facilitates the creation of complex parameter mapping systems within Processing.

Version 0.2 introduces three new classes: PMLEvent, PMLSequencer and PMLCurve. The PMLEvent object allows the storage of an array of float values alongside a tag and frame identifiers. The PMLSequencer object permits the storage and playback of PMLEvent objects on a per-frame timeline. Event sequences may be loaded from and saved to text files and the PMLSequencer object offers numerous utility functions for the combination and modification of stored sequences. Value ramping has been introduced across the library, facilitated by the PMLCurve object.
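To illustrate the event and ramping concepts without reproducing PMLib's actual API, the stand-alone Processing sketch below stores a few frame-indexed events and ramps a single value between them over a fixed number of frames; all names and values are stand-ins, not library calls.

import java.util.HashMap;

// frame number -> event values (a stand-in for tagged, frame-stamped events)
HashMap<Integer, float[]> sequence = new HashMap<Integer, float[]>();
float startValue = 0, target = 0, current = 0;
int rampStart = 0, rampLength = 60;            // ramp over 60 frames

void setup() {
  size(400, 400);
  noStroke();
  sequence.put(1,   new float[] { 0.1 });
  sequence.put(120, new float[] { 0.9 });
  sequence.put(240, new float[] { 0.4 });
}

void draw() {
  background(0);
  float[] event = sequence.get(frameCount);    // playback: is an event due this frame?
  if (event != null) {
    startValue = current;
    target = event[0];
    rampStart = frameCount;
  }
  // Curve-style ramping from the previous value towards the new target.
  float t = constrain((frameCount - rampStart) / (float) rampLength, 0, 1);
  current = lerp(startValue, target, t);
  fill(255);
  ellipse(width/2, height/2, current * width, current * width);
}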

Download the library and documentation here.

Read on for a brief tutorial.

Image: Oskar Fischinger – Road Map, 1961


Video: Mapping Studies

March 8, 2011

Completed in November 2010, the ‘Mapping Study’ series of videos documents my examination of parameter mapping strategies for audiovisual art.

The series consists of five works separated into four studies, each examining a discrete methodology for data-driven media generation facilitated by parameter mapping. Each composition uses identical input data, with the mapping strategy, visual rendering system and audio synthesis process varying between studies. All of the visuals were generated using a custom Java particle library within Processing, while the audio was synthesised using an implementation of Nathan Wolek’s excellent granular toolkit externals within Max/MSP.
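As a simplified illustration of the approach (not the studies' own code), the Processing sketch below pushes a single shared input signal through two alternative mappings, driving particle size in one case and particle count in the other.

void setup() {
  size(640, 360);
  noStroke();
  fill(255);
}

void draw() {
  background(0);
  float input = noise(frameCount * 0.01);      // identical input data for both mappings

  // Mapping A: the input drives particle size.
  float sizeA = map(input, 0, 1, 5, 80);
  ellipse(width * 0.3, height/2, sizeA, sizeA);

  // Mapping B: the same input drives particle count instead.
  int countB = int(map(input, 0, 1, 1, 40));
  for (int i = 0; i < countB; i++) {
    ellipse(width * 0.7 + random(-40, 40), height/2 + random(-40, 40), 5, 5);
  }
}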


Code: PMLib

June 24, 2010

The PMLib library facilitates the creation of complex parameter mapping systems within Processing.

The library includes four classes: PMLMatrix, PMLRange, PMLFloat and PMLRoute. The PMLMatrix object provides the core parameter mapping functionality, with full input and output scaling and flexible tag-based channel mapping. The PMLFloat and PMLRange objects are utilities: the former stores a tag alongside a float value, and the latter defines a value range. The PMLRoute object allows multiple PMLFloat objects to be collated based on their tag value.
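As a conceptual illustration only, the Processing sketch below implements a toy tag-based map with input and output scaling in the spirit of the description above; the TagMap class and its methods are stand-ins, not PMLib's actual API.

import java.util.HashMap;

class TagMap {
  HashMap<String, float[]> routes = new HashMap<String, float[]>(); // tag -> {inLo, inHi, outLo, outHi}

  void route(String tag, float inLo, float inHi, float outLo, float outHi) {
    routes.put(tag, new float[] { inLo, inHi, outLo, outHi });
  }

  float value(String tag, float input) {
    float[] r = routes.get(tag);
    return map(input, r[0], r[1], r[2], r[3]);  // scale the input range onto the output range
  }
}

TagMap mapping = new TagMap();

void setup() {
  size(400, 400);
  noStroke();
  mapping.route("radius", 0, width,  10, 200);  // tagged channels, each with its own ranges
  mapping.route("shade",  0, height, 0,  255);
}

void draw() {
  background(0);
  fill(mapping.value("shade", mouseY));
  float r = mapping.value("radius", mouseX);
  ellipse(width/2, height/2, r, r);
}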

Download the library and documentation here.

Read on for a brief tutorial.


Video: Dione

May 15, 2010

Completed in March 2010, Dione is the final part of a trilogy exploring parameter mapping techniques between visual particles and audio grains. The composition process for Dione extended my exploration of behavioural simulation as a compositional tool by incorporating both orbital and flocking behaviours. Like its predecessors, the visuals for Dione were generated by assigning force magnitudes to each element within the behavioural simulation and using these forces to modify the velocities of a set of dynamically created particles. The soundtrack for Dione was created through the layering of independently generated granular textures based on perceptual correspondences between aural and visual densities. The decision to generate content for each medium separately was motivated by a desire to evaluate the importance of low-level parameter mapping within the composition process.
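A stripped-down illustration of this force-to-velocity idea (not Dione's actual simulation) might look like the Processing sketch below, in which two fixed elements with assigned force magnitudes, one attractive and one repulsive, nudge a growing set of particles.

import java.util.ArrayList;

ArrayList<PVector> positions = new ArrayList<PVector>();
ArrayList<PVector> velocities = new ArrayList<PVector>();
PVector[] elements = { new PVector(160, 180), new PVector(480, 180) };
float[] magnitudes = { 0.6, -0.3 };              // positive attracts, negative repels

void setup() {
  size(640, 360);
  stroke(255);
}

void draw() {
  background(0);
  positions.add(new PVector(random(width), random(height)));   // dynamic particle creation
  velocities.add(new PVector(0, 0));
  if (positions.size() > 500) { positions.remove(0); velocities.remove(0); }

  for (int i = 0; i < positions.size(); i++) {
    PVector p = positions.get(i);
    PVector v = velocities.get(i);
    for (int e = 0; e < elements.length; e++) {
      PVector force = PVector.sub(elements[e], p);
      float d = max(force.mag(), 20);
      force.normalize();
      force.mult(magnitudes[e] * 50 / d);        // force falls off with distance
      v.add(force);
    }
    p.add(v);
    point(p.x, p.y);
  }
}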

Although the behavioural simulation used within Dione is greatly enhanced from that within Rhea, the two pieces remain aesthetically similar. The inclusion of Brownian motion within the particle decay phase softens the trajectories and lends the visuals a ‘noisier’ appearance. Similarly, although the soundtrack used similar samples to its predecessor, a more spectrally dense composition was sought; this was achieved through the use of noise-based samples and very dense grain clouds. It would appear, however, that by discarding parameter mapping within the creation of the audio textures, the individual meso-structures of each component were weakened, and the overall macro-structure of the composition seems less dynamic as a result.

Dione may be viewed on Vimeo here.

Video: Rhea

March 17, 2010

Completed in February 2010, Rhea is the second part of a trilogy exploring parameter mapping between visual particle systems and granular synthesis. Like its forebear Io, the visuals in Rhea are generated by exerting a range of attractive and repulsive forces upon a set of dynamically created particles. The resultant trajectories are then rendered to create abstract, evolving imagery. The soundtrack was generated through strict sound-object and meso-level mappings between visual properties and granular synthesis parameters.

With the creation of Rhea, I was keen to develop the fundamental concepts explored in Io whilst improving the dynamism of the visuals and reducing the predictability that was so inherent to that piece. Having identified the core simulation as the primary weakness, I developed a simulation of orbital behaviour, to which I attached multiple emitters. Similarly, attractive and repulsive forces were attached to different orbital components, allowing the particle trajectories to be dynamically modified. The resultant system is far more generative than its predecessor; user interaction is limited to controlling force magnitudes and particle behaviour.
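The Processing sketch below gives a rough, purely illustrative sense of this orbital idea, with a single attractive force attached to a component that circles the canvas centre; it is not Rhea's actual code, and all values are arbitrary.

import java.util.ArrayList;

ArrayList<PVector> positions = new ArrayList<PVector>();
ArrayList<PVector> velocities = new ArrayList<PVector>();

void setup() {
  size(640, 360);
  stroke(255);
}

void draw() {
  background(0);
  // Orbital component: the attractor circles the centre of the canvas.
  float angle = frameCount * 0.02;
  PVector attractor = new PVector(width/2 + cos(angle) * 120,
                                  height/2 + sin(angle) * 120);

  positions.add(new PVector(random(width), random(height)));   // emitter: one new particle per frame
  velocities.add(new PVector(0, 0));
  if (positions.size() > 400) { positions.remove(0); velocities.remove(0); }

  for (int i = 0; i < positions.size(); i++) {
    PVector p = positions.get(i);
    PVector v = velocities.get(i);
    PVector force = PVector.sub(attractor, p);
    force.normalize();
    force.mult(0.3);                 // attractive; negate for a repulsive component
    v.add(force);
    v.mult(0.97);                    // damping keeps the trajectories readable
    p.add(v);
    point(p.x, p.y);
  }
}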

With a more generative simulation at its core, both the audio and visual components within Rhea appear far more dynamic and interesting than those within Io, and the piece seems more successful as a result. Indeed, from a personal perspective, Rhea is the piece that I was always intending to create; I just needed to make some mistakes first!

Rhea may be viewed on Vimeo here.
