Cluster – Mira
Cluster is an immersive, site-specific audiovisual installation that explores the relationships between space, time and perception. The geometric schematization of the architecture with light instruments transforms the space into a container for an abstract language of light and sound.
The audiovisual discourse, generated in real time by software created by Playmodes, explores the possibilities of formal clusters of oscillators applied to the control of light and sound, pushing atonality to the limits of synesthetic perception.
The installation we propose for the MIRA 2016 festival in Barcelona is a reinterpretation of the architecture itself. The columns room, a remnant of Fabra & Coats' industrial past, led us to conceive a piece of light and sound in which the space itself dictates both the layout of the elements and the algorithmic content.
With 100 1.5 m tubes of RGB LEDs, we propose to outline the architecture of the columns room, highlighting its vertical lines.
On the sound side, we created a sonification engine that transforms the behavior of the light into sound.
As in most of our projects, the relationship between light and sound, the visual and the auditory, is taken to such an extreme that the elemental algorithm is exactly the same for the audio and light generative systems, resulting in an absorbing synesthetic experience for the observer.
The audiovisual discourse is generated in real time by a software application created specifically for the installation. This software, based on the modeling of transverse waves, generates a low-frequency oscillator (LFO) for each LED. The luminance of each LED is in turn analyzed by the audio engine and transformed into a sound vibration: Cluster's sound is a direct transposition of light into the audible spectrum. By altering the frequencies, amplitudes, quantization and the multiple relations between the thousands of oscillators, complex waves emerge that shape the different sequences of the audiovisual discourse.
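The core idea can be sketched in a few lines of C++; this is an illustration only, not the installation's actual software, and the LED count, LFO frequency, phase step and quantization values are all assumed:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Minimal sketch (assumed parameters, not Playmodes' code): one low-frequency
// oscillator per LED, phase-shifted along the cluster so a transverse wave
// travels across the tubes; quantization snaps brightness to discrete steps.
int main() {
    const int    numLeds   = 100;   // simplified: one LFO per tube
    const double freqHz    = 0.25;  // LFO frequency
    const double phaseStep = 0.05;  // phase offset between neighbouring LEDs
    const int    levels    = 8;     // quantized brightness steps
    const double TWO_PI    = 6.283185307179586;

    std::vector<double> brightness(numLeds);
    double t = 1.0;                 // evaluate one frame at t = 1 s

    for (int i = 0; i < numLeds; ++i) {
        // phase-shifted sine LFO in [0,1], then quantized to discrete levels
        double lfo = 0.5 + 0.5 * std::sin(TWO_PI * (freqHz * t + phaseStep * i));
        brightness[i] = std::round(lfo * (levels - 1)) / (levels - 1);
        std::printf("LED %3d: %.2f\n", i, brightness[i]);
    }
    return 0;
}
```

The per-LED phase offset is what makes a wave appear to travel along the cluster; changing it, the frequency or the quantization reshapes both the light pattern and, through the sonification, the sound.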
A technical plan showing the layout of the elements in space, and a list of the hardware used.
A scheme representing the data flow between the different software and hardware elements.
To make this installation possible, we developed an openFrameworks application that generates clusters of phase-shifted oscillators. All of this oscillator data is encoded as a moving image and sent out through Syphon.
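A minimal sketch of what such an application could look like, assuming the ofxSyphon addon for texture sharing (the class name, resolution and oscillator parameters are illustrative, not the original code):

```cpp
#include "ofMain.h"
#include "ofxSyphon.h"

// Sketch of the approach: each oscillator's current value is written into one
// pixel of a one-row float image, and the whole cluster is published as a
// Syphon texture every frame.
class ClusterApp : public ofBaseApp {
    static constexpr int NUM_OSCILLATORS = 100; // simplified: one per tube
    ofFloatImage frame;                         // 100x1 grayscale texture
    ofxSyphonServer syphon;

public:
    void setup() override {
        frame.allocate(NUM_OSCILLATORS, 1, OF_IMAGE_GRAYSCALE);
        syphon.setName("Cluster");
    }
    void update() override {
        float t = ofGetElapsedTimef();
        for (int i = 0; i < NUM_OSCILLATORS; ++i) {
            // phase-shifted LFO per oscillator, as in the sketch above
            float v = 0.5f + 0.5f * sinf(TWO_PI * (0.25f * t + 0.05f * i));
            frame.getPixels()[i] = v;           // encode value as luminance
        }
        frame.update();                         // upload pixels to the GPU
    }
    void draw() override {
        syphon.publishTexture(&frame.getTexture());
    }
};

int main() {
    ofSetupOpenGL(400, 100, OF_WINDOW);
    ofRunApp(new ClusterApp());
}
```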
The resulting Syphon texture is analyzed by pixel-mapping software and mapped pixel-by-pixel to the LEDs.
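As a generic illustration of this step (the actual mapping software and channel layout are not reproduced here, so everything below is an assumption), each pixel of the texture can be translated into three consecutive RGB channels of a DMX-style output buffer:

```cpp
#include <cstdint>
#include <vector>

// Generic pixel-mapping sketch (assumed layout): pixel i of the texture
// drives LED fixture i, occupying three consecutive channels (R, G, B)
// in a DMX-style universe buffer.
std::vector<std::uint8_t> mapPixelsToChannels(const std::vector<float>& pixels) {
    std::vector<std::uint8_t> universe(pixels.size() * 3, 0);
    for (std::size_t i = 0; i < pixels.size(); ++i) {
        // 0..1 luminance to 8 bits; a grayscale source lights R, G and B equally
        std::uint8_t v = static_cast<std::uint8_t>(pixels[i] * 255.0f + 0.5f);
        universe[i * 3 + 0] = v; // red channel
        universe[i * 3 + 1] = v; // green channel
        universe[i * 3 + 2] = v; // blue channel
    }
    return universe;
}
```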
At the same time, the Syphon texture is analyzed by a custom-made MAX/MSP sound engine, which transforms the luminance of each pixel into volume data for a parallel set of audible oscillators. This means that the same control data that drives the LEDs also controls the sound, resulting in a highly synesthetic installation where a change in a single LED also means a change in sound.
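The real engine is a MAX/MSP patch, but the sonification principle can be sketched in C++ (the frequencies, tuning and scaling are assumptions): the luminance of each pixel sets the amplitude of one audible sine oscillator in a summed bank.

```cpp
#include <cmath>
#include <vector>

// Sketch of the sonification principle (the actual engine is a MAX/MSP
// patch): the luminance of each pixel drives the amplitude of one audible
// sine oscillator, and the whole bank is summed into the output.
struct SineBank {
    double sampleRate;
    std::vector<double> phase; // running phase per oscillator
    std::vector<double> freq;  // audible frequency per oscillator (Hz)

    SineBank(int n, double sr) : sampleRate(sr), phase(n, 0.0), freq(n) {
        for (int i = 0; i < n; ++i)
            freq[i] = 110.0 * std::pow(2.0, i / 24.0); // assumed quarter-tone spread
    }

    // One output sample; luminance[i] in [0,1] is the brightness of pixel i.
    double tick(const std::vector<float>& luminance) {
        const double TWO_PI = 6.283185307179586;
        double mix = 0.0;
        for (std::size_t i = 0; i < phase.size(); ++i) {
            mix += luminance[i] * std::sin(phase[i]);
            phase[i] += TWO_PI * freq[i] / sampleRate;
        }
        return mix / phase.size(); // normalize: each lit LED is an audible partial
    }
};
```

tick() would be called once per audio sample while the luminance values are refreshed at the video frame rate, so a change in a single pixel is immediately audible as a change in the level of its partial.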
A view of the oscillation generator (top), the resulting video texture (bottom left), and the 3D simulation.
The oscillation generator (openFrameworks), the sound engine (MAX/MSP), and the simulator (Unity).
A detail of the generated oscillators. At top left, the resulting texture sent through Syphon; below, the modulator and carrier oscillators; at top right, the modulator expressed as a 2D matrix.