Playmodes Studio

Espills

Light Sculpture, Ateneu de Celrà, September 2018


“Espills” is a solid-light dynamic sculpture. Built from laser beams, laser scanners and robotic mirrors, it is inspired by crystalline formations: a set of geometric figures that float in the air and suggest, in an abstract way, the transmutation of matter from chaos to order. Dust becoming crystal, being eroded and becoming sand again.

Each visual representation integrates its own sound design through sonification algorithms that transform light into music, completing this alchemical landscape.

Espills is also an exercise in digital craftsmanship. Most of the elements that make up the work, both hardware and software, were designed and built by ourselves: robotic mirrors, light-drawing tools, laser modules, audio synthesizers, scenic media…

This is ongoing research, and the piece hasn't reached its final form yet. At this point we have managed to create a realtime audiovisual instrument that can be played live, and we have made the first explorations with it, resulting in the different scenes you can see in the video documentation. Nevertheless, we believe there is greater potential in the expressiveness of the instrument, and the project will keep evolving through future iterations.

To make this installation possible, we started by sketching ideas and forms, from geometry to narrative concepts. A 3D simulation helped us visualize what the final form we had in mind could look like.


Once the ideas were defined, we started building hardware. By hacking existing DMX light fixtures, removing the light heads and replacing them with mirrors, we created a device that can aim laser beams through its mirrored reflection.
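As a rough illustration of what driving such a device involves, the sketch below maps a target mirror angle onto the 16-bit coarse/fine pan and tilt channel pairs that moving-head fixtures typically expose over DMX. The channel layout and angular ranges here are assumptions, not those of the fixtures we actually hacked:

```python
def angles_to_dmx(pan_deg, tilt_deg, pan_range=540.0, tilt_range=270.0):
    """Convert mirror pan/tilt angles (degrees) into 16-bit coarse/fine
    DMX channel pairs, as typical moving-head fixtures expect.
    The ranges are illustrative defaults."""
    def to_16bit(angle, full_range):
        norm = max(0.0, min(1.0, angle / full_range))  # clamp to 0..1
        value = round(norm * 65535)                    # 16-bit resolution
        return value >> 8, value & 0xFF                # coarse, fine bytes
    pan_c, pan_f = to_16bit(pan_deg, pan_range)
    tilt_c, tilt_f = to_16bit(tilt_deg, tilt_range)
    return [pan_c, pan_f, tilt_c, tilt_f]
```

Sixteen-bit resolution matters here: over a 540° pan range it gives roughly 0.008° per step, and a small mirror error is amplified over a long laser throw.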


We also built the electronic circuits and programmed the microcontroller firmware needed to integrate laser diodes into our Art-Net DMX ecosystem.
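The ArtDmx packets such a device consumes have a fixed header layout defined by the Art-Net specification. A minimal, illustrative parser might look like the following Python sketch (our actual firmware runs on a microcontroller, in C):

```python
import struct

def parse_artdmx(packet):
    """Parse an Art-Net ArtDmx packet into (universe, dmx_channels).
    Follows the fixed header layout of the Art-Net spec; returns None
    for anything that is not an ArtDmx packet."""
    if len(packet) < 18 or packet[:8] != b"Art-Net\x00":
        return None
    opcode = struct.unpack_from("<H", packet, 8)[0]    # little-endian
    if opcode != 0x5000:                               # OpDmx
        return None
    universe = struct.unpack_from("<H", packet, 14)[0]
    length = struct.unpack_from(">H", packet, 16)[0]   # big-endian
    return universe, list(packet[18:18 + length])
```

From there the firmware only has to pick its patched channel out of the universe and write it to the laser driver as a PWM duty cycle.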


We built a 4 m diameter octagonal profile to hold the different machinery.


In order to control all these elements, we created software using OceaNode, our own visual programming framework. OceaNode is based on banks of modular oscillators that can control hundreds of parameters through wave motion.
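The core idea of such an oscillator bank can be sketched in a few lines: each output lags its neighbour by a fixed phase offset, so a single low-frequency wave appears to travel across the whole parameter set. This Python sketch only illustrates the principle; the actual OceaNode implementation is an openFrameworks/C++ addon:

```python
import math

def oscillator_bank(t, n=8, freq=0.5, phase_spread=0.25, amp=1.0):
    """A bank of n phase-offset sine oscillators evaluated at time t.
    Output i lags output i-1 by phase_spread cycles, so one wave
    travels across the bank; values are normalised to 0..1."""
    out = []
    for i in range(n):
        phase = 2 * math.pi * (freq * t - phase_spread * i)
        out.append(amp * (0.5 + 0.5 * math.sin(phase)))
    return out
```

Feeding each output to a different mirror (or laser intensity) is what turns a simple sine into coordinated, organic motion across the sculpture.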


OceaNode is open source; you can download it and contribute at https://github.com/playmodesStudio/ofxoceanode

A sonification engine, built in the Reaktor audio programming environment, receives OSC data from OceaNode. This data is then mapped to audio parameters, transforming laser motion into sound.
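A toy example of one such light-to-sound mapping: normalised mirror angle to pitch on an exponential scale (so equal movements give equal musical intervals), and beam intensity to amplitude. The ranges and the mapping itself are illustrative assumptions, not the actual Reaktor patch:

```python
def sonify(norm_angle, intensity, f_lo=55.0, f_hi=1760.0):
    """Map a normalised mirror angle (0..1) to a frequency on an
    exponential pitch scale, and beam intensity (0..1) to amplitude.
    One plausible light-to-sound mapping; ranges are illustrative."""
    norm_angle = max(0.0, min(1.0, norm_angle))
    freq = f_lo * (f_hi / f_lo) ** norm_angle   # exponential pitch scale
    amp = max(0.0, min(1.0, intensity))
    return freq, amp
```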


This instrument is a customization of an ensemble that can be downloaded freely from the Reaktor user library: https://www.native-instruments.com/es/reaktor-community/reaktor-user-library/entry/show/9717/


Because we wanted to work with this instrument like any other instrument in a DAW environment, we developed a VST plugin that sends OSC data to OceaNode. This way we can automate envelopes on the timeline, using Reaper.
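The OSC packets themselves follow the OSC 1.0 binary format: a null-padded address string, a null-padded type-tag string, then big-endian arguments. A minimal encoder sketch (the real plugin is a compiled VST and would typically use an OSC library):

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message with float32 arguments,
    the kind of packet a DAW plugin could send to OceaNode over UDP."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg
```

One UDP datagram per automation tick is enough to drive an OceaNode parameter from a Reaper envelope.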


To start the content creation process, we set up the piece in a theatre in Celrà. We worked on site for two months, adjusting the hardware and software systems and finally creating the audiovisual scenes.


Although this exploration finally led us to a realtime audiovisual generative instrument, at an earlier stage we believed it was better to follow a timelined, narrative approach. We even created a whole soundtrack, which we ultimately discarded because the realtime approach felt much more powerful. Anyhow, you can hear the discarded soundtrack here:


info@playmodes.com