MID and playmodes
Blaus introduces us to the abstract realm of three-dimensional geometry in time.
It can be a cube or a blossoming flower, a grid or a jellyfish; a mutant entity of reflecting lights that submerges the audience in a symbolic universe, driven by hidden forces of the architecture.
Blaus is an immersive space where light and sound relate intimately to make an impact on the visitor.
The movement of light, sound and laser beams generates a kinetic atmosphere that transforms the architecture into the main character of a geometric play.
And the full performance:
Blaus is a site-specific installation. The whole visual and sound creation is dictated by the architecture of the site where it is held.
Blaus adapts itself and mutates at every location as a direct result of its perceptual origins: space, light and sound.
Through the use of self-made technologies, both hardware and software, Blaus re-draws the space by creating dynamic light figures that emerge from the architectural characteristics.
The mix of motorized laser beams and mirrors placed at key points of the room allows multiple dynamic combinations of three-dimensional forms.
The formal drama introduced by the RGB light design also helps reinforce the concepts behind each mutation.
At the same time, the architecture itself is used as a musical source: studying the room's acoustics and its resonance frequency allows the space to be used as a giant musical instrument.
Anchored on this resonance frequency and its harmonics, a sound composition is generated that matches and reinforces the architecture in the same way the lasers do, allowing the space to speak.
Light and sound are always related in the time, frequency and perceptive domains, as several self-made software algorithms link light, sound and movement in an intricate manner.
A poetic will, combined with convergent research into the musical and visual relationships of geometry, makes it possible to breathe life into the architecture.
We spent a whole week working on-site, playing with lasers, motors, mirrors and sound, and ended up with a setup that mixed 6 motorized and 2 static laser diodes, around 20 mirrors of different sizes (from 1 cm to 3 meters long), 2 RGB-DMX fixtures, and a sound PA with a big subwoofer.
One of the most interesting aspects of working with architecture is the ability to understand, map and re-interpret the space. In the cistern, we chose to accentuate certain geometrical aspects of the architecture using mirrors and lasers, but we also worked with acoustic mapping.
We managed to find the resonance frequency of the space, which happened to be 43 Hz. When a sine wave was played through a subwoofer inside this space, the whole architecture acted as a giant speaker, amplifier, or resonance box, augmenting the amplitude of the sound waves. The acoustic experience for the audience was highly physical: we built a soundtrack that took advantage of this resonant frequency, and the whole composition was made using 43 Hz and its harmonic series as its compositional source. In addition, the long reverberation of the space (approx. 12 seconds) was used creatively as a sound element.
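Deriving the pitch material described above is straightforward: the harmonic series of a fundamental is just its integer multiples. A minimal sketch (the function name and the choice of 8 partials are illustrative, not from the original production):

```python
# Sketch: the harmonic series of the cistern's measured 43 Hz resonance,
# which the text says served as the compositional source.
FUNDAMENTAL = 43.0  # Hz, measured resonance of the space

def harmonic_series(f0: float, n: int) -> list[float]:
    """Return the first n partials of fundamental f0: f0, 2*f0, 3*f0, ..."""
    return [f0 * k for k in range(1, n + 1)]

partials = harmonic_series(FUNDAMENTAL, 8)
# partials -> [43.0, 86.0, 129.0, 172.0, 215.0, 258.0, 301.0, 344.0]
```

Tuning oscillators to these partials (rather than to equal temperament) is what lets the room itself resonate with the composition.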
In order to compose and play the 12 laser beams, servo motors, lights and music, we used the Duration software by James George (NYC). Some time ago James developed ofxTimeline, an addon for the openFrameworks libraries that aims to provide a timeline tool for editing, recording and playing back data tracks, with output to MIDI, OSC, Serial and DMX signals.
We collaborated on testing Duration in early September 2012, adding some custom code for our own project. The relationship between ofxTimeline and Playmodes started on our first laser project: BlueBeams.
Here you can see a screenshot of our version of Duration, which drove the lasers' intensity and rotation values.
There are many tracks for creating or composing output data signals; in this screenshot we see the color track, which drove several Par LED lights via LanBox hardware over DMX.
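Duration speaks to downstream hardware and software over OSC, which on the wire is just a null-padded address string, a type-tag string, and big-endian arguments. A minimal stdlib-only encoder sketch (the `/laser/1/intensity` address is an illustrative assumption, not Duration's actual naming):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-pad a byte string to a multiple of 4, per the OSC 1.0 spec.
    Strings always get at least one terminating null."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode())                    # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode())  # type-tag string
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg

# The kind of packet a timeline track could emit each frame
# (hypothetical address, value in 0..1):
packet = osc_message("/laser/1/intensity", 0.75)
```

Sending such packets over UDP to each device is what lets one timeline orchestrate lasers, motors and lights together.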
‘radial’ and ‘blaus’ are two interactive lighting installations, both a result of the collaboration between catalonia-based digital research collectives MID (media interactive design) and playmodes. ‘blaus’ introduces the abstract realm of three dimensional geometry through the mediums of audio and illumination – this could manifest as a cube or a blossoming flower, a grid or a jellyfish, a mutant framework of reflecting lights which submerges the audience into a multi-faceted universe, driven by hidden forces of the architecture. ‘blaus’ is an immersive space where audio-visual elements relate intimately to impact on the visitor. the process led the designers to build most of the software and hardware elements themselves, by means of algorithm design, digital fabrication techniques and craft handwork. on the hardware side, the use of open source technologies, such as arduino, allowed the team to create a flexible electronic system easily addressable by open sound control data. ‘duration’, open source software by james george, was used to independently control, compose, and play a full score for the laser diodes, servomotors, lights and music. on the sound design side, all music and sound effects are made through the use of audio programming environments such as pure data and reaktor.
custom digital instruments were made to exactly match the resonant frequency of the space and its harmonics.
‘radial’ performance: ‘radial’ is a combination of ‘the particle’ – a kinetic light sculpture by multidisciplinary artist alex posada – and ‘blue beams’, a set of 8 motorized blue laser diodes. forming an octagon of lasers around the particle, the light-formed structure leaves enough space for 130 people to sit comfortably in sunbeds and enjoy the show. every laser beam has an associated speaker, for a total of 8 audio channels. the team of designers, engineers, artists and programmers created a multichannel synthesizer driven by an 8-phase LFO that generates synchronized sound and laser control through the OSC protocol. every laser has an associated sound oscillator, played through its corresponding speaker. some parameters of this oscillator, such as pitch, amplitude or filter, are tied to the brightness of the laser light and its movement in one axis. by composing music with this synthesizer, light and movement respond together to generate a dynamic show.
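The 8-phase LFO described above can be understood as one oscillator whose output is tapped at eight evenly spaced phase offsets, one per laser/speaker pair. A sketch of that idea (a generic illustration of the technique, not the original implementation; rate and normalization are assumptions):

```python
import math

def eight_phase_lfo(t: float, rate_hz: float = 0.5, n: int = 8) -> list[float]:
    """One LFO distributed over n outputs, each offset by 1/n of a cycle.

    Returns n values normalized to [0, 1]; each could drive one laser's
    brightness and, mapped to pitch/amplitude/filter, its oscillator.
    """
    values = []
    for k in range(n):
        phase = 2 * math.pi * (rate_hz * t + k / n)  # k-th phase tap
        values.append(0.5 + 0.5 * math.sin(phase))   # normalize to [0, 1]
    return values

# at t = 0 the eight outputs are spread across one full cycle
snapshot = eight_phase_lfo(0.0)
```

Because each output feeds both a laser and the matching speaker channel, sound and light at any seat stay phase-locked by construction.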
Duration is a new open source timeline tool, similar to the timelines found in DAWs, but oriented toward… “special” projects.
According to its developers, “the basic goal of Duration is to act as an ‘orchestrator’ of multiple audiovisual events, devices or computers, through the OSC protocol (Open Sound Control – that is, the MIDI of the future)”.
To that end, Duration offers:
Automation tracks based on Bézier curves.
Color-control tracks using configurable RGB gradients (to drive a lighting system, for example).
Trigger tracks for binary (ON/OFF) events.
Insertion of audio tracks.
A magnetic grid snapped to the BPM.
FFT analysis of the inserted audio tracks (in case we want to use that data to drive motors, for example).
In short, it is a tool that inherits many behaviors from audio sequencers, but is oriented toward writing “multimedia scores”. From a single timeline it can control every element of a multimedia performance or installation.
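The Bézier automation tracks mentioned above interpolate between keyframes with a cubic curve. A generic evaluation sketch (this illustrates the math, not Duration's own code; the control values are made up):

```python
def cubic_bezier(p0: float, p1: float, p2: float, p3: float, t: float) -> float:
    """Evaluate a 1-D cubic Bezier at parameter t in [0, 1].

    p0 and p3 are the keyframe values; p1 and p2 shape the ease
    between them, which is how a timeline turns two keyframes
    into a smooth automation curve.
    """
    u = 1.0 - t
    return u**3 * p0 + 3 * u**2 * t * p1 + 3 * u * t**2 * p2 + t**3 * p3

# sampling an ease-like segment from 0.0 to 1.0 at five points
samples = [cubic_bezier(0.0, 0.1, 0.9, 1.0, i / 4) for i in range(5)]
```

Sampling such a curve once per frame and emitting the value over OSC is the essence of an automation track.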
Duration was developed by James George with the support of the YCAM InterLab. The audiovisual collective Playmodes also took part in its development and, together with Álex Posada, has created a pair of installations that demonstrate the possibilities of this new tool.