Matthias Lohscheidt
Evolution Of Silence
source: mlohscheidt.de
The work »Evolution Of Silence« portrays a virtual world populated by artificial creatures. Each creature produces a tone defined by its DNA and is visually represented by a ray of light inside a holographic projection. The population of these creatures evolves steadily according to an evolutionary algorithm.
The virtual world provides an inaudible key as an environmental condition. The better a creature's tone fits the key, the louder it sounds, the stronger and broader its ray of light is, the more attractive it is to the other creatures, and the longer it lives. Thus it can produce many offspring by mating with other creatures. Deviant tones, however, are quiet, have a very thin ray of light, and die relatively quickly.
This way, the population inside the world evolves from a disharmonic and chaotic soundscape to a harmonic structure, fitting more and more to the key provided by the environment.
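The selection principle described above can be sketched in a few lines of Java. This is an illustrative reconstruction, not the artist's actual code: it assumes a creature's "DNA" is reduced to a pitch class (0–11) and that fitness falls off linearly with the circular distance between the creature's tone and the environment key.

```java
// Illustrative sketch of the fitness idea in »Evolution Of Silence«:
// a creature's tone is a pitch class (0..11); fitness measures how
// closely it fits the current (inaudible) environment key.
public class SilenceSketch {
    static final int TONES = 12;

    // Circular distance between two pitch classes (0..6 semitones).
    static int toneDistance(int a, int b) {
        int d = Math.abs(a - b) % TONES;
        return Math.min(d, TONES - d);
    }

    // Hypothetical fitness: 1.0 for a perfect match with the key,
    // falling linearly to 0.0 at the tritone (maximum distance).
    // Loudness, ray width and lifespan would all scale with this value.
    static double fitness(int tone, int key) {
        return 1.0 - toneDistance(tone, key) / 6.0;
    }

    public static void main(String[] args) {
        int key = 0; // current environment key; changes randomly every 2-3 minutes
        System.out.println(fitness(0, key)); // perfect fit
        System.out.println(fitness(6, key)); // tritone, maximally deviant
    }
}
```

Under this assumed mapping, a population reproducing in proportion to fitness drifts toward tones near the key, which is exactly the harmonisation the piece makes audible.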
After two to three minutes the environment key changes randomly, and the soundscape immediately becomes chaotic again. The population adapts to this new environment over the course of several generations.
The visitor can influence the evolutionary process by touching the rays of light belonging to the creatures and thus “feeding” them. If the weak, quiet creatures with deviant, disharmonic tones, a thin ray of light, and imminent death are touched or “fed”, they immediately become strong and loud. They thereby become much more attractive to the other creatures and can produce many offspring. In this way the process of harmonisation driven by the evolutionary algorithm is slowed down and complemented by disharmonies.
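The feeding interaction can likewise be sketched as a simple override of a creature's evolved state. The class and field names here are hypothetical, chosen only to illustrate the mechanic described above:

```java
// Illustrative sketch of the "feeding" interaction: a touched creature's
// energy is boosted to the maximum regardless of how well its tone fits
// the key, so even a deviant tone temporarily sounds loud and attracts mates.
class Creature {
    double fitness; // 0.0 (deviant tone) .. 1.0 (fits the key)
    double energy;  // drives loudness, ray width, lifespan, attractiveness

    Creature(double fitness) {
        this.fitness = fitness;
        this.energy = fitness; // normally energy follows fitness
    }

    // Called when a visitor touches this creature's ray of light.
    void feed() {
        energy = 1.0; // fed creatures become strong and loud immediately
    }
}
```

Because mate selection would follow `energy` rather than raw fitness in this sketch, a fed deviant creature spreads its disharmonic DNA, slowing the harmonisation.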
»Evolution Of Silence« is the result of a series of experiments where I applied evolutionary algorithms to the emergent generation of sounds.
Following JOHN CAGE, »Silence« is interpreted as the occurrence of all sounds that arise inside an environment through the very existence of all beings and elements, without any intention associated with those sounds: the artificial beings of the virtual world don't sound to fulfill a musical purpose; they sound simply because they exist.
Only the environment, with its progressing evolutionary process, judges every tone and sets it into a harmonic context. This context attributes an intention to every tone, which is, however, only valid within that context.
All software is written in Java, based on the PROCESSING library. The sound is generated with SuperCollider, which is controlled by the Java application through Daniel Jones's great SUPERCOLLIDER-CLIENT for Processing. Tracking is realized with a Kinect and Daniel Shiffman's OPENKINECT library.
Matthias Lohscheidt, born in 1986, lives, studies, and works in Augsburg, Germany. He is currently pursuing his master's degree in »Design- & Communication-Processes« at the University of Applied Sciences, Augsburg, experimenting and prototyping in quite diverse fields of interest, mainly with PROCESSING and VVVV.
Besides that, he is one of the publishers of DER GREIF, a magazine for photography and literature.