JACK KALISH
Sound Affects
The video is from the live performance that took place on December 11th, 2011, at Cameo Gallery in Brooklyn.
Sound Affects is a musical interface that extracts, analyzes, amplifies, and sonifies the affective states of the user. Emotional expression becomes a soundscape. Sound Affects was developed as a performance piece for New Interfaces for Musical Expression, a showcase of novel musical instruments that took place on December 18th at Cameo Gallery in Williamsburg, Brooklyn.
Personal Statement
What exactly is music, and why can it instill such strong emotions in people? This is the question I sought to explore when I approached this project. One theory holds that music simulates the natural cycles and rhythms of our bodies: a loud, fast bass line excites and arouses us because it simulates the loud, fast heartbeat we might feel in our own chests. In this project, I take the idea that music instills emotion and turn it on its head by having emotion create music.
Background
I have been doing research on the nature of emotion over the past year. Emotion can be defined as the conscious recognition of physiological changes in the body that occur subconsciously in response to external stimuli. There are many ways in which such physiological states can be quantitatively measured. In this project I decided to explore a handful of these: facial expression, heart rate, and galvanic skin response.
Audience
Sound Affects was originally designed as a performance piece featuring two on-stage performers.
For the ITP Winter Show, I decided to create an interactive version of this project to allow others to make music with their emotions.
User Scenario for ITP Winter Show
A user is presented with a screen, a GSR sensor, and a pair of headphones. The user sees their own reflection in the computer screen. When the user makes a face at the screen (for example, a smile), different sounds are triggered in response to different facial expressions. Placing a hand on the GSR sensor produces notes in response to the skin's conductivity. The user also hears my heartbeat as it is amplified through a microphone, and may be able to attach a stethoscope to hear their own heartbeat as well.
Implementation
Sound Affects uses ofxFaceTracker, an open-source face-tracking addon for openFrameworks. To capture the heartbeat sound, a stethoscope head is outfitted with a microphone; the signal is then filtered and distorted in Max/MSP. I have also built a simple GSR sensor using custom electronics, an Arduino, and Processing: the Arduino digitizes the GSR signal and sends it to Processing, which smooths the readings by averaging and outputs them as MIDI, which GarageBand captures to trigger musical notes.
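To make the face-to-sound stage concrete, here is a minimal openFrameworks sketch in the spirit of the pipeline described above. It is an illustrative reconstruction, not the code from the piece itself: the gesture name follows ofxFaceTracker's Gesture enum, while the smile threshold and the smile.wav sample are assumptions.

```cpp
// ofApp.h -- a minimal sketch of the expression-to-sound stage, assuming
// the ofxCv and ofxFaceTracker addons are installed. The smile threshold
// and the sample file are illustrative guesses, not values from the piece.
#pragma once
#include "ofMain.h"
#include "ofxCv.h"
#include "ofxFaceTracker.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        cam.setup(640, 480);            // the on-screen "mirror"
        tracker.setup();
        smileSound.load("smile.wav");   // hypothetical sample
    }

    void update() {
        cam.update();
        if (!cam.isFrameNew()) return;
        tracker.update(ofxCv::toCv(cam));
        if (!tracker.getFound()) return;

        // MOUTH_WIDTH grows as the mouth stretches into a smile.
        float mouthWidth = tracker.getGesture(ofxFaceTracker::MOUTH_WIDTH);
        if (mouthWidth > smileThreshold && !smileSound.isPlaying()) {
            smileSound.play();          // one expression, one sound
        }
    }

    void draw() {
        cam.draw(0, 0);                 // show users their reflection
        tracker.draw();                 // overlay the tracked face mesh
    }

private:
    ofVideoGrabber cam;
    ofxFaceTracker tracker;
    ofSoundPlayer smileSound;
    float smileThreshold = 14.0f;       // hand-tuned, illustrative only
};
```

The Arduino side of the GSR chain can be sketched just as briefly. This version assumes the electrodes form a simple voltage divider feeding analog pin A0 and that Processing reads one value per line over serial; the pin, baud rate, and sample rate are all assumptions, with the averaging and MIDI conversion left to Processing as described above.

```cpp
// GSR sketch -- reads skin conductance and streams it to Processing,
// which handles the averaging and MIDI output. Pin, baud rate, and
// sample rate are illustrative assumptions.
const int GSR_PIN = A0;

void setup() {
    Serial.begin(9600);                     // Processing listens on this port
}

void loop() {
    int conductivity = analogRead(GSR_PIN); // 0-1023, rises with arousal
    Serial.println(conductivity);           // one reading per line
    delay(50);                              // roughly 20 samples per second
}
```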
About the Artist
Jack Kalish (b. 1984, Kiev) is a new media artist, designer, and software developer living and working in Brooklyn. He is interested in the use of art and technology as a means of exploring new ideas through engagement and immersion. His work engages with questions about perception, language, emotion, data, and artificial intelligence.
He received his master's degree from the Interactive Telecommunications Program (ITP) at NYU in May 2012 and a BA in New Media Design from Rochester Institute of Technology in 2006.