highlike

MOUTH CTRLER

source: francescaperona.com

Science Gallery London commission on the occasion of the Mouthy season (2016).

Commissioned to develop a project in collaboration with researchers at King’s College London.

A collaboration with designer Luca Alessandrini and Dr. Michelle Korda.

I led the project from conception to final delivery.

Mouth CTRLer is a transdisciplinary project combining scientific findings about the sensing and sensory capabilities of the oral cavity with prosthetics and interactive technologies. It investigates tangible technological possibilities for human enhancement inside the mouth in the form of wearable prototypes.

The project was developed in collaboration with experts from different scientific and creative fields. In particular, we would like to thank:

Dr. Trevor Coward – King’s College London, for his consultancy on prosthetic fabrication materials and techniques, facial 3D scanning and mouth casts.

Dr. Shama Rahman – PhD in Neuroscience and Complexity of Musical Creativity, for her extensive involvement in the development of the Taste of Music sound compositions and the overall multi-sensory experience.

Nuala Clooney – Conceptual Jeweller, for the development of the metallic jewellery prototypes.

Matteo Rossetti – Data Analyst, for his help in analysing the qualitative data collected during the event/exhibition.

The mouth has been considered a vital tool for completing ordinary tasks, complementing the hands in everyday rituals since the dawn of humankind. With the increasing integration of digital technologies into our lives, our range of bodily interactions in HCI has become standardised.

Starting by researching existing mouth-based assistive technologies for people with disabilities and physical impairments, the team mapped active and passive actors in the oral cavity, assigning interactive functionalities to each area. The objective was to better understand how to intuitively augment the array of gestures available for digital applications.
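
As a purely illustrative sketch (the region names and role assignments below are hypothetical examples, not the mapping the team produced), such a map of active and passive oral actors could be expressed as a simple lookup structure:

# Hypothetical sketch: oral-cavity regions and the kind of interaction role
# each might play in a mouth-based controller. The specific assignments are
# illustrative only, not the project's actual mapping.
MOUTH_REGION_ROLES = {
    "tongue_tip":  {"actor": "active",  "candidate_role": "pointing / cursor control"},
    "tongue_body": {"actor": "active",  "candidate_role": "continuous control (pressure, position)"},
    "teeth":       {"actor": "active",  "candidate_role": "discrete clicks via bite"},
    "lips":        {"actor": "active",  "candidate_role": "sip-and-puff style switches"},
    "hard_palate": {"actor": "passive", "candidate_role": "touch target surface for the tongue"},
    "cheeks":      {"actor": "passive", "candidate_role": "vibration / feedback surface"},
}

def regions_by_actor(kind: str) -> list[str]:
    """Return the regions classified as 'active' or 'passive' actors."""
    return [region for region, info in MOUTH_REGION_ROLES.items() if info["actor"] == kind]

if __name__ == "__main__":
    print("Active regions:", regions_by_actor("active"))
    print("Passive regions:", regions_by_actor("passive"))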

Mouth controllers currently available on the market aim to enable people with severe disabilities and reduced mobility to interact with computer interfaces and navigate physical spaces. We challenged current mouth controller applications, thinking creatively about different ways of enhancing daily activities with mouth-controlled, tech-enabled devices.

As part of the project development, we ran two hands-on and ‘mouths-on’ co-design workshops that engaged different audiences (local communities of young adults as well as the general public) in uncovering the potential of the mouth and cooperatively designing future enhanced applications.

Inspired by cross-modal perception studies, we explored the link between the sensory modalities of taste, texture and sound.

We developed hybrid mouthpieces composed of an edible sleeve (a lollipop) plugged directly into a bone conduction transducer.

The lollipop shape was designed to fit all kinds of mouths and to transfer the sound vibrations produced by the electronic device to the surface of the lips and teeth when users gently bit the lollipop or rested their lips on it.

The device allowed users to perceive sounds directly through the mouth rather than through the ears, whilst simultaneously experiencing a release of tastants.

We designed two musical compositions, engineered to modulate taste perception through selected pure tones, filters, pitches and variable intensities, creating sharper or duller taste experiences.

Each composition was designed in a sculptural manner, producing textural vibrations that resonate on different parts of the mouth with variable intensity. The sounds were developed in collaboration with Dr. Shama Rahman, who specialises in the neuroscience and complexity of musical creativity.
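
As a rough illustration of the kind of audio material involved (not the actual compositions, whose frequencies and envelopes are not documented here), a pure tone with a slowly varying intensity envelope can be synthesised in a few lines and written out for playback through a bone conduction transducer:

# Illustrative sketch only: synthesise a pure tone with a variable intensity
# envelope. The frequency and modulation rate are assumptions, not the
# project's actual composition parameters.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44_100  # samples per second
DURATION = 10.0       # seconds
FREQUENCY = 440.0     # Hz; an assumed pure-tone pitch

t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
tone = np.sin(2.0 * np.pi * FREQUENCY * t)

# Slow amplitude modulation (0.25 Hz) to vary perceived intensity over time.
envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * 0.25 * t))
signal = tone * envelope

# Scale to 16-bit integers and write a WAV file that a transducer amplifier can play.
wavfile.write("pure_tone_modulated.wav", SAMPLE_RATE,
              np.int16(signal / np.max(np.abs(signal)) * 32767))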

We tested the experience on an audience of more than 60 participants. We asked each participant to describe the difference in taste between the two compositions and to write down their feedback. We then analysed the feedback and, through the data collected, showed that the sounds produced shaped the sensory experience as expected.
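
A minimal sketch of how such qualitative feedback might be tallied is shown below; the entries and keywords are invented for illustration and are not the data collected during the exhibition.

# Hypothetical sketch: count how often participants described each composition
# with a given taste word. The feedback entries below are invented examples.
from collections import Counter

feedback = [
    {"composition": "A", "description": "sharper, more acidic"},
    {"composition": "A", "description": "bright and sharp"},
    {"composition": "B", "description": "duller, sweeter"},
    {"composition": "B", "description": "rounder, dull"},
]

KEYWORDS = ("sharp", "dull", "sweet", "bitter", "sour")

counts = {"A": Counter(), "B": Counter()}
for entry in feedback:
    for word in KEYWORDS:
        if word in entry["description"].lower():
            counts[entry["composition"]][word] += 1

for composition, tally in counts.items():
    print(f"Composition {composition}: {dict(tally)}")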

The aim of the designed experience was to highlight the convergence of various sensory perceptions in the mouth and to pose the question: could future technologies harness cross-modal perception to modulate and enhance our perception through multi-sensory devices?

source: lucaalessandrini.com

The project “Mouth CTRLer” was created for Science Gallery London with Francesca Perona.

Through two co-creative workshops with people with disabilities from a community in south London and with teenagers from socially deprived areas, the group created two interactive installations about breaking boundaries and redesigning the everyday use of the mouth in the future.

The first installation allowed visitors to control a video game with the mouth; the second consisted of lollipops through which it was possible to hear music via bone conduction. Jewellery maker Nuala Clooney collaborated on the project, creating concept jewellery for future mouth-worn music-reading accessories. The composer, musician and neuroscientist Shama Rahman composed the music for the installation, allowing data to be collected about the relationship between taste and music.