
Motion Origami

MAX/MSP patch with Leap Motion

The Motion Origami patch explores live performance strategies with gesture-based control of sound. It features granular sound synthesis, sound transformations and spatial sound distribution.


touching the sound

The Motion Origami project focuses on exploring the possibilities of hand gestures in music performance. The sensor of choice, the Leap Motion controller, provides a perfect interface for physical gesture interaction with live sound.

In fact, you have to learn to operate the interface much as you would learn a new musical instrument, gradually gaining the kind of virtuosity expected of a regular instrumentalist.

The idea behind this project is to create a musically attractive interface that makes it possible to extract and compose sound themes during a live performance using hand gestures alone.

The name of the project, Motion Origami, comes from the idea of 'folding' sounds with hand gestures into new soundscapes: new original ideas emerge from simple movements, just as in the ancient Japanese art of paper origami.

The Motion Origami patch is built in MAX/MSP and enables multi-vector interaction within the signal processing domain.

Leap Motion

The sensor of choice is the Leap Motion controller, which provides an extensive data stream describing hand movements and their respective gestures. The original idea was to use ultrasonic or infrared distance sensors, but the Leap Motion SDK offered a richer and more robust data flow to experiment with instead.

Motion Origami patch design

The patch recognizes a specific quantized gesture and fills the buffer.
A closed-hand gesture, meaning "grab the sound", fills the buffer with live audio. The buffer is then accessed through hand movements by the sogs~ granular synthesis object.
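This grab-and-record behaviour can be sketched in plain Python (a sketch only: the class, thresholds and buffer length are illustrative assumptions; the actual patch does this with Max objects feeding sogs~):

```python
from collections import deque

SAMPLE_RATE = 44100       # assumed audio rate
BUFFER_SECONDS = 2.0      # assumed buffer length

class GrabRecorder:
    """While the hand is closed ('grab the sound'), incoming audio
    blocks are written into a fixed-size buffer that a granular
    engine (sogs~ in the actual patch) can later read from."""

    def __init__(self):
        self.buffer = deque(maxlen=int(SAMPLE_RATE * BUFFER_SECONDS))
        self.recording = False

    def on_hand(self, grab_strength):
        # grab_strength: 0.0 = open palm, 1.0 = closed fist
        self.recording = grab_strength > 0.9

    def on_audio(self, block):
        # block: iterable of audio samples from the live input
        if self.recording:
            self.buffer.extend(block)
```

Opening the hand simply stops further writes, so the last grabbed sound stays in the buffer for granulation.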

The grain size and its position in the buffer can be intuitively selected by hand gestures. The hand gestures also control the sound's position in space, while palm roll and pitch rotation can be used for optional FX processing.
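A mapping of this kind can be sketched as a pure function from palm coordinates to synthesis parameters. The coordinate ranges and parameter names below are assumptions for illustration, roughly matching the Leap's interaction volume in millimetres and radians:

```python
def scale(value, lo, hi):
    """Clamp and normalise value from [lo, hi] to [0, 1]."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def map_hand_to_grains(palm_x, palm_y, palm_roll, palm_pitch):
    """Map palm position/orientation to granular and FX parameters."""
    return {
        # horizontal position selects where in the buffer to read grains
        "grain_pos": scale(palm_x, -200.0, 200.0),
        # height selects grain size, here 10-500 ms
        "grain_size": 0.01 + 0.49 * scale(palm_y, 100.0, 400.0),
        # palm roll and pitch drive optional FX processing depths
        "fx_a": scale(palm_roll, -1.5, 1.5),
        "fx_b": scale(palm_pitch, -1.5, 1.5),
    }
```

Clamping at the edges keeps the mapping stable when the hand briefly leaves the sensor's sweet spot.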

The actual object importing data into MAX/MSP is Swirly Leap, which is based on SDK 2 and can take full advantage of hand-skeleton tracking, available only from SDK version 2 onwards.

to interact

After performing the 'grab the sound' gesture (all fingers closed), the current live sound stream is recorded into the buffer in the patch. Hand movement parameters are then linked to the control parameters described above: grain size, grain position in the buffer, spatial sound position and FX processing.
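As one concrete example of such a linkage, the spatial sound position could be derived from the horizontal palm position. This is a sketch under assumed coordinate conventions; the actual patch feeds such an angle to CNMAT's ambisonic externals:

```python
import math

def palm_to_azimuth(palm_x, palm_z):
    """Map horizontal palm position (mm) to a panning azimuth in
    degrees, with 0 degrees straight ahead of the performer.
    The axis convention here is an assumption."""
    return math.degrees(math.atan2(palm_x, -palm_z))
```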

Motion Origami patch features



credits and realization details

The project was realized as part of the research grant 'Sensors in music performance', led by Michal Rataj at HAMU, the Music Academy in Prague.

Leap Motion advisory Matthew Ostrowski (performing artist, Harvestworks, NYC)

Swirly Leap MAX/MSP object Tom Ritchford (freelance programmer, NYC)

MAX/MSP programming advisory Michal Rataj (Music Academy in Prague, CZ)

Ambisonic objects CNMAT (Center for New Music & Audio Technologies, Berkeley)

sogs~ object IRCAM (Institut de Recherche et Coordination Acoustique/Musique, Paris)

Leap Motion inspiration Jan Hrdlička (3D Sense, Prague)