
BODY IN INTERACTIVE SPACE

An interactive multimedia system for artistic performances combines computer-generated sounds and visual media with the real environment in which a performance takes place.

This interactive system is capable of music composition, improvisation, and performance using body movements. The performers can freely improvise their movements and map these improvised gestures to music generated in real time.
The body can also be considered a virtual instrument through which music is produced by movement, opening new possibilities for music performance and composition.
The position of the body is checked against user-defined zones. MIDI messages are generated or samples started each time an object appears or disappears in a zone or moves within a zone.
These events can, for instance, cause MIDI notes or samples to be switched on or off. Other MIDI messages can be generated from some of the object's parameters, such as its position, speed, and size.
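As a minimal sketch of the zone logic described above (not the EyeCon implementation itself), the following Python fragment shows the idea; the Zone class, the send_midi stub, the coordinates and the note numbers are purely illustrative assumptions:

    # Stand-in for a real MIDI output; the actual system sends to PD/ABSynth.
    def send_midi(message, **params):
        print(message, params)

    class Zone:
        def __init__(self, x0, y0, x1, y1, note=60):
            self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1
            self.note = note
            self.occupied = False

        def contains(self, x, y):
            return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

        def update(self, x, y, speed):
            inside = self.contains(x, y)
            if inside and not self.occupied:        # object appears in the zone
                send_midi("note_on", note=self.note, velocity=100)
            elif not inside and self.occupied:      # object disappears from the zone
                send_midi("note_off", note=self.note)
            elif inside:                            # object moves within the zone
                send_midi("control_change", control=1,
                          value=min(127, int(speed * 127)))
            self.occupied = inside

    zone = Zone(0.2, 0.2, 0.5, 0.8, note=64)
    for (x, y), speed in [((0.1, 0.5), 0.0), ((0.3, 0.5), 0.4),
                          ((0.4, 0.6), 0.2), ((0.7, 0.5), 0.1)]:
        zone.update(x, y, speed)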

The motion of human bodies can be used to trigger or control various other media (music, sounds, photos, films, lighting changes, etc.).
If a person then moves into the video image and some part of their body touches one of the elements you have drawn, an event can be triggered; for example, a certain sound might be heard.
The program can measure the amount of motion occurring within that field. Additional features let you track the position of persons within the performance area, measure their height, their width, their overall size, or the degree of left-right symmetry in their shape.
These control elements may each be assigned a different output, providing the possibility of rich relationships between dance and music in interactive systems.
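The shape measurements named above can be illustrated with a small sketch; the binary-grid representation and the simple mirror-based symmetry measure below are assumptions for illustration only, not how EyeCon computes them:

    def silhouette_features(grid):
        """grid: list of rows, each a list of 0/1 pixels (1 = body)."""
        points = [(x, y) for y, row in enumerate(grid)
                         for x, v in enumerate(row) if v]
        if not points:
            return None
        xs = [x for x, _ in points]
        ys = [y for _, y in points]
        size = len(points)                      # overall size (pixel count)
        width = max(xs) - min(xs) + 1
        height = max(ys) - min(ys) + 1
        # symmetry: fraction of pixels whose mirror image about the body's
        # vertical centre line is also part of the silhouette
        cx = (min(xs) + max(xs)) / 2.0
        pset = set(points)
        mirrored = sum(1 for x, y in points if (round(2 * cx - x), y) in pset)
        symmetry = mirrored / size
        return {"size": size, "width": width, "height": height,
                "symmetry": symmetry}

    example = [
        [0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 1, 1, 0],
    ]
    print(silhouette_features(example))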
The dancer, through her movement, is empowered by having control over the music to which she is dancing.
The music is played by a group of dancers through their movements in space, where the various parameters of human movement are mapped into the world of sound.
The term ‘mapping’ means the application of a given set of gestural data, obtained via a sensor system, to the control of a given sound synthesis parameter.
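In the simplest case such a mapping is just a rescaling of one gestural value onto the range of one synthesis parameter; the sketch below, with invented ranges, shows the idea:

    def map_range(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly map a gestural value onto a synthesis-parameter range."""
        t = (value - in_lo) / float(in_hi - in_lo)
        t = max(0.0, min(1.0, t))               # clamp to the valid range
        return out_lo + t * (out_hi - out_lo)

    # e.g. map the tracked amount of motion (0..1) to a filter cutoff in Hz
    cutoff = map_range(0.35, 0.0, 1.0, 200.0, 8000.0)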
The collaboration between music and dance on this piece was complete; that is, the movement and sound were not designed separately, but interactively.
The choreography is affected by the live generation of sound through the use of sensors and real-time synthesis, and the resulting music is in turn shaped by these movements.
The real-time sound synthesis environment was designed in PureData and ABSynth.
A PC running the real-time tracking software (EyeCon), PD and ABSynth is linked via an internal MIDI interface, sending the gestural data gathered by the tracking software to the real-time sound synthesis parameters.
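Sending such gestural data as MIDI controller messages could, for example, look like the following; the mido library, the controller numbers and the port name are assumptions and not part of the original setup:

    import mido

    def send_gesture(port, motion_amount, position_x):
        # scale 0..1 gestural values to 0..127 MIDI controller values
        port.send(mido.Message('control_change', control=20,
                               value=int(motion_amount * 127)))
        port.send(mido.Message('control_change', control=21,
                               value=int(position_x * 127)))

    # port = mido.open_output('loopMIDI Port')   # name depends on the MIDI setup
    # send_gesture(port, 0.42, 0.80)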

The PD and ABSynth programs form the musical synthesis environment, which provides many control parameters addressing a number of built-in DSP modules, including granular sampling/synthesis, additive synthesis, and spectral filtering.
Both the EyeCon and PD/ABSynth software components are organized as a series of 'scenes', each describing a unique configuration of video tracking, mapping, samples and DSP.
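A rough sketch of what such a scene configuration could bundle together follows; all field names and example values are hypothetical:

    from dataclasses import dataclass, field

    @dataclass
    class Scene:
        name: str
        tracking: dict = field(default_factory=dict)   # e.g. zones, fields to measure
        mapping: dict = field(default_factory=dict)    # gesture parameter -> synthesis parameter
        samples: list = field(default_factory=list)    # samples used in this scene
        dsp: list = field(default_factory=list)        # active DSP modules

    scenes = [
        Scene("opening",
              tracking={"zones": 2, "measure": ["motion", "position"]},
              mapping={"motion": "grain_density", "position_x": "filter_cutoff"},
              samples=["breath.wav"],
              dsp=["granular_sampling", "spectral_filter"]),
        Scene("duet",
              tracking={"zones": 4, "measure": ["symmetry", "height"]},
              mapping={"symmetry": "detune", "height": "amplitude"},
              dsp=["additive_synthesis"]),
    ]

    current = scenes[0]   # the performance advances from scene to scene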

 
