Most sounds, especially musical ones, are created by physical actions. The gestures musicians make while producing sound therefore have a visual life of their own.
Gestural interfaces differ from traditional instruments in that they cannot produce sound by themselves. They merely send signals that are turned into sound by a computer or a sound module. They can be regarded as an interface between the performer and the computer insofar as they translate the energy of body movements into electrical signals. These electrical signals can in turn control physical musical instruments via an electronic interface. Three simple instruments are played through such an electronic trigger interface:

- gong
- singing bowl
- water drop
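The routing from control signals to the three instruments can be sketched as a simple dispatch table. This is an illustrative assumption, not the actual control software: the note numbers and trigger names below are hypothetical.

```python
# Hypothetical mapping from incoming MIDI note numbers to the three
# trigger channels; the note numbers and names are illustrative
# assumptions, not part of the original system.
TRIGGERS = {
    60: "gong",          # electromagnet strikes the gong
    62: "singing_bowl",  # electromagnet strikes the singing bowl
    64: "water_drop",    # valve releases a single water drop
}

def dispatch(note: int):
    """Return the trigger to fire for a MIDI note, or None if unmapped."""
    return TRIGGERS.get(note)
```

In a live setup, each dispatched trigger name would be wired to the corresponding relay or valve driver.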
EyeCon is an application designed to capture a real-time video image and convert it into MIDI information. This MIDI information controls an electromagnet that strikes the bowl or gong with a soft mallet; the same principle, using a valve instead, controls the water drop.
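The underlying principle of converting video into MIDI can be sketched as frame differencing: when the amount of motion between two frames crosses a threshold, a note-on message is emitted. This is a minimal sketch of the idea, not EyeCon's actual implementation; the threshold, note number, and velocity scaling are assumptions.

```python
def motion_amount(prev_frame, frame):
    """Mean absolute pixel difference between two grayscale frames
    (given here as flat lists of integer pixel values)."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, frame)]
    return sum(diffs) / len(diffs)

def frame_to_midi(prev_frame, frame, threshold=10, note=60):
    """Emit a (status, note, velocity) note-on tuple when motion crosses
    the threshold, with velocity scaled by the motion amount (assumed
    mapping). Returns None when there is too little motion."""
    m = motion_amount(prev_frame, frame)
    if m < threshold:
        return None
    velocity = min(127, int(m))
    return (0x90, note, velocity)  # 0x90 = MIDI note-on, channel 1
```

A note-on produced this way would then drive the electromagnet or valve via a MIDI-controlled relay.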
A gesture can trigger repetitive patterns, sequences, or algorithmically generated complex textures. The relationship between gesture and interaction can be changed flexibly over the course of a piece, so the perception of interactivity becomes part of the musical context. Gestures can also be mapped clearly onto acoustic properties: slow body movement, for example, can correspond to a lower dynamic level, softer articulation, or a slower tempo.
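The mapping from movement speed to acoustic properties can be sketched as a simple linear scaling. The parameter ranges below are illustrative assumptions chosen only to show the slow-movement-to-soft-and-slow correspondence described above.

```python
def gesture_to_sound(speed: float) -> dict:
    """Map a normalized gesture speed (0.0 = very slow, 1.0 = very fast)
    to musical parameters: slow movement yields lower dynamics and a
    slower tempo. The ranges are assumed, not taken from the piece."""
    speed = max(0.0, min(1.0, speed))  # clamp to [0, 1]
    return {
        "velocity": int(20 + speed * 100),  # MIDI velocity 20..120
        "tempo_bpm": 40 + speed * 80,       # tempo 40..120 BPM
    }
```

A continuous controller in a real patch would update these parameters on every tracking frame rather than per call.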