A New Adaptive Sensor Interface for Composing and Performing Music in Real Time
G. Costantini
2007
Abstract
In this paper, we present an innovative gestural sensor interface that allows an electronic music composer to plan and conduct the musical expressivity of a performer. By musical expressivity we mean all the execution techniques and modalities that a performer must follow in order to satisfy common musical aesthetics, as well as the desiderata of the composer. The proposed sensor interface transforms physical parameters into sound synthesis parameters. It is composed of a gestural transducer, which measures motion acceleration and angular velocity, and a mapping module, which transforms a few measured physical parameters into many specific sound synthesis parameters. In this work, we focus our attention on mapping strategies based on neural networks.
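The core idea of the mapping module, as the abstract describes it, is a many-to-many expansion: a few sensor measurements (acceleration, angular velocity) drive many synthesis parameters through a trained neural network. The sketch below illustrates this with a minimal feed-forward network in NumPy; the layer sizes, random weights, and parameter ranges are hypothetical stand-ins, not the network actually used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 6 sensor inputs (3-axis acceleration +
# 3-axis angular velocity) expanded to 16 synthesis parameters.
N_IN, N_HIDDEN, N_OUT = 6, 12, 16

# Random weights stand in for a trained network; in practice these
# would be learned from gesture/sound-parameter example pairs.
W1 = rng.normal(0.0, 0.5, (N_HIDDEN, N_IN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.5, (N_OUT, N_HIDDEN))
b2 = np.zeros(N_OUT)

def map_gesture(sensors: np.ndarray) -> np.ndarray:
    """One feed-forward pass: sensor frame -> synthesis parameters in [0, 1]."""
    h = np.tanh(W1 @ sensors + b1)            # hidden layer, tanh nonlinearity
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))  # sigmoid keeps outputs in [0, 1]

# One sensor frame: accelerometer (m/s^2) followed by gyroscope (rad/s).
frame = np.array([0.1, -0.3, 9.8, 0.02, 0.0, -0.5])
params = map_gesture(frame)
print(params.shape)  # (16,)
```

Each of the 16 outputs could then be routed to a synthesis control (e.g. filter cutoff, envelope times, vibrato depth) in real time, one network evaluation per sensor frame.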