Music conducting is the art of directing a musical ensemble with hand gestures to personalize and shape a musical performance. Successfully performing a piece demands intense training and coordination between the conductor and the whole orchestra, yet preparing a practice session with a live ensemble is an expensive and time-consuming task. Accordingly, there is a need for alternative ways to provide adequate training to conductors at all skill levels; virtual and augmented reality technology holds promise for this application. The goal of this research is to study the mechanics of music conducting and to develop a system capable of closely simulating the conducting experience. After extensive discussions with professional and nonprofessional conductors, as well as extensive study of music conducting materials, we identify several key features of conducting. We develop a set of lightweight algorithms that exploit these features to enable tempo control, volume adjustment, and instrument emphasis, which are core components of conducting. Using position/orientation sensors and data gloves as the interface for human-computer interaction, we build a functional version of the system, iConduct.
iConduct: Music Control in the Interactive Conducting System
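The abstract names tempo control as one of the core conducting components the system supports. As a minimal sketch of how such a component could work, the hypothetical helper below (not taken from the paper; the function name, smoothing window, and beat-timestamp input are all assumptions) estimates tempo in beats per minute from the timestamps of detected beat gestures by averaging recent inter-beat intervals:

```python
from statistics import mean

def estimate_tempo(beat_times, window=4):
    """Estimate tempo (BPM) from beat-gesture timestamps in seconds.

    Hypothetical illustration, not the paper's algorithm: averages the
    last `window` inter-beat intervals to smooth out gesture jitter.
    Returns None until at least two beats have been observed.
    """
    if len(beat_times) < 2:
        return None
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    recent = intervals[-window:]
    return 60.0 / mean(recent)

# Beats arriving every 0.5 s correspond to 120 BPM.
print(estimate_tempo([0.0, 0.5, 1.0, 1.5, 2.0]))  # 120.0
```

In a live setting the smoothing window trades responsiveness for stability: a small window lets the ensemble follow sudden tempo changes, while a larger one suppresses noise from imprecise gestures.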