Visualizing Air Drums: Analysis of Motion and Vocalization Data Related to Playing Imaginary Drums
Air drums, or imaginary drums, are commonly played as a form of participation in musical experiences. The gestures involved in playing air drums can be captured with accelerometers and then mapped to sound control responses. This mapping typically relies on a peak-picking procedure that maps local maxima or minima to sound triggers. In this work, we analyzed accelerometer and audio data comprising the motion of subjects playing air drums while vocalizing their expected results. Our qualitative analysis revealed that each subject produced a different relationship between motion and vocalization. This suggests that a fixed peak-picking procedure can be unreliable when designing accelerometer-controlled drum instruments, and that user-specific personalization can be an important feature of this type of virtual instrument. This poses a new challenge for the field: quickly personalizing virtual drum interactions. We made our dataset available to foster future work on this subject.
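The fixed peak-picking procedure mentioned above can be sketched as follows: a local maximum in the acceleration signal that exceeds a fixed threshold is mapped to a drum trigger. This is a minimal illustrative sketch, not the paper's implementation; the function name and threshold value are assumptions.

```python
def pick_peaks(signal, threshold=1.5):
    """Return indices of local maxima above `threshold`.

    A sample is a trigger if it exceeds the fixed threshold and is
    greater than its left neighbor and at least its right neighbor.
    """
    triggers = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            triggers.append(i)
    return triggers

# Example: a synthetic acceleration-magnitude trace with two strike-like peaks.
accel = [0.1, 0.3, 2.0, 0.4, 0.2, 0.5, 1.8, 0.3, 0.1]
print(pick_peaks(accel))  # -> [2, 6]
```

Because the threshold and the peak shape are fixed, such a detector implicitly assumes all players strike with similar intensity and timing, which is exactly the assumption the analysis above calls into question.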