Ha Dou Ken Music: Mapping a joystick as a musical controller
Abstract
The structure of a digital musical instrument (DMI) can be split into three parts: interface, mapping, and synthesizer. In DMIs whose sound synthesis is performed in software, the interaction interface captures the performer's gestures, which can be mapped through various techniques to different sounds. In this work, we use video game controllers as an interface for musical interaction. Because of their strong presence in popular culture and their ease of access, even people who are not in the habit of playing electronic games have likely interacted with this kind of interface at least once. Thus, gestures such as pressing a sequence of buttons, pressing several buttons simultaneously, or sliding the fingers across the controller can be mapped to musical creation. This work aims to develop a strategy in which several gestures captured by the interface can influence one or several parameters of the sound synthesis, a mapping known as many-to-many. Button combinations used to perform actions common in fighting games, such as Street Fighter, were mapped to the synthesizer to create music. Experiments show that this mapping can influence the musical expression of a DMI, bringing it closer to an acoustic instrument.
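To illustrate the many-to-many idea described above, the sketch below shows one possible way to recognize a fighting-game button sequence and drive several synthesis parameters at once. It is a minimal sketch in Python; the names (ComboMapper, SynthParams, the DOWN/FORWARD/PUNCH symbols) and the specific parameter changes are illustrative assumptions, not the mapping implementation used in the paper.

import time
from dataclasses import dataclass, field

# Hypothetical symbols that a gamepad driver would deliver as input events.
DOWN, DOWN_FORWARD, FORWARD, PUNCH = "down", "down-forward", "forward", "punch"

# Classic quarter-circle-forward plus punch ("Hadouken") sequence.
HADOUKEN = [DOWN, DOWN_FORWARD, FORWARD, PUNCH]
MAX_GAP = 0.25  # seconds allowed between consecutive inputs of a combo

@dataclass
class SynthParams:
    """A few synthesis parameters driven together (the many-to-many target)."""
    pitch_hz: float = 220.0
    cutoff_hz: float = 800.0
    amplitude: float = 0.5

@dataclass
class ComboMapper:
    """Recognizes a button sequence and maps it onto several synth parameters."""
    combo: list = field(default_factory=lambda: list(HADOUKEN))
    buffer: list = field(default_factory=list)

    def feed(self, event: str, timestamp: float, params: SynthParams) -> bool:
        # Discard stale events so slow, unrelated presses do not form a combo.
        window = MAX_GAP * len(self.combo)
        self.buffer = [(e, t) for e, t in self.buffer if timestamp - t <= window]
        self.buffer.append((event, timestamp))
        recent = [e for e, _ in self.buffer[-len(self.combo):]]
        if recent == self.combo:
            # One gesture influences several parameters at once (one-to-many),
            # while each parameter may also respond to other gestures (many-to-one).
            params.pitch_hz *= 2.0      # jump up one octave
            params.cutoff_hz = 4000.0   # open the filter for a brighter attack
            params.amplitude = 1.0      # accent the note
            self.buffer.clear()
            return True
        return False

if __name__ == "__main__":
    params = SynthParams()
    mapper = ComboMapper()
    t0 = time.time()
    for i, ev in enumerate(HADOUKEN):
        fired = mapper.feed(ev, t0 + 0.1 * i, params)
    print(fired, params)

In this reading, a single recognized gesture changes pitch, filter cutoff, and amplitude together, while individual presses could still be routed to other parameters, which is what makes the overall mapping many-to-many rather than one-to-one.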