Virtual Reality Dance Tracks from Skeletal Animations

  • Leonardo Mastra, PUC-Rio
  • Luiz J. S. Silva, UNISINOS
  • Alberto Raposo, PUC-Rio
  • Vinicius da Silva, PUC-Rio

Abstract


This paper presents a novel approach for automatically generating Virtual Reality (VR) dance tracks by translating the movement of animated 3D models derived directly from music. Our method builds on recent advances in the automated synthesis of animated 3D models from music, using this data to bridge the gap to fully automatic VR dance track creation. We introduce a plugin for the Unity game engine that converts dance animations from the ChoreoMaster dataset into dance tracks for the VR game Synth Riders. The approach aims to guide dance while supporting creative interpretation of movement and minimizing movement restriction. The paper offers a comprehensive review of the current state of VR dance experiences and music-to-animation synthesis, together with an in-depth explanation of our plugin and its key components. Our approach has the potential to enhance the creation of dance experiences in VR, reducing resource dependency and increasing versatility across dance styles. This research is a step towards fully automated VR dance track creation, with potential impact on music, dance, and fitness in VR.
Keywords: Virtual Reality, dance, animation
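To make the general idea concrete, the sketch below illustrates one simplified way such a conversion could work: hand-joint positions sampled from a skeletal dance animation at beat intervals are quantized into note events for a rhythm-game style track. This is a minimal, hypothetical illustration; the function and type names, the lane/row grid, the fixed tempo, and the output structure are assumptions for exposition and do not describe the authors' Unity plugin or the Synth Riders track format.

```python
# Hypothetical sketch: mapping beat-aligned hand-joint positions from a
# skeletal animation to quantized note events for a VR rhythm-game track.
# All names and the track layout are illustrative assumptions.
import json
from dataclasses import dataclass, asdict


@dataclass
class NoteEvent:
    time: float   # seconds from the start of the song
    hand: str     # "left" or "right"
    lane: int     # horizontal lane index (0 = far left)
    height: int   # vertical row index (0 = lowest)


def joints_to_notes(hand_positions, bpm, lanes=4, rows=3):
    """Map per-beat hand positions (x, y normalized to a 0..1 play area)
    to quantized note events, one note per hand per beat."""
    beat_duration = 60.0 / bpm
    notes = []
    for beat_index, frame in enumerate(hand_positions):
        t = beat_index * beat_duration
        for hand in ("left", "right"):
            x, y = frame[hand]
            lane = min(int(x * lanes), lanes - 1)
            height = min(int(y * rows), rows - 1)
            notes.append(NoteEvent(t, hand, lane, height))
    return notes


if __name__ == "__main__":
    # Two beats of toy data: normalized hand positions per beat.
    sampled = [
        {"left": (0.10, 0.40), "right": (0.85, 0.70)},
        {"left": (0.30, 0.90), "right": (0.60, 0.20)},
    ]
    track = joints_to_notes(sampled, bpm=120)
    print(json.dumps([asdict(n) for n in track], indent=2))
```

A real converter would also have to handle tempo changes, note density, smoothing of noisy joint trajectories, and the target game's own track file format.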

References

Omid Alemi, Jules Françoise, and Philippe Pasquier. 2017. GrooveNet: Real-time music-driven dance movement generation using artificial neural networks. networks 8, 17 (2017), 26.

Dimitrios S Alexiadis, Philip Kelly, Petros Daras, Noel E O’Connor, Tamy Boubekeur, and Maher Ben Moussa. 2011. Evaluating a dancer’s performance using kinect-based skeleton tracking. In Proceedings of the 19th ACM international conference on Multimedia. 659–662.

Andreas Aristidou, Nefeli Andreou, Loukas Charalambous, Anastasios Yiannakidis, Yiorgos Chrysanthou, V. Hulusic, and A. Chalmers. 2021. Virtual Dance Museum: The case of Greek/Cypriot folk dancing. In Proceedings of the Eurographics Workshop on Graphics and Cultural Heritage (GCH 2021). Aire-la-Ville, Switzerland.

Rachele Bellini, Yanir Kleiman, and Daniel Cohen-Or. 2018. Dance to the beat: Synchronizing motion to audio. In Computational Visual Media. Springer, 197–208.

Jacky CP Chan, Howard Leung, Jeff KT Tang, and Taku Komura. 2010. A virtual reality dance training system using motion capture technology. IEEE Transactions on Learning Technologies 4, 2 (2010), 187–195.

Kang Chen, Zhipeng Tan, Jin Lei, Song-Hai Zhang, Yuan-Chen Guo, Weidong Zhang, and Shi-Min Hu. 2021. ChoreoMaster: Choreography-oriented music-driven dance synthesis. ACM Transactions on Graphics (TOG) 40, 4 (2021), 1–13.

Augusto Dias Pereira Dos Santos, Kalina Yacef, and Roberto Martinez-Maldonado. 2017. Let’s dance: how to build a user model for dance students using wearable technology. In Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization. 183–191.

Katerina El Raheb, Akrivi Katifori, and Yannis Ioannidis. 2016. HCI challenges in Dance Education. EAI Endorsed Transactions on Ambient Systems 3, 9 (2016).

Rukun Fan, Songhua Xu, and Weidong Geng. 2011. Example-based automatic music-driven conventional dance motion synthesis. IEEE Transactions on Visualization and Computer Graphics 18, 3 (2011), 501–515.

Joao P Ferreira, Thiago M Coutinho, Thiago L Gomes, José F Neto, Rafael Azevedo, Renato Martins, and Erickson R Nascimento. 2021. Learning to dance: A graph convolutional adversarial network to generate realistic dance motions from audio. Computers & Graphics 94 (2021), 11–21.

Diego Felipe Paez Granados, Jun Kinugawa, Yasuhisa Hirata, and Kazuhiro Kosuge. 2016. Guiding human motions in physical human-robot interaction through COM motion control of a dance teaching robot. In 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids). IEEE, 279–285.

John K Haas. 2014. A history of the Unity game engine. (2014).

Khoi Hoang Dinh, Ozgur S Oguz, Mariam Elsayed, and Dirk Wollherr. 2019. Adaptation and transfer of robot motion policies for close proximity human-robot interaction. Frontiers in Robotics and AI 6 (2019), 69.

Daniel Holden, Jun Saito, and Taku Komura. 2016. A deep learning framework for character motion synthesis and editing. ACM Transactions on Graphics (TOG) 35, 4 (2016), 1–11.

Javid Iqbal and Manjit Singh Sidhu. 2022. Acceptance of dance training system based on augmented reality and technology acceptance model (TAM). Virtual Reality 26, 1 (2022), 33–54.

Manish Joshi and Sangeeta Chakrabarty. 2021. An extensive review of computational dance automation techniques and applications. Proceedings of the Royal Society A 477, 2251 (2021), 20210071.

Tae-hoon Kim, Sang Il Park, and Sung Yong Shin. 2003. Rhythmic-motion synthesis based on motion-beat analysis. ACM Transactions on Graphics (TOG) 22, 3 (2003), 392–401.

Matthew Kyan, Guoyu Sun, Haiyan Li, Ling Zhong, Paisarn Muneesawang, Nan Dong, Bruce Elder, and Ling Guan. 2015. An approach to ballet dance training through MS Kinect and visualization in a CAVE virtual reality environment. ACM Transactions on Intelligent Systems and Technology (TIST) 6, 2 (2015), 1–37.

Jehee Lee, Jinxiang Chai, Paul SA Reitsma, Jessica K Hodgins, and Nancy S Pollard. 2002. Interactive control of avatars animated with human motion data. In Proceedings of the 29th annual conference on Computer graphics and interactive techniques. 491–500.

Ruilong Li, Shan Yang, David A Ross, and Angjoo Kanazawa. 2021. AI Choreographer: Music conditioned 3D dance generation with AIST++. In Proceedings of the IEEE/CVF International Conference on Computer Vision. 13401–13412.

Dylan P Losey, Andrea Bajcsy, Marcia K O’Malley, and Anca D Dragan. 2022. Physical interaction as communication: Learning robot objectives online from human corrections. The International Journal of Robotics Research 41, 1 (2022), 20–44.

Adriano Manfrè, Ignazio Infantino, Filippo Vella, and Salvatore Gaglio. 2016. An automatic system for humanoid dance creation. Biologically Inspired Cognitive Architectures 15 (2016), 1–9.

Christos Mousas. 2018. Performance-driven dance motion control of a virtual partner character. In 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 57–64.

Ferda Ofli, Yasemin Demir, Yücel Yemez, Engin Erzin, A Murat Tekalp, Koray Balcı, İdil Kızoğlu, Lale Akarun, Cristian Canton-Ferrer, Joëlle Tilmanne, et al. 2008. An audio-driven dancing avatar. Journal on Multimodal User Interfaces 2 (2008), 93–103.

Ferda Ofli, Engin Erzin, Yücel Yemez, and A Murat Tekalp. 2011. Learn2dance: Learning statistical music-to-dance mappings for choreography synthesis. IEEE Transactions on Multimedia 14, 3 (2011), 747–759.

Roosa Piitulainen, Perttu Hämäläinen, and Elisa D Mekler. 2022. Vibing together: Dance experiences in social virtual reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–18.

Xuanchi Ren, Haoran Li, Zijian Huang, and Qifeng Chen. 2020. Self-supervised dance video synthesis conditioned on music. In Proceedings of the 28th ACM International Conference on Multimedia. 46–54.

Adriana Schulz, Wojciech Matusik, and Luiz Velho. 2013. Choreographics: An authoring tool for dance shows. Journal of Graphics Tools 17, 4 (2013), 159–176.

Simon Senecal, Niels A Nijdam, Andreas Aristidou, and Nadia Magnenat-Thalmann. 2020. Salsa dance learning evaluation and motion analysis in gamified virtual reality environment. Multimedia Tools and Applications 79 (2020), 24621–24643.

Taoran Tang, Jia Jia, and Hanyang Mao. 2018. Dance with Melody: An LSTM-autoencoder Approach to Music-oriented Dance Synthesis. In Proceedings of the 26th ACM international conference on Multimedia. 1598–1606.

Jonathan Tseng, Rodrigo Castellon, and Karen Liu. 2023. EDGE: Editable dance generation from music. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 448–458.

Sijie Yan, Zhizhong Li, Yuanjun Xiong, Huahan Yan, and Dahua Lin. 2019. Convolutional sequence generation for skeleton-based action synthesis. In Proceedings of the IEEE/CVF International Conference on Computer Vision. 4394–4402.

Zijie Ye, Haozhe Wu, Jia Jia, Yaohua Bu, Wei Chen, Fanbo Meng, and Yanfeng Wang. 2020. ChoreoNet: Towards music to dance synthesis with choreographic action unit. In Proceedings of the 28th ACM International Conference on Multimedia. 744–752.

Yong Zhao. 2022. Teaching traditional Yao dance in the digital environment: Forms of managing subcultural forms of cultural capital in the practice of local creative industries. Technology in Society 69 (2022), 101943.
Published
November 6, 2023
How to Cite

MASTRA, Leonardo; SILVA, Luiz J. S.; RAPOSO, Alberto; DA SILVA, Vinicius. Virtual Reality Dance Tracks from Skeletal Animations. In: SIMPÓSIO DE REALIDADE VIRTUAL E AUMENTADA (SVR), 25., 2023, Rio Grande/RS. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 248–253.