Immersive Live Concert: A Multi-Sensory Experience Based on Real-Time Lyrics Detection from Spatial Audio Data

  • Anderson Augusto Simiscuka, Dublin City University
  • Gianluca Fadda, Università degli Studi di Cagliari
  • Vlad Popescu, Universitatea Transilvania din Braşov
  • Maurizio Murroni, Università degli Studi di Cagliari
  • Gabriel-Miro Muntean, Dublin City University

Abstract


This paper presents a multi-sensory solution aimed at enhancing the immersive experience of remote audiences of a live blues concert. Using the XRBLUES pilot of the HEAT project as a case study, the approach synchronizes olfactory and visual stimuli for remote viewers. The solution relies on audio-based detection to identify key moments in the concert's lyrics that trigger specific sensory effects, such as ocean-air scents and wind effects. These sensory cues are synchronized in real time with the live concert and delivered via scent dispensers to remote users wearing XR headsets. Real-time communication is handled by an MQTT-based system to ensure minimal latency. The paper assesses the effectiveness of this audio-based cue detection approach by evaluating Automatic Speech Recognition (ASR) models implemented with the Vosk Python library. This work demonstrates how scent-enhanced XR technologies can be integrated into live music events, offering a more engaging and immersive experience for remote concertgoers.
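The pipeline outlined above (ASR-based lyric spotting feeding scent triggers over MQTT) can be conveyed with a minimal Python sketch. This is not the paper's implementation: the broker host, topic name, keyword-to-scent mapping, and model path below are illustrative assumptions, and the sketch presumes the vosk, pyaudio, and paho-mqtt (2.x) packages plus a locally downloaded Vosk model.

```python
# Minimal sketch (not the paper's implementation): spot lyric keywords in live
# audio with Vosk and publish scent-trigger messages over MQTT.
# Assumed/illustrative: broker host, topic, keyword-to-scent map, model path.
import json

import paho.mqtt.client as mqtt
import pyaudio
from vosk import Model, KaldiRecognizer

SAMPLE_RATE = 16000
KEYWORD_TO_SCENT = {"ocean": "ocean_air", "wind": "wind_effect"}  # hypothetical mapping
MQTT_TOPIC = "xrblues/scent"                                      # hypothetical topic

# Offline ASR: Vosk model directory downloaded beforehand
model = Model("vosk-model-small-en-us-0.15")
recognizer = KaldiRecognizer(model, SAMPLE_RATE)

# MQTT client used to notify remote scent dispensers (paho-mqtt 2.x API)
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect("broker.example.org", 1883)
client.loop_start()

# Capture the concert audio feed from the default input device
audio = pyaudio.PyAudio()
stream = audio.open(format=pyaudio.paInt16, channels=1, rate=SAMPLE_RATE,
                    input=True, frames_per_buffer=4000)

try:
    while True:
        data = stream.read(4000, exception_on_overflow=False)
        if recognizer.AcceptWaveform(data):
            text = json.loads(recognizer.Result()).get("text", "")
            for keyword, scent in KEYWORD_TO_SCENT.items():
                if keyword in text:
                    # Publish the cue so dispensers at remote sites react in real time
                    client.publish(MQTT_TOPIC, json.dumps({"scent": scent}), qos=1)
finally:
    stream.stop_stream()
    stream.close()
    audio.terminate()
    client.loop_stop()
    client.disconnect()
```

In the system described by the paper, detection operates on the spatial audio of the live concert and the MQTT layer bridges the venue and the remote XR clients; the sketch only illustrates the keyword-to-trigger flow under the stated assumptions.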

Keywords: Extended Reality, Multi-Sensory Experiences, Audio Detection

Published
2025-06-03
SIMISCUKA, Anderson Augusto; FADDA, Gianluca; POPESCU, Vlad; MURRONI, Maurizio; MUNTEAN, Gabriel-Miro. Immersive Live Concert: A Multi-Sensory Experience Based on Real-Time Lyrics Detection from Spatial Audio Data. In: ACM INTERNATIONAL CONFERENCE ON INTERACTIVE MEDIA EXPERIENCES WORKSHOPS (IMXW), 25., 2025, Niterói/RJ. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 7-11. DOI: https://doi.org/10.5753/imxw.2025.1139.