Eyes of Fear: Leveraging Emotion Recognition for Virtual Reality Experience

Abstract

Given the growing interest in immersive gaming experiences, particularly within the horror genre, there is a rising demand for sophisticated and engaging game mechanics. However, creating a truly immersive horror experience requires a balance between fear-inducing elements and player engagement. In this work, we present Eyes of Fear, a horror game that explores human emotions and eye movement as interactive mechanisms in virtual reality. We built our game prototype using a state-of-the-art emotion recognition algorithm and the Tobii Eye Tracker, and evaluated their effectiveness through player feedback. Our findings show that emotion-sensitive enemy behavior heightens unpredictability and elicits intense emotions while keeping players entertained.
Keywords: Virtual Reality, Game, Horror, Emotion Recognition, Eye Tracking
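
The abstract does not detail the implementation, but the core mechanic it describes, enemy behavior that reacts to the player's detected emotion and gaze, can be pictured with a minimal sketch. The snippet below is purely illustrative and not taken from the paper: it assumes an emotion label produced by some facial expression model and a gaze flag produced by some eye tracker integration, and all names, labels, and behavior states are hypothetical.

# Illustrative sketch only: maps a detected emotion and gaze state to an enemy
# behavior tag. The emotion label and gaze flag are assumed to come from an
# external emotion recognition model and eye tracker; none of these names or
# rules appear in the paper.
from dataclasses import dataclass

@dataclass
class PlayerState:
    emotion: str            # e.g. "fear", "neutral" (hypothetical labels)
    looking_at_enemy: bool  # True if the gaze ray currently hits the enemy

def choose_enemy_behavior(state: PlayerState) -> str:
    """Return a behavior tag for the enemy AI based on the player's state."""
    if state.emotion == "fear" and not state.looking_at_enemy:
        return "stalk"    # press the advantage while the player looks away
    if state.emotion == "fear" and state.looking_at_enemy:
        return "retreat"  # back off to keep the scare unpredictable
    if state.emotion == "neutral":
        return "ambush"   # build tension when the player is calm
    return "idle"

if __name__ == "__main__":
    print(choose_enemy_behavior(PlayerState(emotion="fear", looking_at_enemy=False)))

In a real engine, such a mapping would typically run once per update tick, feeding the returned tag into the enemy's state machine; the point of the sketch is only that emotion and gaze signals can be combined into simple, readable rules.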

Published
30/09/2024
MARINHO, Isabela et al. Eyes of Fear: Leveraging Emotion Recognition for Virtual Reality Experience. In: SIMPÓSIO DE REALIDADE VIRTUAL E AUMENTADA (SVR), 26., 2024, Manaus/AM. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2024. p. 90-96.