Engagement in Digital Educational Games: Analysis with a Hybrid Detection Model

Abstract


This study investigated how the challenges posed by a Digital Educational Game (DEG) and students' ability to overcome them influence their affective and behavioral states, thereby shaping engagement and learning. A hybrid model was developed that integrates automatically collected emotional and behavioral data, such as facial emotions and eye and head movements. These data were combined to assess students' engagement while they interacted with the DEG. In addition, students' self-reports, collected through pre- and post-questionnaires, were used to validate the results of the automatic model and to support a qualitative analysis of students' perceptions of the challenges they faced and the learning the game provided. Ten students participated in the study and remained engaged most of the time. Disengagement was observed mainly when students had difficulty controlling the game. Based on the results, a representation of the dynamics of students' emotions during gameplay was proposed.
Keywords: Digital Educational Game (DEG), Engagement, Affective States, Facial Emotion Detection
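
To illustrate how a hybrid model of this kind might fuse the automatically collected signals, the minimal sketch below combines per-frame facial-emotion valence, a gaze-on-screen flag, and head-pose deviation into a single engagement label. The field names, weights, and thresholds are illustrative assumptions for exposition only and do not reproduce the authors' model.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical per-frame observation; field names and value ranges are
# illustrative assumptions, not the instrumentation used in the study.
@dataclass
class FrameObservation:
    emotion_valence: float   # -1.0 (negative) .. 1.0 (positive), from a facial-emotion detector
    gaze_on_screen: bool     # True if the estimated gaze falls inside the game window
    head_yaw_deg: float      # head rotation away from the screen, in degrees

def frame_engagement_score(obs: FrameObservation) -> float:
    """Combine the three channels into a 0..1 engagement score (heuristic weights)."""
    emotion_term = (obs.emotion_valence + 1.0) / 2.0           # rescale valence to 0..1
    gaze_term = 1.0 if obs.gaze_on_screen else 0.0
    head_term = max(0.0, 1.0 - abs(obs.head_yaw_deg) / 45.0)   # drops to 0 past 45 degrees
    return 0.4 * emotion_term + 0.4 * gaze_term + 0.2 * head_term

def label_window(frames: List[FrameObservation], threshold: float = 0.5) -> str:
    """Label a gameplay window as engaged/disengaged from the mean frame score."""
    mean_score = sum(frame_engagement_score(f) for f in frames) / len(frames)
    return "engaged" if mean_score >= threshold else "disengaged"

if __name__ == "__main__":
    window = [
        FrameObservation(emotion_valence=0.3, gaze_on_screen=True, head_yaw_deg=5.0),
        FrameObservation(emotion_valence=-0.2, gaze_on_screen=True, head_yaw_deg=12.0),
        FrameObservation(emotion_valence=-0.6, gaze_on_screen=False, head_yaw_deg=50.0),
    ]
    print(label_window(window))
```

In practice, the weighting and thresholding would be tuned or learned against the self-report data described in the abstract rather than fixed by hand.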

Published: 2024-11-04

NASCIMENTO JUNIOR, Nelson; BRAGA, Juliana Cristina; JAQUES, Patricia A.; GOIS, João Paulo. Engagement in Digital Educational Games: Analysis with a Hybrid Detection Model. In: BRAZILIAN SYMPOSIUM ON COMPUTERS IN EDUCATION (SBIE), 35., 2024, Rio de Janeiro/RJ. Proceedings [...]. Porto Alegre: Sociedade Brasileira de Computação, 2024. p. 896-909. DOI: https://doi.org/10.5753/sbie.2024.242572.