Graphical User Interface for Adaptive Human-Robot Interaction Design in Educational Activities Creation
Abstract
Robots and interactive devices have been successfully applied in education to enhance students' cognitive experience and, consequently, their learning rate. However, there is a lack of social robots with adaptive skills that can be easily programmed. In this paper, a Graphical User Interface for creating, performing, and evaluating educational activities with autonomous robots is presented. The interface runs on a system that provides adaptation and personalization through open-source techniques for autonomous analysis and recognition of speech, focus deviation, and facial emotion. During activity execution, the adaptive algorithm detects the student's body signals and verbal responses and adapts the content toward harder or easier questions. After an activity has been created and run, the evaluation and information from all sessions can be accessed for visual analysis, along with the student's preferences throughout the interaction.
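The abstract describes an adaptive loop that raises or lowers question difficulty from the student's verbal responses and body signals. The paper does not give the rule itself, so the following is only a minimal illustrative sketch of such a policy; all names (`Observation`, `adapt_level`, the level bounds) are hypothetical, not taken from the system described.

```python
from dataclasses import dataclass

MIN_LEVEL, MAX_LEVEL = 1, 5  # assumed difficulty range, for illustration only

@dataclass
class Observation:
    """One interaction turn, as the recognition modules might report it."""
    answer_correct: bool    # from speech recognition of the verbal response
    focused: bool           # from focus-deviation detection
    emotion_positive: bool  # from facial-emotion recognition

def adapt_level(level: int, obs: Observation) -> int:
    """Return the next difficulty level after one turn (hypothetical rule)."""
    if obs.answer_correct and obs.focused:
        level += 1   # engaged and correct: move to harder questions
    elif not obs.answer_correct:
        level -= 1   # wrong answer: move to easier questions
    elif not obs.focused and not obs.emotion_positive:
        level -= 1   # distracted and negative affect: back off
    return max(MIN_LEVEL, min(MAX_LEVEL, level))
```

The sketch only shows the shape of the decision: multimodal signals in, a bounded difficulty level out; the actual system may weight or combine these signals differently.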
Keywords:
educational robotics, graphical interface, adaptive human-robot interaction, educational activities, personalization
References
Belpaeme, T., Vogt, P., van den Berghe, R., Bergmann, K., Göksun, T., de Haas, M., Kanero, J., Kennedy, J., Küntay, A. C., Oudgenoeg-Paz, O., et al. (2017). Guidelines for designing social robots as second language tutors. International Journal of Social Robotics, 3:325–341.
Benitti, F. B. V. (2012). Exploring the educational potential of robotics in schools: A systematic review. Computers & Education, 58(3):978–988.
Clabaugh, C., Tsiakas, K., and Mataric, M. (2017). Predicting preschool mathematics performance of children with a socially assistive robot tutor. In Proceedings of the Synergies between Learning and Interaction Workshop @ IROS, Vancouver, BC, Canada, pages 24–28.
Cortellessa, G., Fracasso, F., Sorrentino, A., Orlandini, A., Bernardi, G., Coraci, L., De Benedictis, R., and Cesta, A. (2018). Robin, a telepresence robot to support older users monitoring and social inclusion: Development and evaluation. Telemedicine and e-Health, 24(2):145–154.
Johal, W., Castellano, G., Tanaka, F., and Okita, S. (2018). Robots for learning. International Journal of Social Robotics, pages 293–294.
Kafai, Y., Sawyer, C. I. R., Papert, S., Harel, S. C. I. I., Papert, S., Duval, E., et al. (2017). Technology and theories of learning. Technology Enhanced Learning: Research Themes, 17(1):169.
Kessous, L., Castellano, G., and Caridakis, G. (2010). Multimodal emotion recognition in speech-based interaction using facial expression, body gesture and acoustic analysis. Journal on Multimodal User Interfaces, 3(1-2):33–48.
Lucas, G. M., Boberg, J., Traum, D., Artstein, R., Gratch, J., Gainer, A., Johnson, E., Leuski, A., and Nakano, M. (2018). Getting to know each other: The role of social dialogue in recovery from errors in social robots. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, pages 344–351. ACM.
Murray, T. (1999). Authoring intelligent tutoring systems: An analysis of the state of the art. International Journal of Artificial Intelligence in Education (IJAIED), 10:98–129.
Platz, M., Krieger, M., Niehaus, E., and Winter, K. (2018). Suggestion of an E-proof Environment in Mathematics Education. Springer International Publishing, Cham.
Spaulding, S., Chen, H., Ali, S., Kulinski, M., and Breazeal, C. (2018). A social robot system for modeling children's word pronunciation: Socially interactive agents track. In Proceedings of the 17th International Conference on Autonomous Agents and Multiagent Systems, pages 1658–1666. International Foundation for Autonomous Agents and Multiagent Systems.
Tozadore, D., Pinto, A., Valentini, M. C. J., Zavarizz, R., Rodrigues, V., Vedrameto, F., and Romero, R. (2017). Project r-castle: Robotic-cognitive adaptive system for teaching and learning. Accepted in IEEE Trans. on Cognitive and Developmental Systems.
Published
29/10/2018
How to Cite
TOZADORE, Daniel; ROMERO, Roseli. Graphical User Interface for Adaptive Human-Robot Interaction Design in Educational Activities Creation. In: SIMPÓSIO BRASILEIRO DE INFORMÁTICA NA EDUCAÇÃO (SBIE), 29., 2018, Fortaleza/CE. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2018. p. 605-614. DOI: https://doi.org/10.5753/cbie.sbie.2018.605.
