Graphical User Interface for educational content programming with social robots activities and how teachers may perceive it

Authors

  • Daniel Carnieto Tozadore Institute of Mathematics and Computer Sciences, University of São Paulo
  • Roseli Aparecida Francelin Romero Institute of Mathematics and Computer Sciences, University of São Paulo

DOI:

https://doi.org/10.5753/rbie.2020.28.0.191

Keywords:

Educational Social Robots, Human-Robot Interaction, Machine Learning in Education

Abstract

Interactive devices have been successfully applied in education over the last decades. The most widely used devices for such tasks are personal computers and tablets, owing to their affordability and popularity. Social robots are used far less, mainly because of their cost and the complexity of programming them. In this paper, a Graphical User Interface (GUI) is presented as a solution to work around the complexity of programming social robots. The GUI system controls an interactive robot that plays with students and adapts its behavior autonomously. During an activity, the adaptive algorithm detects the student's body signals and verbal responses and shifts the addressed content toward harder or easier questions. After an activity has been created and run, the evaluations and information from all sessions can be accessed for visual analysis, along with the students' preferences throughout the interaction. The proposal was presented to regular elementary school teachers, who answered a questionnaire about their perception of it. The analysis of their answers suggests that, in general, they somewhat recognized the system's potential and how it can support them in after-class exercises, although the interface requires some time to get fully used to.
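The adaptation described above — moving between harder and easier questions based on the student's answers and detected signals — can be illustrated with a minimal sketch. All names, thresholds, and the engagement scale below are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical sketch of difficulty adaptation: step up after a correct
# answer with high detected engagement, step down after a wrong answer.
# The difficulty levels, threshold, and engagement scale are assumptions.

DIFFICULTIES = ["easy", "medium", "hard"]

def next_difficulty(current: str, answer_correct: bool, engagement: float) -> str:
    """Return the next question difficulty.

    engagement: an assumed score in [0, 1] derived from body signals
    and verbal responses (e.g. face detection, speech confidence).
    """
    i = DIFFICULTIES.index(current)
    if answer_correct and engagement >= 0.5:
        i = min(i + 1, len(DIFFICULTIES) - 1)  # harder questions
    elif not answer_correct:
        i = max(i - 1, 0)                      # easier questions
    return DIFFICULTIES[i]
```

A loop of this shape would be driven by the robot's perception pipeline, with each session's decisions logged for the visual analysis the GUI offers.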




Published

2020-02-16

How to Cite

TOZADORE, D. C.; ROMERO, R. A. F. Graphical User Interface for educational content programming with social robots activities and how teachers may perceive it. Revista Brasileira de Informática na Educação, [S. l.], v. 28, p. 191–207, 2020. DOI: 10.5753/rbie.2020.28.0.191. Available at: https://sol.sbc.org.br/journals/index.php/rbie/article/view/3861. Accessed: 27 Feb. 2024.

Issue

Section

Awarded Articles