Enabling Face Tracking in an Affordable Social Robot Using a Computer Vision Camera

  • María Gaitán-Padilla UFES
  • Elizabeth Sánchez R. UFES
  • Dionatas Brito UFES
  • Maria José Pontes UFES
  • Ricardo Mello UFES
  • Carlos A. Cifuentes University of the West of England
  • Camilo A. R. Diaz UFES

Abstract

Social robotics has been explored as a tool to support social interaction in various applications of human well-being. In therapies for children with Autism Spectrum Disorder (ASD), social robots have been included as motivational tools to carry out activities within the therapeutic session. This work presents a contribution to the natural behaviours of the CASTOR robot, an open-source robot for assisted therapy of children with ASD. The proposed approach uses a computer vision camera and face recognition to improve the movements of the robot's neck and eyes. Tests were carried out in two phases: first validating the eye-contact tracking, and then the neck movement. The resulting improvements produced more natural and engaging gaze behaviour, with the system achieving an accuracy of 96.11% for the eyes and an RMSE of 3.62° in the neck tracking angle. Future work will evaluate user perception of the natural gaze system in a social interaction activity using standardized questionnaires.
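To illustrate the kind of pipeline the abstract describes, the sketch below shows face detection driving a neck pan angle. It is a minimal illustration, not the paper's implementation: it assumes an OpenCV Haar cascade as the face detector, a 60° horizontal field of view, and a hypothetical send_pan_angle() stub standing in for the CASTOR neck actuator interface.

```python
# Minimal sketch: camera-based face tracking mapped to a neck pan angle.
# Assumptions (not from the paper): OpenCV Haar cascade detector, 60-degree
# horizontal field of view, and a placeholder actuator function.
import cv2

HORIZONTAL_FOV_DEG = 60.0  # assumed camera field of view


def send_pan_angle(angle_deg: float) -> None:
    """Hypothetical actuator stub; replace with the robot's servo command."""
    print(f"neck pan -> {angle_deg:+.1f} deg")


detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # Track the largest detected face.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        face_cx = x + w / 2.0
        frame_cx = frame.shape[1] / 2.0
        # Map the normalized horizontal pixel offset to a pan angle.
        offset = (face_cx - frame_cx) / frame.shape[1]
        send_pan_angle(offset * HORIZONTAL_FOV_DEG)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
```

In a setup like this, the same offset-to-angle mapping could be reused for the eye joints with a smaller angular range, keeping gaze and neck motion consistent.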
Keywords: Computer vision, Tracking, Face recognition, Social robots, Robot vision systems, Education, Medical treatment, Cameras, Neck, Proposals, Face Tracking, Human-Robot Interaction, Sensor-based Control, Social behaviour
Published
13/10/2025
GAITÁN-PADILLA, María; SÁNCHEZ R., Elizabeth; BRITO, Dionatas; PONTES, Maria José; MELLO, Ricardo; CIFUENTES, Carlos A.; DIAZ, Camilo A. R. Enabling Face Tracking in an Affordable Social Robot Using a Computer Vision Camera. In: SIMPÓSIO BRASILEIRO DE ROBÓTICA E SIMPÓSIO LATINO AMERICANO DE ROBÓTICA (SBR/LARS), 17., 2025, Vitória/ES. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 243-248.