EyeTeractive: A Cross-Platform Eye-Tracking System for Low-Cost Assistive Interaction

Abstract


Context: People with mild motor disabilities still face difficulties in using digital technologies, owing to the cost and complexity of the assistive tools available. Objective: To develop a low-cost, cross-platform eye-tracking system that works with conventional webcams and translates eye movements into control commands. Methods: The approach combines a convolutional neural network (ResNet-101) with MediaPipe for eye-region detection. Paraconsistent logic is used to interpret the direction and intensity of eye movement, even under uncertainty. Results: The system achieved 98.8% accuracy in classifying nine distinct gaze directions. Conclusion: The solution is an accessible and effective alternative for webcam-based assistive eye-tracking interaction.
Keywords: Eye tracking, assistive technology, paraconsistent logic, accessibility, webcam
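
The abstract describes the pipeline only at a high level. The following minimal Python sketch illustrates the detection-and-classification stage it outlines: MediaPipe Face Mesh localizes the eye region in a webcam frame, and a ResNet-101 with its final layer replaced scores nine gaze directions. The landmark indices, class labels, preprocessing, and 9-class head are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the MediaPipe + ResNet-101 stage described in the
# abstract; landmark indices, labels, and preprocessing are assumptions.
import cv2
import mediapipe as mp
import torch
import torch.nn as nn
from torchvision import transforms
from torchvision.models import resnet101

# Nine gaze directions mentioned in the abstract (exact labels assumed).
DIRECTIONS = ["up-left", "up", "up-right",
              "left", "center", "right",
              "down-left", "down", "down-right"]

# ResNet-101 backbone with its 1000-class head swapped for a 9-class head.
model = resnet101(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(DIRECTIONS))
model.eval()

# Training-time normalization is omitted for brevity.
preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Standard Face Mesh indices around the left eye (assumed choice):
# outer corner, inner corner, upper lid, lower lid.
LEFT_EYE = [33, 133, 159, 145]

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1,
                                            refine_landmarks=True)

def classify_gaze(frame_bgr):
    """Return a gaze-direction label for one webcam frame, or None."""
    h, w = frame_bgr.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    lm = results.multi_face_landmarks[0].landmark
    xs = [int(lm[i].x * w) for i in LEFT_EYE]
    ys = [int(lm[i].y * h) for i in LEFT_EYE]
    pad = 10  # small margin around the eye bounding box
    crop = frame_bgr[max(min(ys) - pad, 0):max(ys) + pad,
                     max(min(xs) - pad, 0):max(xs) + pad]
    if crop.size == 0:
        return None
    with torch.no_grad():
        logits = model(preprocess(crop).unsqueeze(0))
    return DIRECTIONS[int(logits.argmax())]
```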

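The abstract likewise does not specify which paraconsistent formalism interprets movement direction and intensity under uncertainty. A common choice in Brazilian assistive-computing work is Paraconsistent Annotated Evidential Logic Eτ, in which each proposition carries a favorable-evidence degree μ and a contrary-evidence degree λ. The sketch below assumes that formalism with a simple thresholded decision rule; the paper's actual rule may differ.

```python
# Hypothetical paraconsistent decision step, assuming annotated evidence
# degrees mu (favorable) and lam (contrary), both in [0, 1].
def paraconsistent_decision(mu: float, lam: float,
                            threshold: float = 0.5) -> str:
    """Map evidence degrees to an act/inhibit decision."""
    gc = mu - lam          # degree of certainty, in [-1, 1]
    gct = mu + lam - 1.0   # degree of contradiction; near 0 = consistent
    if abs(gct) > threshold:
        return "inconsistent: ignore this frame"  # sources disagree
    if gc >= threshold:
        return "accept: trigger the gaze command"
    if gc <= -threshold:
        return "reject: no movement detected"
    return "undetermined: wait for more evidence"

# Example: strong favorable evidence, little contrary evidence.
print(paraconsistent_decision(0.9, 0.1))  # accept: trigger the gaze command
```

The contradiction degree lets the controller discard frames in which the evidence sources conflict, rather than being forced into an incorrect command.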
Published: 08/09/2025
GONÇALVES, Jáder Louis de Souza; CUNHA, Lucas Marques de. EyeTeractive: Sistema Multiplataforma de Rastreamento Ocular para Interação Assistiva de Baixo Custo. In: PÔSTERES E DEMONSTRAÇÕES - SIMPÓSIO BRASILEIRO SOBRE FATORES HUMANOS EM SISTEMAS COMPUTACIONAIS (IHC), 24., 2025, Belo Horizonte/MG. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 241-247. DOI: https://doi.org/10.5753/ihc_estendido.2025.15965.