EyePursuitLinks - an Eye-pursuit Based Interface for Web Browsing Using Smart Targets

  • Alex Torquato Souza Carneiro, USP
  • Candy Veronica Tenorio Gonzales, USP
  • Carlos Hitoshi Morimoto, USP

Abstract


Web accessibility and digital inclusion remain among the grand research challenges in computer science. For people with severe motor disabilities, eye-gaze interaction enables hands-free operation of graphical interfaces. However, gaze tracking devices typically require frequent user calibration, which degrades the overall user experience and performance. In this paper we propose EyePursuitLinks, an eye-pursuit based technique that allows people with severe motor disabilities to browse the web by visually following moving targets associated with the links they want to open. The main contribution of this paper is the use of Smart Targets (ST) to select from a potentially large number of hyperlinks within a relatively small area. We conducted a user experiment with 10 volunteers to compare Smart Targets selection against a conventional pursuit selection mechanism using 4, 8, and 16 simultaneous targets. Our results show that ST is significantly more robust than the conventional method for larger numbers of targets.
Keywords: Eye-gaze interaction, Responsive interfaces, Probabilistic algorithm, Smooth pursuits
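
This page does not describe the paper's implementation; for context only, the sketch below illustrates the conventional correlation-based pursuit selection mechanism that the abstract uses as a baseline, in which a target is selected when the gaze trajectory correlates with that target's on-screen motion over a short time window. The function name select_pursuit_target, the window layout, and the 0.8 correlation threshold are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def pearson(a, b):
        # Pearson correlation between two 1-D sample arrays.
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom > 0 else 0.0

    def select_pursuit_target(gaze, targets, threshold=0.8):
        # gaze:    (N, 2) array of gaze positions over a sliding time window.
        # targets: dict mapping a target id to an (N, 2) array of that target's
        #          on-screen positions over the same window.
        # Returns the id of the target whose motion best matches the gaze
        # trajectory on both axes, or None if no target reaches the threshold.
        best_id, best_score = None, threshold
        for tid, traj in targets.items():
            score = min(pearson(gaze[:, 0], traj[:, 0]),
                        pearson(gaze[:, 1], traj[:, 1]))
            if score > best_score:
                best_id, best_score = tid, score
        return best_id

In a browser setting, the window would typically cover the last one to two seconds of gaze samples, recomputed at every frame, so a link is activated only after the user has smoothly pursued its target for that long.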

Published
October 23, 2023
How to Cite

CARNEIRO, Alex Torquato Souza; GONZALES, Candy Veronica Tenorio; MORIMOTO, Carlos Hitoshi. EyePursuitLinks - an Eye-pursuit Based Interface for Web Browsing Using Smart Targets. In: SIMPÓSIO BRASILEIRO DE SISTEMAS MULTIMÍDIA E WEB (WEBMEDIA), 29., 2023, Ribeirão Preto/SP. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 16-24.