XRBars: Fast and Reliable Multiple Choice Selection by Gaze in XR

  • João Vitor Nogueira, UFF
  • Carlos Morimoto, USP

Abstract


Web browsing is essential for modern education, supporting everything from self-directed study to academic research. However, traditional web interfaces are designed for keyboards, mice, and touch screens, creating accessibility barriers for users with motor impairments. Modern Head-Mounted Displays (HMDs), which lack these conventional input devices but often include eye-tracking technology, make gaze-based interaction in XR a promising alternative due to its immersive experience. For a gaze-based XR web browser to be viable, it must be efficient and provide a good user experience. To this end, this paper proposes XRBars, a system that leverages GazeBars to improve accessibility to online educational resources in XR.
Keywords: Gaze Interaction, Web Browsing, Mixed Reality

Published
2025-06-03
NOGUEIRA, João Vitor; MORIMOTO, Carlos. XRBars: Fast and Reliable Multiple Choice Selection by Gaze in XR. In: ACM INTERNATIONAL CONFERENCE ON INTERACTIVE MEDIA EXPERIENCES WORKSHOPS (IMXW), 25., 2025, Niterói/RJ. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2025. p. 155-159. DOI: https://doi.org/10.5753/imxw.2025.2085.