Harnessing Foveated Rendering and AI to Tackle VR Cybersickness: A Feature-Centric Perspective
Abstract
As virtual reality becomes increasingly immersive, cybersickness remains a major obstacle to adoption. This review examines how foveated rendering techniques, powered by artificial intelligence, are transforming how this problem is addressed. We analyze the primary factors that contribute to cybersickness, including latency, field of view, vergence-accommodation mismatch, and unnatural locomotion, and show how adaptive visual strategies can significantly alleviate user discomfort. By accounting for individual traits such as age and prior virtual reality experience, together with real-time physiological indicators such as heart rate and skin conductance, modern rendering systems are becoming more intelligent and user-specific. We highlight the role of advanced machine learning models, from interpretable symbolic frameworks to deep neural networks, along with gaze prediction systems that enable real-time adjustments through predictive rendering and user-context-specific optimization. Our findings underscore the promise of closed-loop rendering systems that preserve visual fidelity while enhancing comfort and engagement, steering toward safer, more personalized virtual reality experiences.
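The closed-loop idea summarized above can be illustrated with a minimal, purely hypothetical sketch: physiological arousal (heart rate and skin conductance above a resting baseline) is treated as a discomfort proxy, and each frame the controller narrows the peripheral field of view and shrinks the full-resolution foveal region accordingly. All names, baselines, gains, and limits here are illustrative assumptions, not parameters from any cited system.

```python
from dataclasses import dataclass

@dataclass
class RenderParams:
    fovea_radius_deg: float     # full-resolution region around the gaze point
    fov_restriction_deg: float  # soft vignette limiting the peripheral field of view

def update_params(params, heart_rate_bpm, skin_conductance_uS,
                  baseline_hr=70.0, baseline_sc=2.0,
                  gain_fov=0.5, gain_fovea=0.1):
    """One step of a hypothetical closed-loop comfort controller.

    Arousal above baseline narrows the field of view and shrinks the
    high-detail foveal region, reducing peripheral optical flow; when the
    user is calm, both parameters relax back toward their defaults.
    """
    arousal = (max(0.0, (heart_rate_bpm - baseline_hr) / baseline_hr)
               + max(0.0, (skin_conductance_uS - baseline_sc) / baseline_sc))
    fov = max(60.0, params.fov_restriction_deg
              - gain_fov * arousal * params.fov_restriction_deg)
    fovea = max(5.0, params.fovea_radius_deg
                - gain_fovea * arousal * params.fovea_radius_deg)
    if arousal == 0.0:
        # Gradually restore the default view when no discomfort is detected.
        fov = min(110.0, fov + 1.0)
        fovea = min(20.0, fovea + 0.5)
    return RenderParams(fovea, fov)

# Example: an aroused user triggers FOV restriction and a smaller fovea.
p = RenderParams(fovea_radius_deg=15.0, fov_restriction_deg=110.0)
p = update_params(p, heart_rate_bpm=95.0, skin_conductance_uS=4.0)
```

In a real system the per-frame update would be driven by filtered sensor streams and a learned discomfort model rather than fixed linear gains; the sketch only shows the shape of the sense-decide-render loop.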
References
Bala, P., Oakley, I., Nisi, V., and Nunes, N. J. (2021). Dynamic field of view restriction in 360 video: Aligning optical flow and visual SLAM to mitigate VIMS. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pages 1–18. DOI: 10.1145/3411764.3445499.
Bauer, D., Wu, Q., and Ma, K.-L. (2022). FoVolNet: Fast volume rendering using foveated deep neural networks. IEEE Transactions on Visualization and Computer Graphics, 29(1):515–525. DOI: 10.1109/tvcg.2022.3209498.
Biswas, N., Mukherjee, A., and Bhattacharya, S. (2024). “Are you feeling sick?” – a systematic literature review of cybersickness in virtual reality. ACM Computing Surveys, 56(11):1–38. DOI: 10.1145/3670008.
Brunnström, K., Dima, E., Qureshi, T., Johanson, M., Andersson, M., and Sjöström, M. (2020). Latency impact on quality of experience in a virtual reality simulator for remote control of machines. Signal Processing: Image Communication, 89:116005. DOI: 10.1016/j.image.2020.116005.
Chang, E., Billinghurst, M., and Yoo, B. (2023). Brain activity during cybersickness: a scoping review. Virtual Reality, 27(3):2073–2097. DOI: 10.1007/s10055-023-00795-y.
Davis, J., Hsieh, Y.-H., and Lee, H.-C. (2015). Humans perceive flicker artifacts at 500 Hz. Scientific Reports, 5(1):7861. DOI: 10.1038/srep07861.
Deng, N., He, Z., Ye, J., Duinkharjav, B., Chakravarthula, P., Yang, X., and Sun, Q. (2022). FoV-NeRF: Foveated neural radiance fields for virtual reality. IEEE Transactions on Visualization and Computer Graphics, 28(11):3854–3864. DOI: 10.1109/tvcg.2022.3203102.
Dilanchian, A. T., Andringa, R., and Boot, W. R. (2021). A pilot study exploring age differences in presence, workload, and cybersickness in immersive virtual reality environments. Frontiers in Virtual Reality, 2:736793. DOI: 10.3389/frvir.2021.736793.
Ding, D., Cao, Z., Gu, Z., Chen, H., Qi, C., and Dong, F. (2025). FoANet: Focus of attention prediction for foveated pre-rendering to enable high-quality edge VR. ACM Transactions on Sensor Networks. DOI: 10.1145/3722222.
Drazich, B. F., McPherson, R., Gorman, E. F., Chan, T., Teleb, J., Galik, E., and Resnick, B. (2023). In too deep? A systematic literature review of fully immersive virtual reality and cybersickness among older adults. Journal of the American Geriatrics Society, 71(12):3906–3915. DOI: 10.1111/jgs.18553.
Duchowski, A. T., Cournia, N., and Murphy, H. (2003). Gaze-contingent displays: Review and current trends. Available at: [link].
Emery, K. J., Zannoli, M., Warren, J., Xiao, L., and Talathi, S. S. (2021). OpenNEEDs: A dataset of gaze, head, hand, and scene signals during exploration in open-ended VR environments. In ACM Symposium on Eye Tracking Research and Applications, pages 1–7. DOI: 10.1145/3448018.3457996.
Fan, R., Wu, J., Shi, X., Zhao, L., Ma, Q., and Wang, L. (2025). Fov-GS: Foveated 3D Gaussian splatting for dynamic scenes. IEEE Transactions on Visualization and Computer Graphics. DOI: 10.1109/tvcg.2025.3549576.
Feldstein, I. T. and Ellis, S. R. (2020). A simple video-based technique for measuring latency in virtual reality or teleoperation. IEEE Transactions on Visualization and Computer Graphics, 26(11):3463–3473. DOI: 10.1109/tvcg.2020.3018843.
Garbin, S. J., Kowalski, M., Johnson, M., Shotton, J., and Valentin, J. (2021). Fast neural rendering for free-view synthesis. ACM Transactions on Graphics, 40(4):1–13. DOI: 10.1145/3450626.3459802.
Garcia-Agundez, A., Folkerts, A. K., Konrad, R., Caserman, P., Tregel, T., Goosses, M., and Göbel, S. (2019). Detecting cybersickness through heart rate variability. Frontiers in Virtual Reality, 1:1–11. DOI: 10.3389/frvir.2019.00001.
Henriques, V. P., Silva, F. L., Sousa, C. A., and Clua, E. (2024). Foveated path culling: Hybrid path tracing and radiance fields for foveated 3D Gaussian splatting. Computers & Graphics, 118:103917. DOI: 10.1016/j.cag.2024.103917.
Hussain, A., Hassan, A., and Rehman, S. (2021). Foveated rendering for VR sickness reduction: a comprehensive review. IEEE Access, 9:169255–169270. DOI: 10.1109/access.2021.3135099.
Illahi, M., Pande, P., and Verma, G. (2022). Real-time gaze prediction for foveated rendering using head motion and gaze history. IEEE Access, 10:127451–127463. DOI: 10.1109/access.2022.3223854.
Islam, M. R., Kamal, A. R. M., and Ahmed, M. U. (2020). Hybrid deep learning model for cybersickness detection using physiological signals. IEEE Access, 8:105797–105808. DOI: 10.1109/access.2020.2998793.
Jabbireddy, V., Mohanto, S., and Chen, H. (2022). Modeling the human visual system for foveated rendering. Vision Research, 190:107995. DOI: 10.1016/j.visres.2022.107995.
Kemeny, A., George, P., and Merienne, F. (2020). Motion sickness in virtual environments: causes and mitigation strategies. Presence: Teleoperators and Virtual Environments, 29(4):358–373. DOI: 10.1162/pres_a_00338.
Liu, H., Zhang, L., and Zhou, X. (2025). FovealNet: Event-driven foveated rendering with saliency-aware gaze prediction. IEEE Transactions on Visualization and Computer Graphics, 31(3):1128–1141. DOI: 10.1109/tvcg.2025.3539941.
MacArthur, K. R., Little, S., and Price, T. (2021). Gender differences in cybersickness: A systematic review. Computers in Human Behavior Reports, 4:100151. DOI: 10.1016/j.chbr.2021.100151.
Nunes da Silva, R., Alves, J. P., Medeiros, F., and Clua, E. (2024). Symbolic machine learning for adaptive cybersickness mitigation using physiological signals. Frontiers in Virtual Reality, 5:145–160. DOI: 10.3389/frvir.2024.00145.
Oh, S. and Son, Y. (2022). CYRE: A reference dataset for cybersickness research in VR environments. Virtual Reality, 26(4):1619–1635. DOI: 10.1007/s10055-021-00560-9.
Porcino, T. M., Clua, E., Trevisan, D., Sassi, V., and Oliveira, J. C. (2021). A survey on cybersickness in virtual environments: causes, assessment, and mitigation. Computers & Graphics, 95:102–117. DOI: 10.1016/j.cag.2021.03.011.
Salehi, M., van der Meulen, E., and Clua, E. (2024). Head movement patterns as predictors of cybersickness in virtual reality. Frontiers in Virtual Reality, 5:117–130. DOI: 10.3389/frvir.2024.00117.
Ye, J., Zhao, L., and Ma, Q. (2024). Neural foveated super-resolution for real-time VR rendering. IEEE Transactions on Visualization and Computer Graphics, 30(7):4001–4013. DOI: 10.1109/tvcg.2024.3579835.
