Real-Time Viewport-Aware Optical Flow Estimation in 360-degree Videos for Visually-Induced Motion Sickness Mitigation

  • Zekun Cao Duke University
  • Regis Kopper University of North Carolina at Greensboro

Abstract


Visually-induced motion sickness (VIMS), a side effect of perceived motion caused by visual stimulation, is a major obstacle to the widespread use of Virtual Reality (VR). Along with scene object information, visual stimulation can be primarily characterized by optical flow, which captures the motion pattern of the moving image, such as its intensity and direction. We estimate real-time optical flow in 360-degree videos targeted at immersive, user-interactive visualization based on the user's current viewport. The proposed method allows the estimation of customized visual flow for each experience of a dynamic 360-degree video and improves over previous methods that consider a single optical flow value for the entire equirectangular frame. We applied our method to modulate the opacity of granulated rest frames (GRFs), a technique consisting of noise-like, randomly distributed visual references that remain stable relative to the user's body during immersive pre-recorded 360-degree video experiences. We report the results of a pilot one-session between-subjects study with 18 participants, in which users watched a 2-minute high-intensity 360-degree video. The results show that our proposed method successfully estimates optical flow, with pilot data suggesting that GRFs combined with real-time optical flow estimation may improve user comfort when watching 360-degree videos. However, more data are needed for statistically significant results.
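The viewport-aware idea described above — averaging optical flow only over the region the user is currently looking at, then mapping that intensity to a rest-frame opacity — can be sketched roughly as follows. This is a minimal NumPy sketch under stated assumptions: the function names, the rectangular viewport crop, and the linear intensity-to-opacity thresholds are illustrative choices, not the paper's implementation.

```python
import numpy as np

def viewport_flow_intensity(flow, yaw, pitch, fov_deg=(90.0, 90.0)):
    """Average optical-flow magnitude inside the user's current viewport.

    flow : dense flow field for the equirectangular frame, shape (H, W, 2).
    yaw, pitch : gaze direction in degrees; yaw in [-180, 180), pitch in [-90, 90].
    The rectangular crop below is a simplification; a faithful version would
    reproject the viewport through the sphere rather than crop the frame.
    """
    h, w, _ = flow.shape
    # Map the viewport centre to equirectangular pixel coordinates.
    cx = int((yaw + 180.0) / 360.0 * w) % w
    cy = int((90.0 - pitch) / 180.0 * h)
    half_w = int(fov_deg[0] / 360.0 * w / 2)
    half_h = int(fov_deg[1] / 180.0 * h / 2)
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, h - 1)
    cols = np.arange(cx - half_w, cx + half_w) % w  # wrap horizontally
    patch = flow[np.ix_(rows, cols)]                # (2*half_h, 2*half_w, 2)
    return float(np.linalg.norm(patch, axis=-1).mean())

def grf_opacity(intensity, lo=0.5, hi=5.0):
    """Map flow intensity to a GRF opacity in [0, 1].

    Linear ramp between assumed thresholds `lo` and `hi` (hypothetical values):
    below `lo` the rest frames stay invisible; above `hi` they are fully opaque.
    """
    return float(np.clip((intensity - lo) / (hi - lo), 0.0, 1.0))
```

For example, a uniform 2 px/frame rightward flow field yields an intensity of 2.0 regardless of gaze direction, giving a partial GRF opacity; a per-viewport estimate would differ from the whole-frame average whenever motion is concentrated in one part of the scene.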
Keywords: HCI, Virtual Reality, Optical Flow Estimation, Rest Frames, VIMS

Published
06/11/2023
How to Cite

CAO, Zekun; KOPPER, Regis. Real-Time Viewport-Aware Optical Flow Estimation in 360-degree Videos for Visually-Induced Motion Sickness Mitigation. In: SIMPÓSIO DE REALIDADE VIRTUAL E AUMENTADA (SVR), 25., 2023, Rio Grande/RS. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023. p. 210–218.