Prototyping Immersive Interactions with Speech Recognition in Virtual Reality

  • Jacky Yang Ontario Tech University
  • Michael Chan Ontario Tech University
  • Alvaro Uribe-Quevedo Ontario Tech University
  • Bill Kapralos Ontario Tech University
  • Norman Jaimes Universidad Militar Nueva Granada
  • Adam Dubrowski Ontario Tech University

Abstract


Traditional avatar interactions rely on buttons and menus, which are suitable for standard 2D computer monitors but not for virtual reality, where the trainee is immersed in a 3D world and typically does not have access to a keyboard and mouse. Virtual avatar interactions are becoming relevant in education, health care, training, and entertainment, where immersive dialogue systems can impact presence, empathy, and ultimately the virtual task being performed. In this paper, we outline a framework that allows us to prototype modes of virtual communication that enhance trainee immersion when interacting with a virtual character in virtual reality. These modes of communication include speaking, gesturing, and object-fetching tasks. To assess the applicability of the framework, an eye examination scenario was employed as a test bed in which a trainee acts as an optometrist performing standard eye examination procedures on a virtual patient.
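
The abstract does not detail the speech pipeline, but a minimal sketch of how spoken commands could be mapped to virtual patient actions is shown below. This is a hypothetical illustration, not the authors' implementation: the SpeechRecognition Python package, the recognition engine, and the command phrases are all assumptions made for the example.

```python
# Hypothetical sketch (not from the paper): mapping recognized speech to
# avatar commands using the open-source SpeechRecognition package.
from typing import Optional

import speech_recognition as sr

# Example phrases a trainee might speak during the eye examination scenario;
# the actual command set used by the authors is not specified in the abstract.
COMMANDS = {
    "look up": "LOOK_UP",
    "look down": "LOOK_DOWN",
    "cover your left eye": "COVER_LEFT_EYE",
    "read the chart": "READ_CHART",
}


def listen_for_command(recognizer: sr.Recognizer, mic: sr.Microphone) -> Optional[str]:
    """Capture one utterance and return the matching avatar command, if any."""
    with mic as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source, phrase_time_limit=5)
    try:
        text = recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):
        return None
    # Simple keyword matching; a full dialogue system would be more robust.
    for phrase, command in COMMANDS.items():
        if phrase in text:
            return command
    return None


if __name__ == "__main__":
    print(listen_for_command(sr.Recognizer(), sr.Microphone()))
```

In a VR prototype such as the one described, a mapping of this kind would run inside the game engine and drive the virtual patient's animations or dialogue responses rather than print to a console.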
Keywords: speech recognition, gestures, medical simulation, eye examination, virtual avatars
Published
07/11/2020
How to Cite

YANG, Jacky; CHAN, Michael; URIBE-QUEVEDO, Alvaro; KAPRALOS, Bill; JAIMES, Norman; DUBROWSKI, Adam. Prototyping Immersive Interactions with Speech Recognition in Virtual Reality. In: SIMPÓSIO DE REALIDADE VIRTUAL E AUMENTADA (SVR), 22., 2020, Evento Online. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2020. p. 438-442.