Vol.69, No.1, 2021, pp.647-660, doi:10.32604/cmc.2021.016712
OPEN ACCESS
ARTICLE
Immersion Analysis Through Eye-Tracking and Audio in Virtual Reality
  • Jihoon Lee, Nammee Moon*
Department of Computer Science, Hoseo University, Asan-si, 31499, Korea
* Corresponding Author: Nammee Moon. Email:
(This article belongs to this Special Issue: Advances of AI and Blockchain technologies for Future Smart City)
Received 09 January 2021; Accepted 12 March 2021; Issue published 04 June 2021
Abstract
In this study, we use a Head Mounted Display (HMD), one of the defining advantages of the Virtual Reality (VR) environment, to track the user's gaze in 360° video content and examine how the gaze pattern is distributed according to the user's immersion. An analysis of the gaze distributions for content rated high-immersion and low-immersion via a questionnaire confirmed that the higher the immersion, the more the gaze tends to concentrate at the center of the screen. Through this experiment, we identified the factors that make users immerse themselves in the VR environment, among which the content's audio proved particularly important. Furthermore, we found that the shape of the gaze distribution used to gauge the degree of immersion differs according to the subject of the content. In reviewing the experimental results, we also confirmed the need for research on recognizing specific objects in a VR environment.
Keywords
Virtual reality; eye-tracking; immersion analysis; 360° video
Cite This Article
J. Lee and N. Moon, "Immersion analysis through eye-tracking and audio in virtual reality," Computers, Materials & Continua, vol. 69, no.1, pp. 647–660, 2021.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.