Open Access

ARTICLE


Immersion Analysis Through Eye-Tracking and Audio in Virtual Reality

Jihoon Lee, Nammee Moon*

Department of Computer Science, Hoseo University, Asan-si, 31499, Korea

* Corresponding Author: Nammee Moon

(This article belongs to the Special Issue: Advances of AI and Blockchain Technologies for Future Smart City)

Computers, Materials & Continua 2021, 69(1), 647-660. https://doi.org/10.32604/cmc.2021.016712

Abstract

In this study, we use a Head-Mounted Display (HMD), one of the biggest advantages of a Virtual Reality (VR) environment, to track the user’s gaze in 360° video content and examine how the gaze pattern is distributed according to the user’s immersion. By analyzing the gaze distributions for content that users rated as highly immersive and content rated as less immersive in a questionnaire, we confirmed that the higher the immersion, the more the gaze distribution tends to concentrate at the center of the screen. Through this experiment, we identified factors that make users immerse themselves in the VR environment, and among them, the importance of the content’s audio was demonstrated. Furthermore, we found that the shape of the gaze distribution used to gauge the degree of immersion differed according to the subject of the content. While reviewing the experimental results, we also confirmed the need for research on recognizing specific objects in a VR environment.
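The center-concentration finding above can be quantified with a simple dispersion metric. The snippet below is an illustrative sketch, not the authors' actual analysis: it assumes gaze samples normalized to a [0, 1] screen plane and scores a trace by its mean distance from the screen center, where smaller values indicate more centered (and, per the abstract, more immersed) viewing.

```python
import math

def gaze_centrality(points, center=(0.5, 0.5)):
    """Mean Euclidean distance of normalized gaze points (x, y in [0, 1])
    from the screen center. Smaller values mean gaze clusters near the
    center, which the abstract associates with higher immersion."""
    cx, cy = center
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    return sum(dists) / len(dists)

# Toy traces: one tightly centered, one dispersed toward the edges.
centered = [(0.48, 0.52), (0.51, 0.49), (0.50, 0.50)]
dispersed = [(0.10, 0.90), (0.85, 0.15), (0.50, 0.95)]

assert gaze_centrality(centered) < gaze_centrality(dispersed)
```

In practice, a per-content score like this could be compared against questionnaire immersion ratings; the specific threshold and normalization are hypothetical here.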

Cite This Article

J. Lee and N. Moon, "Immersion analysis through eye-tracking and audio in virtual reality," Computers, Materials & Continua, vol. 69, no.1, pp. 647–660, 2021.



This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.