Open Access
ARTICLE
Study on User Interaction for Mixed Reality through Hand Gestures Based on Neural Network
1 Department of Game Design and Development, Sangmyung University, Seoul, 03016, Republic of Korea
2 Department of Computer Engineering, Chosun University, Gwangju, 61452, Republic of Korea
* Corresponding Author: SeongKi Kim. Email:
Computers, Materials & Continua 2025, 85(2), 2701-2714. https://doi.org/10.32604/cmc.2025.067280
Received 29 April 2025; Accepted 19 August 2025; Issue published 23 September 2025
Abstract
The rapid evolution of virtual reality (VR) and augmented reality (AR) technologies has significantly transformed human-computer interaction, with applications spanning entertainment, education, healthcare, industry, and remote collaboration. A central challenge in these immersive systems lies in enabling intuitive, efficient, and natural interactions. Hand gesture recognition offers a compelling solution by leveraging the expressiveness of human hands to enable seamless control without traditional input devices such as controllers or keyboards, which can limit immersion. However, robust gesture recognition requires overcoming challenges related to accurate hand tracking, complex environmental conditions, and system latency. This study proposes an artificial intelligence (AI)-driven framework for recognizing both static and dynamic hand gestures in VR and AR environments using skeleton-based tracking compliant with the OpenXR standard. Our approach employs a lightweight neural network architecture capable of real-time classification within approximately 1.3 ms while maintaining an average accuracy of 95%. We also introduce a novel dataset generation method to support the training of robust models and demonstrate consistent classification of diverse gestures across widely used commercial VR devices. This work represents one of the first studies to implement and validate dynamic hand gesture recognition in real time on standardized VR hardware, laying the groundwork for more immersive, accessible, and user-friendly interaction systems. By advancing AI-driven gesture interfaces, this research has the potential to broaden the adoption of VR and AR across diverse domains and enhance the overall user experience.
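The abstract describes classifying hand gestures from skeleton-based tracking with a lightweight neural network. The paper's actual architecture and gesture set are not given here, but the idea can be sketched as a small feed-forward classifier over the 26 hand joints that the OpenXR `XR_EXT_hand_tracking` extension exposes per hand. The layer sizes, the eight gesture classes, and the random weights below are purely illustrative assumptions, not the authors' model:

```python
import numpy as np

# Sketch only: layer sizes, class count, and weights are illustrative
# assumptions, not the architecture from the paper.
NUM_JOINTS = 26      # joints per hand in OpenXR XR_EXT_hand_tracking
NUM_GESTURES = 8     # hypothetical number of gesture classes

rng = np.random.default_rng(0)

# A lightweight two-layer MLP; a network this small is plausibly
# evaluable within millisecond-scale budgets on commodity hardware.
W1 = rng.standard_normal((NUM_JOINTS * 3, 64)) * 0.1
b1 = np.zeros(64)
W2 = rng.standard_normal((64, NUM_GESTURES)) * 0.1
b2 = np.zeros(NUM_GESTURES)

def classify(joints: np.ndarray) -> np.ndarray:
    """Return softmax gesture probabilities for one (26, 3) joint frame."""
    x = joints.reshape(-1)                 # flatten 3D positions to (78,)
    h = np.maximum(0.0, x @ W1 + b1)       # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()

# One simulated tracking frame of 26 joint positions
probs = classify(rng.standard_normal((NUM_JOINTS, 3)))
```

In a trained system, `W1`/`W2` would come from supervised training on labeled gesture data; dynamic gestures would additionally require a temporal model over sequences of such frames.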
Copyright © 2025 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

