Vol.131, No.1, 2022, pp.219-237, doi:10.32604/cmes.2022.018413
Game Outlier Behavior Detection System Based on Dynamic Time Warp Algorithm
Shinjin Kang1, Soo Kyun Kim2,*
1 School of Games, Hongik University, Sejong, 30016, Korea
2 Department of Computer Engineering, Jeju National University, Jeju, 63243, Korea
* Corresponding Author: Soo Kyun Kim. Email:
(This article belongs to this Special Issue: HPC with Artificial Intelligence based Deep Video Data Analytics: Models, Applications and Approaches)
Received 23 July 2021; Accepted 02 November 2021; Issue published 24 January 2022
This paper proposes a methodology for using multi-modal gameplay data to detect outlier behavior. The proposed methodology collects, synchronizes, and quantifies time-series data from webcams, mice, and keyboards. Facial expressions are mapped onto a one-dimensional pleasure axis, and changes of expression in the mouth and eye areas are detected separately. Furthermore, keyboard and mouse input frequencies are tracked to determine the interaction intensity of users. A dynamic time warping (DTW) algorithm is then applied to detect outlier behavior. The detected outlier graph patterns correspond to play patterns that the game designer did not intend or that differed greatly from those of other users. These outlier patterns can provide game designers with feedback on the actual play experiences of a game's users. Our results can be applied in the game industry for game user experience analysis, enabling a quantitative evaluation of how engaging a game is.
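The paper's implementation details are not given on this page; the following is a minimal sketch of the standard DTW distance together with a hypothetical pairwise-distance outlier rule of the kind the abstract describes (the function names, the mean-distance criterion, and the `threshold` parameter are illustrative assumptions, not the authors' method).

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])            # local distance
            cost[i, j] = d + min(cost[i - 1, j],     # insertion
                                 cost[i, j - 1],     # deletion
                                 cost[i - 1, j - 1]) # match
    return cost[n, m]

def detect_outliers(sequences, threshold):
    """Flag each sequence whose mean DTW distance to all others
    exceeds a threshold -- a simple stand-in for outlier-pattern detection."""
    flags = []
    for i, s in enumerate(sequences):
        dists = [dtw_distance(s, t) for j, t in enumerate(sequences) if j != i]
        flags.append(sum(dists) / len(dists) > threshold)
    return flags
```

For example, given three per-user interaction-intensity series, a series far from the other two in DTW distance would be flagged as an outlier play pattern.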
Keywords: Facial expression recognition; webcam; behavior analysis; affective computing
Cite This Article
Kang, S., Kim, S. K. (2022). Game Outlier Behavior Detection System Based on Dynamic Time Warp Algorithm. CMES-Computer Modeling in Engineering & Sciences, 131(1), 219–237.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.