Open Access


Feature Fusion-Based Deep Learning Network to Recognize Table Tennis Actions

Chih-Ta Yen1,*, Tz-Yun Chen2, Un-Hung Chen3, Guo-Chang Wang3, Zong-Xian Chen3

1 Department of Electrical Engineering, National Taiwan Ocean University, Keelung City, 202301, Taiwan
2 Office of Physical Education, National Formosa University, Yunlin County, 632, Taiwan
3 Department of Electrical Engineering, National Formosa University, Yunlin County 632, Taiwan

* Corresponding Author: Chih-Ta Yen. Email:

Computers, Materials & Continua 2023, 74(1), 83-99.


A system for classifying four basic table tennis strokes using wearable devices and deep learning networks is proposed in this study. The wearable device consisted of a six-axis sensor, a Raspberry Pi 3, and a power bank. Multiple kernel sizes were evaluated in a convolutional neural network (CNN) to compare their feature-extraction performance. A multi-scale CNN with two kernel sizes was then used to fuse features at different scales through concatenation, enabling recognition of the four table tennis strokes. Experimental data were collected from 20 research participants who wore the sensor on the back of their hands while performing the four strokes in a laboratory environment, and were used to verify the performance of the proposed models for wearable devices. The sensor and multi-scale CNN designed in this study achieved an accuracy of 99.58% and an F1 score of 99.16% for the four strokes, and an accuracy of 99.87% under five-fold cross-validation, indicating that the multi-scale CNN is also robust.
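The feature-fusion idea in the abstract can be illustrated with a minimal NumPy sketch: two 1D convolution branches with different kernel sizes run over the same six-axis sensor window, and their pooled feature maps are concatenated into one fused vector. The window length (128), kernel sizes (3 and 7), and filter counts (8 per branch) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def conv1d_relu(x, kernel):
    """Valid 1D convolution over (timesteps, channels) input, followed by ReLU.

    x:      (T, C) sensor window
    kernel: (k, C, F) filter bank -> output (T - k + 1, F)
    """
    k, _, f = kernel.shape
    t = x.shape[0] - k + 1
    out = np.zeros((t, f))
    for i in range(t):
        # Contract over the kernel's time and channel axes.
        out[i] = np.tensordot(x[i:i + k], kernel, axes=([0, 1], [0, 1]))
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
window = rng.standard_normal((128, 6))          # one window: 128 samples x 6 axes (hypothetical length)
k_small = rng.standard_normal((3, 6, 8)) * 0.1  # branch 1: kernel size 3, 8 filters (assumed)
k_large = rng.standard_normal((7, 6, 8)) * 0.1  # branch 2: kernel size 7, 8 filters (assumed)

# Each branch extracts features at its own temporal scale; global average
# pooling reduces each to a fixed-length vector.
f1 = conv1d_relu(window, k_small).mean(axis=0)  # (8,)
f2 = conv1d_relu(window, k_large).mean(axis=0)  # (8,)

# Feature fusion by concatenation, as in the multi-scale CNN described above.
fused = np.concatenate([f1, f2])                # (16,)
print(fused.shape)
```

In the full model, trained convolution weights would replace the random filters here, and the fused vector would feed a classifier head over the four stroke classes.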


Cite This Article

C. Yen, T. Chen, U. Chen, G. Wang and Z. Chen, "Feature fusion-based deep learning network to recognize table tennis actions," Computers, Materials & Continua, vol. 74, no.1, pp. 83–99, 2023.

This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.