Open Access

ARTICLE

IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare

Subrata Kumer Paul1,2, Abu Saleh Musa Miah3,4, Rakhi Rani Paul1,2, Md. Ekramul Hamid2, Jungpil Shin4,*, Md Abdur Rahim5

1 Department of Computer Science and Engineering (CSE), Bangladesh Army University of Engineering & Technology (BAUET), Qadirabad Cantonment, Dayarampur, Natore, 6431, Bangladesh
2 Department of Computer Science and Engineering (CSE), Rajshahi University, Rajshahi, 6205, Bangladesh
3 Department of CSE, Bangladesh Army University of Science and Technology (BAUST), Nilphamari, Saidpur, 5311, Bangladesh
4 School of Computer Science and Engineering, The University of Aizu, Aizuwakamatsu, 965-8580, Japan
5 Department of Computer Science and Engineering, Pabna University of Science and Technology, Rajapur, 6600, Bangladesh

* Corresponding Author: Jungpil Shin

(This article belongs to the Special Issue: Next-Generation Activity Recognition: Methods, Challenges, and Solutions)

Computers, Materials & Continua 2025, 84(2), 2513-2530. https://doi.org/10.32604/cmc.2025.063563

Abstract

The Internet of Things (IoT) and mobile technology have significantly transformed healthcare by enabling real-time monitoring and diagnosis of patients. Recognizing Medical-Related Human Activities (MRHA) is pivotal for healthcare systems, particularly for identifying actions critical to patient well-being. However, challenges such as high computational demands, low accuracy, and limited adaptability persist in Human Motion Recognition (HMR). While some studies have integrated HMR with IoT for real-time healthcare applications, limited research has focused on recognizing MRHA, which is essential for effective patient monitoring. This study proposes a novel HMR method tailored for MRHA detection, leveraging multi-stage deep learning techniques integrated with IoT. The approach employs EfficientNet to extract optimized spatial features from skeleton frame sequences using seven Mobile Inverted Bottleneck Convolution (MBConv) blocks, followed by Convolutional Long Short-Term Memory (ConvLSTM) to capture spatio-temporal patterns. A classification module with global average pooling, a fully connected layer, and a dropout layer generates the final predictions. The model is evaluated on the NTU RGB+D 120 and HMDB51 datasets, focusing on MRHA such as sneezing, falling, walking, and sitting. It achieves 94.85% accuracy for cross-subject evaluations and 96.45% for cross-view evaluations on NTU RGB+D 120, along with 89.22% accuracy on HMDB51. Additionally, the system integrates IoT capabilities using a Raspberry Pi and GSM module, delivering real-time alerts via Twilio's SMS service to caregivers and patients. This scalable and efficient solution bridges the gap between HMR and IoT, advancing patient monitoring, improving healthcare outcomes, and reducing costs.
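
The abstract describes a multi-stage pipeline: per-frame EfficientNet spatial features, a ConvLSTM stage for spatio-temporal patterns, and a classifier built from global average pooling, a fully connected layer, and dropout. The following is a minimal sketch of such a pipeline in TensorFlow/Keras, not the authors' released code; the sequence length, frame resolution, layer widths, dropout rate, and number of classes are illustrative assumptions, and the stock EfficientNetB0 backbone (whose body is composed of seven MBConv stages) stands in for the paper's feature extractor.

```python
# Sketch of an EfficientNet + ConvLSTM classifier for skeleton frame sequences.
# All hyperparameters below are assumed, not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, H, W, NUM_CLASSES = 20, 224, 224, 12  # assumed values

# EfficientNetB0 as the per-frame spatial feature extractor,
# applied to every frame of the sequence via TimeDistributed.
backbone = tf.keras.applications.EfficientNetB0(
    include_top=False, weights=None, input_shape=(H, W, 3))

inputs = layers.Input(shape=(SEQ_LEN, H, W, 3))                 # skeleton frame sequence
x = layers.TimeDistributed(backbone)(inputs)                    # (SEQ_LEN, h', w', c) feature maps
x = layers.ConvLSTM2D(64, kernel_size=3, padding="same")(x)     # spatio-temporal features
x = layers.GlobalAveragePooling2D()(x)                          # global average pooling
x = layers.Dense(128, activation="relu")(x)                     # fully connected layer
x = layers.Dropout(0.5)(x)                                      # dropout layer
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)    # class predictions

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```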
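
The abstract also mentions real-time alerts sent to caregivers via Twilio's SMS service from a Raspberry Pi. Below is a minimal sketch of that alerting step using Twilio's Python helper library; the credentials, phone numbers, message wording, and the `send_alert` helper are hypothetical, and the paper's actual trigger logic and GSM fallback are not reproduced here.

```python
# Minimal SMS alert sketch (assumed helper, placeholder credentials).
from twilio.rest import Client

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder Twilio SID
AUTH_TOKEN = "your_auth_token"                       # placeholder auth token
TWILIO_NUMBER = "+15005550006"                       # placeholder sender number
CAREGIVER_NUMBER = "+8801XXXXXXXXX"                  # placeholder recipient

client = Client(ACCOUNT_SID, AUTH_TOKEN)

def send_alert(activity: str, confidence: float) -> None:
    """Send an SMS when the recognizer flags a critical MRHA (e.g., a fall)."""
    client.messages.create(
        body=f"ALERT: detected '{activity}' (confidence {confidence:.0%}).",
        from_=TWILIO_NUMBER,
        to=CAREGIVER_NUMBER,
    )

if __name__ == "__main__":
    send_alert("falling", 0.97)
```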

Keywords

Real-time human motion recognition (HMR); ENConvLSTM; EfficientNet; ConvLSTM; skeleton data; NTU RGB+D 120 dataset; MRHA

Cite This Article

APA Style
Paul, S.K., Miah, A.S.M., Paul, R.R., Hamid, M.E., Shin, J. et al. (2025). IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare. Computers, Materials & Continua, 84(2), 2513–2530. https://doi.org/10.32604/cmc.2025.063563
Vancouver Style
Paul SK, Miah ASM, Paul RR, Hamid ME, Shin J, Rahim MA. IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare. Comput Mater Contin. 2025;84(2):2513–2530. https://doi.org/10.32604/cmc.2025.063563
IEEE Style
S. K. Paul, A. S. M. Miah, R. R. Paul, M. E. Hamid, J. Shin, and M. A. Rahim, “IoT-Based Real-Time Medical-Related Human Activity Recognition Using Skeletons and Multi-Stage Deep Learning for Healthcare,” Comput. Mater. Contin., vol. 84, no. 2, pp. 2513–2530, 2025. https://doi.org/10.32604/cmc.2025.063563



Copyright © 2025 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.