Open Access
ARTICLE
Location and Object Aware Model for Parallel Activity Recognition in Multi-Resident Smart Homes
1 Department of Computer Science, National University of Computer and Emerging Sciences, Islamabad, Pakistan
2 Department of Artificial Intelligence and Data Science, National University of Computer and Emerging Sciences, Islamabad, Pakistan
3 School of Computer Science SCS, Taylor’s University SDN BHD, Subang Jaya, Selangor, Malaysia
4 Office of Research and Development, Asia University, Taichung, Taiwan
5 School of Computing, Engineering and the Build Environment, Sir David Bell Building, University of Roehampton, London, UK
* Corresponding Author: Mamoona Humayun. Email:
Computers, Materials & Continua 2026, 87(3), 83 https://doi.org/10.32604/cmc.2026.076379
Received 19 November 2025; Accepted 27 February 2026; Issue published 09 April 2026
Abstract
Smart homes enable elderly individuals and people with impairments to live independently through remote monitoring of their activities. Sequences of sensor activations are mapped to their associated labels to recognize different activities. Activity recognition in a multi-resident environment is challenging because different residents perform multiple activities in parallel. A novel multi-resident activity recognition approach is proposed that separates sensor events by their location. A spatial matrix capturing the spatial and temporal patterns of the activities is generated, with sensor activations recorded as binary values. The spatial matrix is converted into images that represent parallel activities as multiple objects within each image. Activities are annotated within each image based on their location and sequence patterns. The YOLO (You Only Look Once) model is then deployed to recognize parallel activities from the activity images. Experimental evaluation on the CASAS dataset shows a performance improvement of 7% over existing state-of-the-art approaches.
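The spatial-matrix step described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the sensor names, time-step granularity, and grayscale encoding are assumptions for demonstration, not details taken from the paper.

```python
import numpy as np

# Hypothetical sensor events as (time_step, sensor_id) pairs.
# Sensor IDs follow CASAS-style motion-sensor naming, but the
# layout here is illustrative, not from the paper.
SENSORS = ["M01", "M02", "M03", "M04"]
events = [(0, "M01"), (1, "M01"), (1, "M03"), (2, "M03"), (3, "M04")]

def build_spatial_matrix(events, sensors, n_steps):
    """Binary matrix: rows are time steps, columns are sensors.

    A cell is 1 if that sensor fired at that step, else 0. Two
    residents acting in parallel show up as simultaneous 1s in
    different (location-separated) columns of the same row.
    """
    col = {s: j for j, s in enumerate(sensors)}
    m = np.zeros((n_steps, len(sensors)), dtype=np.uint8)
    for t, s in events:
        m[t, col[s]] = 1
    return m

def matrix_to_image(m):
    """Scale the binary matrix to an 8-bit grayscale image array,
    ready to be saved and annotated for a YOLO-style detector."""
    return m * 255

matrix = build_spatial_matrix(events, SENSORS, n_steps=4)
image = matrix_to_image(matrix)
```

At step 1 the row `[1, 0, 1, 0]` records two sensors firing at once in different locations, which is the parallel-activity pattern the image representation is meant to expose.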
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.