Open Access
ARTICLE
A CNN-Transformer Hybrid Model for Real-Time Recognition of Affective Tactile Biosignals
1 School of Design and Art, Shanghai Dianji University, Shanghai, China
2 Innovation Academy for Microsatellites, Chinese Academy of Sciences, Shanghai, China
* Corresponding Author: Chang Xu. Email:
Computers, Materials & Continua 2026, 87(1), 99 https://doi.org/10.32604/cmc.2026.074417
Received 10 October 2025; Accepted 13 January 2026; Issue published 10 February 2026
Abstract
This study presents a hybrid CNN-Transformer model for real-time recognition of affective tactile biosignals. The proposed framework combines convolutional neural networks (CNNs), which extract spatial and local temporal features, with a Transformer encoder that captures long-range dependencies in time-series data through multi-head attention. Model performance was evaluated on two widely used tactile biosignal datasets, HAART and CoST, which contain diverse affective touch gestures recorded from pressure-sensor arrays. The CNN-Transformer model achieved recognition rates of 93.33% on HAART and 80.89% on CoST, outperforming existing methods on both benchmarks. By incorporating temporal windowing, the model enables instantaneous prediction and improves generalization across gestures of varying duration. These results highlight the effectiveness of deep learning for tactile biosignal processing and demonstrate the potential of the CNN-Transformer approach for future applications in wearable sensors, affective computing, and biomedical monitoring.
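The temporal windowing mentioned in the abstract can be illustrated with a minimal sketch. This is an assumed implementation, not the authors' code: it slices a variable-length sequence of pressure-array frames into fixed-length overlapping windows, so that a classifier can emit a prediction per window rather than waiting for the whole gesture. The frame shape (8×8), window length, and stride below are illustrative assumptions.

```python
import numpy as np

def make_windows(frames, win_len=32, stride=8):
    """Slice a (T, H, W) tactile frame sequence into overlapping
    fixed-length windows of shape (N, win_len, H, W).

    Sequences shorter than win_len are zero-padded so that every
    gesture yields at least one window.
    """
    T = frames.shape[0]
    if T < win_len:
        pad = np.zeros((win_len - T,) + frames.shape[1:], dtype=frames.dtype)
        frames = np.concatenate([frames, pad], axis=0)
        T = win_len
    starts = range(0, T - win_len + 1, stride)
    return np.stack([frames[s:s + win_len] for s in starts])

# Illustrative input: 100 frames from an assumed 8x8 pressure grid.
seq = np.random.rand(100, 8, 8).astype(np.float32)
wins = make_windows(seq)
print(wins.shape)  # (9, 32, 8, 8)
```

Each window would then be fed to the CNN front end independently, which is what allows the model to produce outputs while a gesture is still in progress.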
Copyright © 2026 The Author(s). Published by Tech Science Press. This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.