
Open Access

ARTICLE

A CNN-Transformer Hybrid Model for Real-Time Recognition of Affective Tactile Biosignals

Chang Xu1,*, Xianbo Yin2, Zhiyong Zhou1, Bomin Liu1
1 School of Design and Art, Shanghai Dianji University, Shanghai, China
2 Innovation Academy for Microsatellites, Chinese Academy of Sciences, Shanghai, China
* Corresponding Author: Chang Xu. Email: email

Computers, Materials & Continua https://doi.org/10.32604/cmc.2026.074417

Received 10 October 2025; Accepted 13 January 2026; Published online 26 January 2026

Abstract

This study presents a hybrid CNN-Transformer model for real-time recognition of affective tactile biosignals. The proposed framework combines convolutional neural networks (CNNs), which extract spatial and local temporal features, with a Transformer encoder that captures long-range dependencies in time-series data through multi-head attention. Model performance was evaluated on two widely used tactile biosignal datasets, HAART and CoST, which contain diverse affective touch gestures recorded from pressure sensor arrays. The CNN-Transformer model achieved recognition rates of 93.33% on HAART and 80.89% on CoST, outperforming existing methods on both benchmarks. By incorporating temporal windowing, the model supports real-time prediction and generalizes better across gestures of varying duration. These results highlight the effectiveness of deep learning for tactile biosignal processing and demonstrate the potential of the CNN-Transformer approach for future applications in wearable sensors, affective computing, and biomedical monitoring.
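To make the described architecture concrete, the following is a minimal PyTorch sketch of a CNN-Transformer hybrid operating on windowed pressure-sensor frames: a CNN front end encodes each frame, and a Transformer encoder applies multi-head attention over the frame sequence. All layer sizes, the window length, the 8x8 taxel grid, and the class count are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch (PyTorch) of a CNN-Transformer hybrid for windowed tactile
# biosignal sequences. Hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class CNNTransformerClassifier(nn.Module):
    def __init__(self, in_channels=1, num_classes=7, d_model=128,
                 n_heads=4, n_layers=2):
        super().__init__()
        # CNN front end: extracts spatial/local features from each frame
        # of the pressure-sensor array (assumed 8x8 taxel grid).
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(64, d_model)
        # Transformer encoder: multi-head attention over the frame sequence
        # captures long-range temporal dependencies within a window.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        # x: (batch, time, channels, height, width) -- one temporal window.
        b, t, c, h, w = x.shape
        feats = self.cnn(x.reshape(b * t, c, h, w)).flatten(1)  # (b*t, 64)
        tokens = self.proj(feats).reshape(b, t, -1)             # (b, t, d_model)
        encoded = self.encoder(tokens)                          # (b, t, d_model)
        return self.head(encoded.mean(dim=1))                   # (b, num_classes)


if __name__ == "__main__":
    model = CNNTransformerClassifier()
    window = torch.randn(2, 32, 1, 8, 8)   # 2 windows of 32 frames, 8x8 taxels
    print(model(window).shape)             # torch.Size([2, 7])
```

In this sketch, temporal windowing corresponds to feeding fixed-length frame windows to the model, so a prediction can be emitted per window rather than only after a complete gesture.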

Keywords

Tactile biosignals; affective touch recognition; wearable sensors; signal processing; human–machine interaction