Search Results (87)
  • Open Access

    ARTICLE

    The Early Emotional Responses and Central Issues of People in the Epicenter of the COVID-19 Pandemic: An Analysis from Twitter Text Mining

    Eun-Joo Choi1, Yun-Jung Choi2,*

    International Journal of Mental Health Promotion, Vol.25, No.1, pp. 21-29, 2023, DOI:10.32604/ijmhp.2022.022641

    Abstract This study aimed to explore citizens’ emotional responses and issues of interest in the context of the coronavirus disease 2019 (COVID-19) pandemic. The dataset comprised 65,313 tweets with the location marked as New York State. The data collection period covered the four days after New York City imposed a lockdown order due to an increase in confirmed cases. Data analysis was performed using RStudio. The emotional responses in tweets were analyzed using the Bing and NRC (National Research Council Canada) dictionaries. The tweets’ central issue was identified by text network analysis. When tweets were classified as either positive…
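The Bing and NRC dictionaries mentioned above label words by sentiment; classifying a tweet then reduces to counting lexicon hits. A minimal sketch of that idea follows — the tiny word lists are illustrative stand-ins, not the real Bing/NRC lexicons.

```python
# Toy sketch of lexicon-based sentiment tagging, as done with the
# Bing/NRC dictionaries in the study above. The word sets here are
# made-up stand-ins, NOT the actual lexicons.
POSITIVE = {"safe", "hope", "support", "recover", "grateful"}
NEGATIVE = {"fear", "lockdown", "sick", "panic", "worried"}

def classify_tweet(text: str) -> str:
    """Count lexicon hits and label the tweet positive/negative/neutral."""
    tokens = [w.strip(".,!?#@").lower() for w in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_tweet("Feeling worried and sick about the lockdown"))  # negative
print(classify_tweet("So grateful for the support, there is hope"))   # positive
```

The real lexicons contain thousands of entries and, in NRC's case, eight emotion categories beyond polarity; the counting logic stays the same.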

  • Open Access

    ARTICLE

    A Multi-Level Circulant Cross-Modal Transformer for Multimodal Speech Emotion Recognition

    Peizhu Gong1, Jin Liu1, Zhongdai Wu2, Bing Han2, Y. Ken Wang3, Huihua He4,*

    CMC-Computers, Materials & Continua, Vol.74, No.2, pp. 4203-4220, 2023, DOI:10.32604/cmc.2023.028291

    Abstract Speech emotion recognition, as an important component of human-computer interaction technology, has received increasing attention. Recent studies have treated emotion recognition of speech signals as a multimodal task, owing to its inclusion of semantic features from two different modalities, i.e., audio and text. However, existing methods often fail to represent features effectively and capture correlations. This paper presents a multi-level circulant cross-modal Transformer (MLCCT) for multimodal speech emotion recognition. The proposed model can be divided into three steps: feature extraction, interaction, and fusion. Self-supervised embedding models are introduced for feature extraction, which give a more powerful representation of the…

  • Open Access

    ARTICLE

    Human-Computer Interaction Using Deep Fusion Model-Based Facial Expression Recognition System

    Saiyed Umer1,*, Ranjeet Kumar Rout2, Shailendra Tiwari3, Ahmad Ali AlZubi4, Jazem Mutared Alanazi4, Kulakov Yurii5

    CMES-Computer Modeling in Engineering & Sciences, Vol.135, No.2, pp. 1165-1185, 2023, DOI:10.32604/cmes.2022.023312

    Abstract A deep fusion model is proposed for a facial expression-based human-computer interaction system. Initially, image preprocessing, i.e., the extraction of the facial region from the input image, is performed. Thereafter, more discriminative and distinctive deep learning features are extracted from the facial regions. To prevent overfitting, in-depth features of facial images are extracted and assigned to the proposed convolutional neural network (CNN) models. Various CNN models are then trained. Finally, the outputs of the CNN models are fused to obtain the final decision for the seven basic classes of facial expressions, i.e., fear, disgust, anger, surprise, sadness, happiness,…
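Fusing the outputs of several trained CNNs, as described above, is often done by averaging each model's class-probability vector and taking the argmax. A hedged sketch of that decision-level fusion follows; the probability vectors are invented, and the seventh label is truncated in the abstract, so "neutral" here is a placeholder assumption.

```python
# Sketch of late (decision-level) fusion across several CNN models:
# average the per-class probability vectors, then take the argmax.
# The abstract's seventh class is truncated; "neutral" is an assumption.
LABELS = ["fear", "disgust", "anger", "surprise", "sadness", "happiness", "neutral"]

def fuse(predictions):
    """predictions: list of per-model probability vectors over LABELS."""
    n = len(predictions)
    avg = [sum(p[i] for p in predictions) / n for i in range(len(LABELS))]
    return LABELS[avg.index(max(avg))]

model_outputs = [
    [0.05, 0.05, 0.10, 0.10, 0.10, 0.50, 0.10],  # CNN 1 (made-up scores)
    [0.10, 0.05, 0.05, 0.20, 0.10, 0.40, 0.10],  # CNN 2
    [0.05, 0.10, 0.10, 0.15, 0.05, 0.45, 0.10],  # CNN 3
]
print(fuse(model_outputs))  # happiness
```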

  • Open Access

    ARTICLE

    Performance Analysis of a Chunk-Based Speech Emotion Recognition Model Using RNN

    Hyun-Sam Shin1, Jun-Ki Hong2,*

    Intelligent Automation & Soft Computing, Vol.36, No.1, pp. 235-248, 2023, DOI:10.32604/iasc.2023.033082

    Abstract Recently, artificial-intelligence-based automatic customer response systems have been widely used in place of customer service representatives, so it is important for automatic customer service to promptly recognize emotions in a customer’s voice and provide the appropriate service accordingly. We therefore analyzed emotion recognition (ER) accuracy as a function of the simulation time using the proposed chunk-based speech ER (CSER) model. The proposed CSER model divides voice signals into 3-s long chunks to efficiently recognize the emotions inherent in the customer’s voice. We evaluated the ER performance on voice-signal chunks by applying four RNN techniques—long…
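The chunking step described above — splitting a voice signal into fixed 3-second segments — can be sketched in a few lines. The sample rate is an assumption (the abstract does not state one), and the signal here is a silent stand-in.

```python
# Minimal sketch of the CSER chunking step: split a sampled voice
# signal into 3-second chunks. SAMPLE_RATE is an assumption; the
# abstract only specifies the 3-s chunk length.
SAMPLE_RATE = 16_000
CHUNK_SECONDS = 3
CHUNK_LEN = SAMPLE_RATE * CHUNK_SECONDS

def chunk_signal(samples):
    """Split into fixed 3-s chunks; drop any trailing partial chunk."""
    return [samples[i:i + CHUNK_LEN]
            for i in range(0, len(samples) - CHUNK_LEN + 1, CHUNK_LEN)]

signal = [0.0] * (SAMPLE_RATE * 10)   # 10 s of "audio" as a stand-in
chunks = chunk_signal(signal)
print(len(chunks))                    # 3 full chunks; the last 1 s is dropped
```

Each chunk would then be fed to the RNN classifier; how the model handles the trailing partial second is not specified in the abstract, so it is simply dropped here.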

  • Open Access

    ARTICLE

    The Efficacy of Deep Learning-Based Mixed Model for Speech Emotion Recognition

    Mohammad Amaz Uddin1, Mohammad Salah Uddin Chowdury1, Mayeen Uddin Khandaker2,*, Nissren Tamam3, Abdelmoneim Sulieman4

    CMC-Computers, Materials & Continua, Vol.74, No.1, pp. 1709-1722, 2023, DOI:10.32604/cmc.2023.031177

    Abstract Human speech indirectly represents the mental state or emotion of the speaker. The use of Artificial Intelligence (AI)-based techniques may bring a revolution in this modern era by recognizing emotion from speech. In this study, we introduce a robust method for emotion recognition from human speech using an effective preprocessing technique together with a deep learning-based mixed model consisting of a Long Short-Term Memory (LSTM) network and a Convolutional Neural Network (CNN). About 2800 audio files were extracted from the Toronto emotional speech set (TESS) database for this study. A high-pass filter and a Savitzky-Golay filter have been used to obtain noise-free as well as…

  • Open Access

    ARTICLE

    Multilayer Neural Network Based Speech Emotion Recognition for Smart Assistance

    Sandeep Kumar1, MohdAnul Haq2, Arpit Jain3, C. Andy Jason4, Nageswara Rao Moparthi1, Nitin Mittal5, Zamil S. Alzamil2,*

    CMC-Computers, Materials & Continua, Vol.74, No.1, pp. 1523-1540, 2023, DOI:10.32604/cmc.2023.028631

    Abstract Biometric-based systems play an increasingly vital role in our daily lives. This paper proposes an intelligent assistant intended to identify emotions via voice messages. A biometric system has been developed to detect human emotions based on voice recognition and to control a few electronic peripherals for alert actions. The proposed smart assistant aims to support people through buzzer and light-emitting diode (LED) alert signals, and it also keeps track of places such as households, hospitals, and remote areas. The proposed approach is able to detect seven emotions: worry, surprise, neutral, sadness, happiness, hate…

  • Open Access

    ARTICLE

    Emotions, Perceptions and Health Behaviors of Adult Congenital Heart Disease Patients during COVID-19 in New York City

    Jodi L. Feinberg1, Peter Sheng2, Stephanie Pena2, Adam J. Small1, Susanna Wendelboe1, Katlyn Nemani3, Vikram Agrawal4, Dan G. Halpern1,*

    Congenital Heart Disease, Vol.17, No.5, pp. 519-531, 2022, DOI:10.32604/chd.2022.024174

    Abstract Background: Adults with congenital heart disease (ACHD) have an increased prevalence of mood and anxiety disorders. There are limited data regarding the influence of the COVID-19 pandemic on the mental health and health behaviors of these patients. Objective: The purpose is to evaluate the perceptions, emotions, and health behaviors of ACHD patients during the COVID-19 pandemic. Methods: In this cross-sectional study of ACHD patients, we administered surveys evaluating self-reported emotions, perceptions, and health behaviors. Logistic regressions were performed to determine the adjusted odds of displaying each perception, emotion, and health behavior based on predictor variables. Results: Ninety-seven patients (mean age 38.3…

  • Open Access

    ARTICLE

    Challenge-Response Emotion Authentication Algorithm Using Modified Horizontal Deep Learning

    Mohamed Ezz1, Ayman Mohamed Mostafa1,*, Ayman Elshenawy2,3

    Intelligent Automation & Soft Computing, Vol.35, No.3, pp. 3659-3675, 2023, DOI:10.32604/iasc.2023.031561

    Abstract Face authentication is an important biometric authentication method commonly used in security applications. It is vulnerable to spoofing attacks that use authorized users’ facial images and videos captured from social media, together with dynamic movements, to penetrate security applications. This paper presents an innovative challenge-response emotion authentication model based on the horizontal ensemble technique. The proposed model provides a highly accurate face authentication process by challenging the authorized user with a random sequence of emotions, requiring a specific response for every authentication trial with a different sequence of emotions. The proposed model is applied to…

  • Open Access

    ARTICLE

    Suicide Ideation Detection of Covid Patients Using Machine Learning Algorithm

    R. Punithavathi1,*, S. Thenmozhi2, R. Jothilakshmi3, V. Ellappan4, Islam Md Tahzib Ul5

    Computer Systems Science and Engineering, Vol.45, No.1, pp. 247-261, 2023, DOI:10.32604/csse.2023.025972

    Abstract During the COVID-19 pandemic, many individuals around the world suffered from suicidal ideation. Social distancing and quarantine affect patients emotionally. Affective computing is the study of recognizing human feelings and emotions. This technology can be used effectively during a pandemic for facial expression recognition, which automatically extracts features from the human face. A monitoring system plays a very important role in detecting a patient’s condition and recognizing patterns of expression from a safe distance. In this paper, a new method is proposed for emotion recognition and suicide ideation detection in COVID patients. This helps to alert the nurse,…

  • Open Access

    ARTICLE

    Human Emotions Classification Using EEG via Audiovisual Stimuli and AI

    Abdullah A Asiri1, Akhtar Badshah2, Fazal Muhammad3,*, Hassan A Alshamrani1, Khalil Ullah4, Khalaf A Alshamrani1, Samar Alqhtani5, Muhammad Irfan6, Hanan Talal Halawani7, Khlood M Mehdar8

    CMC-Computers, Materials & Continua, Vol.73, No.3, pp. 5075-5089, 2022, DOI:10.32604/cmc.2022.031156

    Abstract Electroencephalography (EEG) is a medical imaging technology that measures the electrical activity produced by the brain, recorded chronologically from the surface of the scalp. The recorded signals are rich in useful information, but inferring that information is a challenging task. This paper aims to process EEG signals for the recognition of human emotions (specifically happiness, anger, fear, sadness, and surprise) in response to audiovisual stimuli. The EEG signals are recorded by placing a NeuroSky MindWave headset on the subject’s scalp in response to audiovisual stimuli for the…

Displaying results 21-30 of 87 (page 3).