NUST Institutional Repository

Deep Learning based Emotion Charting for Healthy and Cognitively Impaired Subjects using Physiological Signals


dc.contributor.author Dar, Muhammad Najam
dc.date.accessioned 2023-06-24T05:22:22Z
dc.date.available 2023-06-24T05:22:22Z
dc.date.issued 2015
dc.identifier.other NUST201590342PCEME0815F
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/34203
dc.description Supervisor: Dr. M. Usman Akram Co-Supervisor: Dr. Sajid Gul Khawaja en_US
dc.description.abstract Parkinson’s Disease (PD) is the second most common neurodegenerative disorder, resulting in cognitive impairments in emotion recognition. The deficit of emotional expression poses challenges to the healthcare services provided to PD patients and to their quality of life. Emotion charting for cognitively impaired patients is more challenging than for healthy subjects. Continuous monitoring of cognitively impaired patients with physiological signals such as the Electroencephalogram (EEG), Electrocardiogram (ECG), and Galvanic Skin Response (GSR) provides indicators of the physiological health of these patients. Novel research trends incorporate these physiological signals because they reflect actual, intrinsic emotional states, resulting in more reliable, natural, and meaningful human-computer interaction with applications in entertainment consumption behavior, interactive brain-computer interfaces, and monitoring of the psychological health of patients. Young adults and children commonly use technology for human-computer interaction and entertainment consumption. The main challenges in this domain are low emotion recognition performance for PD patients due to the loss of dopaminergic neurons, low performance for memory-induced emotions due to weaker signals and loss of concentration, and the lack of datasets for children. Previous research has not directly explored the one-dimensional convolutional recurrent neural network deep learning model, which is suitable for the long, continuous, and repetitive patterns of EEG, ECG, and GSR, for emotion charting of cognitively impaired patients and memory-induced emotion recognition. Other challenges in real-world applications include reduced performance with an increased number of emotion classes, wearable acquisition sensors, and experimental settings such as the age group and the emotional stimuli provided to the subjects. Similarly, despite the efficacy of the 1D-CRNN and the ELM for physiological signal data, the combination of the two has not been explored in the literature. This thesis addresses these challenges by proposing a novel 1D-CRNN-ELM architecture, which combines a one-dimensional Convolutional Recurrent Neural Network (1D-CRNN) with an Extreme Learning Machine (ELM); the architecture is robust for emotion detection in PD patients and also supports cross-dataset learning across various emotions and experimental settings. In the proposed framework, the preprocessing of physiological signals involves baseline removal, band-pass filtering, and Z-score normalization. After preprocessing, a 1D-CRNN architecture with three 1D-CNN layers (16 filters of size 1x8 each), followed by an LSTM layer, is trained on the preprocessed physiological signals. The trained 1D-CRNN is then used as the feature extractor for the physiological signals. The extracted deep features are passed through an extreme learning machine classifier to classify emotions in both a categorical model (fear, happiness, sadness, disgust, anger, and surprise) and a dimensional model (the four quadrants of high valence high arousal (HVHA), high valence low arousal (HVLA), low valence high arousal (LVHA), and low valence low arousal (LVLA)). This research also explores fine-tuning for cross-dataset learning of emotions between the Parkinson’s disease patients dataset and publicly available datasets of healthy subjects. This research contributes a novel, robust, and generic framework for emotion recognition in both healthy and cognitively impaired subjects.
The proposed framework outperforms the recognition performance of existing techniques on the publicly available AMIGOS, DREAMER, and SEED-IV datasets and on the PD patients dataset, and provides benchmark baseline results for the memory-induced and children's datasets. It improves recognition performance compared to the state of the art for both categorical and dimensional models of emotion charting, and the subject-independent study with wireless sensors is suitable for less-constrained real-world environments. It also incorporates the less-explored ECG and GSR signals for less invasive, low-cost, wearable emotion recognition with multimodal fusion at the decision level. This research provides an original attempt to explore a deep learning model for PD patients and a novel dataset of self-induced emotions based on autobiographical memories evoked by relevant words. Emotions induced through external stimuli are often stronger than the emotional experiences felt in daily routines, whereas self-induced emotions through memory recall are the most common experiences in real-world scenarios for continuous emotion charting; this work contributes a novel dataset of evoked memory recalls titled MEMO. This research also developed a novel multimodal dataset of children and young adults (YAAD), released publicly with benchmark results, for emotion charting with physiological signals. The proposed method outperforms state-of-the-art studies in classifying emotions on publicly available datasets, provides cross-dataset learning, and validates the robustness of the deep learning framework for real-world scenarios such as evoked memory recalls and psychological healthcare monitoring of Parkinson’s patients. en_US
dc.language.iso en en_US
dc.publisher College of Electrical and Mechanical Engineering (CEME), NUST en_US
dc.subject Deep Learning based Emotion Charting for Healthy and Cognitively Impaired Subjects using Physiological Signals en_US
dc.title Deep Learning based Emotion Charting for Healthy and Cognitively Impaired Subjects using Physiological Signals en_US
dc.type Thesis en_US
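
The abstract above describes a concrete signal pipeline: band-pass filtering and Z-score normalization, followed by a 1D-CRNN feature extractor built from three 1D-CNN layers (16 filters of size 1x8 each) and an LSTM. The Python sketch below is only an illustration of that stage under stated assumptions; the sampling rate, filter band, channel count, and LSTM hidden size are not given in the record and are assumed here.

    # Illustrative sketch (not the thesis code): preprocessing plus the 1D-CRNN
    # feature extractor described in the abstract. Only the layer count, filter
    # number (16), and kernel width (8) come from the abstract; sampling rate,
    # filter band, and LSTM hidden size are assumed values.
    import torch
    import torch.nn as nn
    from scipy.signal import butter, filtfilt


    def preprocess(x, fs=128.0, band=(0.5, 45.0)):
        """Band-pass filter one physiological channel and Z-score normalize it."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        y = filtfilt(b, a, x)
        return (y - y.mean()) / (y.std() + 1e-8)


    class CRNN1D(nn.Module):
        """Three 1D-CNN layers (16 filters, kernel width 8) followed by an LSTM."""

        def __init__(self, in_channels=1, hidden=64):
            super().__init__()
            self.conv = nn.Sequential(
                nn.Conv1d(in_channels, 16, kernel_size=8, padding=4), nn.ReLU(),
                nn.Conv1d(16, 16, kernel_size=8, padding=4), nn.ReLU(),
                nn.Conv1d(16, 16, kernel_size=8, padding=4), nn.ReLU(),
            )
            self.lstm = nn.LSTM(input_size=16, hidden_size=hidden, batch_first=True)

        def forward(self, x):              # x: (batch, channels, time)
            f = self.conv(x)               # -> (batch, 16, time')
            f = f.transpose(1, 2)          # -> (batch, time', 16)
            _, (h_n, _) = self.lstm(f)
            return h_n[-1]                 # deep feature vector, (batch, hidden)

The last LSTM hidden state stands in for the "deep features" mentioned in the abstract; signal windowing and baseline removal are left out of this sketch for brevity.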
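The abstract then routes the extracted deep features through an Extreme Learning Machine and fuses the EEG, ECG, and GSR modalities at the decision level. The sketch below shows a generic single-hidden-layer ELM and a majority-vote fusion rule; the hidden size, tanh activation, and the specific fusion rule are assumptions, since the record does not state them.

    # Illustrative sketch: a generic extreme learning machine classifier and a
    # decision-level fusion step. Hidden size, activation, and the majority-vote
    # rule are assumptions not specified in the record.
    import numpy as np


    class ELM:
        """Single-hidden-layer extreme learning machine trained in closed form."""

        def __init__(self, n_hidden=512, seed=0):
            self.n_hidden = n_hidden
            self.rng = np.random.default_rng(seed)

        def fit(self, X, y_onehot):
            # Random input weights stay fixed; only the output weights are solved
            # by least squares, which is what makes ELM training fast.
            self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
            H = np.tanh(X @ self.W)
            self.beta = np.linalg.pinv(H) @ y_onehot
            return self

        def predict(self, X):
            return np.argmax(np.tanh(X @ self.W) @ self.beta, axis=1)


    def fuse_decisions(*per_modality_labels):
        """Majority vote across per-modality predictions (e.g. EEG, ECG, GSR)."""
        stacked = np.stack(per_modality_labels)          # (n_modalities, n_samples)
        return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, stacked)

In this sketch each modality would get its own CRNN1D extractor and ELM, and fuse_decisions picks, for each sample, the emotion label predicted by the majority of modalities.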

