NUST Institutional Repository

Deep Emotion Recognition by Analyzing Inertial Data of Human Gait

Show simple item record

dc.contributor.author Hamza Ali Imran
dc.date.accessioned 2021-12-31T05:40:21Z
dc.date.available 2021-12-31T05:40:21Z
dc.date.issued 2018
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/28250
dc.description.abstract The Internet of Things (IoT) anticipates a future in which commonplace products are embedded with sensors. Sensors can sense both living environments and the human body, enabling easier interaction with the physical world. Inertial sensors in particular are found in devices such as smartphones, smartwatches, and fitness bands. There is a need for deep learning models that can classify data from the inertial sensors embedded in smart wearables across different applications. In this thesis, we present an Inception-ResNet-inspired model that can be tuned to work for various applications. We began by experimenting with emotion recognition before fine-tuning the model for human activity recognition. The proposed model is evaluated with five different input features ranging from 1D to 6D: the magnitude of the 3D accelerations, |a^w|; the magnitude of the 3D angular velocities, |ω^w|; both 3D accelerations and 3D angular velocities, (a^w_x, a^w_y, a^w_z, ω^w_x, ω^w_y, ω^w_z); the 3D accelerations alone, (a^w_x, a^w_y, a^w_z); and the 3D angular velocities alone, (ω^w_x, ω^w_y, ω^w_z). Emotions are an integral part of our daily lives; if computers can sense emotions, they will be able to interact more effectively and humanely. We used the SEECS Emotion Recognition Dataset for this work, which contains human gait data for six distinct emotions: 'happy', 'fear', 'sad', 'disgust', 'anger', and 'surprise'. Human activity recognition (HAR) is another popular topic in the IoT paradigm; its applications include fitness tracking, entertainment, childcare, security, driver behavior monitoring, ambient assisted living, and others. For HAR we used two datasets from the Wireless Sensor Data Mining lab: WISDM 2011 and WISDM 2019. For emotion recognition across all six classes, we achieved 95.23% accuracy with |a^w| and 95.01% accuracy with (a^w_x, a^w_y, a^w_z, ω^w_x, ω^w_y, ω^w_z).
For human activity recognition, we achieved 97.29% accuracy for all six activities in the WISDM 2011 dataset with |a^w| and 98.81% accuracy with (a^w_x, a^w_y, a^w_z, ω^w_x, ω^w_y, ω^w_z). For all 18 activities in WISDM 2019, we achieved 97.5% accuracy with |a^w| and 98.4% accuracy with (a^w_x, a^w_y, a^w_z, ω^w_x, ω^w_y, ω^w_z) using only the smartwatch data. We outperformed the state-of-the-art. en_US
dc.description.sponsorship Dr. Qaiser Riaz en_US
dc.language.iso en en_US
dc.publisher SEECS, National University of Science and Technology, Islamabad. en_US
dc.subject MSCS SEECS 2018 en_US
dc.subject Keywords: Emotion Recognition, Gait Analysis, Human Motion Analysis, Smartwatch based activity recognition, Human Activity Recognition (HAR), Deep neural network, Pervasive Computing, Ubiquitous computing, Inertial Sensors Signal Processing, Inertial Sensors, Internet of Things, Digital Signal Processing en_US
dc.title Deep Emotion Recognition by Analyzing Inertial Data of Human Gait en_US
dc.type Thesis en_US
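
The five input features described in the abstract can be illustrated with a minimal NumPy sketch (an assumption for illustration only, not the thesis code; the array names and window length are hypothetical): the 1D magnitude features are the Euclidean norms of the 3-axis accelerometer and gyroscope readings, and the 6D feature concatenates both raw triaxial signals.

```python
import numpy as np

# Hypothetical window of inertial samples: 128 timesteps of 3-axis
# accelerometer (a) and 3-axis gyroscope (omega) readings.
rng = np.random.default_rng(0)
a = rng.normal(size=(128, 3))      # columns: a^w_x, a^w_y, a^w_z
omega = rng.normal(size=(128, 3))  # columns: w^w_x, w^w_y, w^w_z

# 1D feature: magnitude of the 3D acceleration, |a^w|
mag_a = np.linalg.norm(a, axis=1)          # shape (128,)

# 1D feature: magnitude of the 3D angular velocity, |w^w|
mag_w = np.linalg.norm(omega, axis=1)      # shape (128,)

# 6D feature: concatenated accelerations and angular velocities,
# (a^w_x, a^w_y, a^w_z, w^w_x, w^w_y, w^w_z)
feat_6d = np.concatenate([a, omega], axis=1)  # shape (128, 6)

# The 3D features are simply `a` and `omega` used on their own.
```

Using the magnitude rather than the raw triaxial signal makes the 1D features invariant to sensor orientation, which is one common motivation for this choice in wearable-sensor pipelines.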


This item appears in the following Collection(s)

  • MS [379]

