NUST Institutional Repository

Deep Human Activity Recognition for the Internet of Health Things


dc.contributor.author Imran, Nimra
dc.date.accessioned 2023-08-11T11:04:03Z
dc.date.available 2023-08-11T11:04:03Z
dc.date.issued 2023
dc.identifier.other 319313
dc.identifier.uri http://10.250.8.41:8080/xmlui/handle/123456789/36345
dc.description Supervisor: Dr. Qaiser Riaz en_US
dc.description.abstract Human Activity Recognition (HAR) refers to a machine's ability to determine various user activities. The attributes discovered through these recognition procedures are integrated into applications that the corresponding machine uses to detect movements and motion and, in response, execute precisely defined tasks. Our research employs a unique collection of HAR data gathered from ninety people aged 18 to 34: the KU-HAR dataset, which encompasses 1,945 raw activity samples from 18 classes, along with 20,750 sub-samples extracted from them, each containing 3 seconds of related activity data. For this research, we present an InceptionResNet-inspired model that can be adjusted to work with various datasets. The proposed model has been tested with five additional input features ranging from 1-dimensional to 6-dimensional. The symbol magacc denotes the magnitude of the 3D accelerations, and magav denotes the magnitude of the 3D angular velocities. The 6-dimensional feature (acc_x^w, acc_y^w, acc_z^w, av_x^w, av_y^w, av_z^w) combines the 3D accelerations (acc_x^w, acc_y^w, acc_z^w) with the 3D angular velocities (av_x^w, av_y^w, av_z^w). We have achieved an accuracy of 99% using the KU-HAR 2019 dataset and have surpassed existing approaches in the HAR domain. en_US
dc.language.iso en en_US
dc.publisher School of Electrical Engineering and Computer Sciences (SEECS), NUST en_US
dc.title Deep Human Activity Recognition for the Internet of Health Things en_US
dc.type Thesis en_US
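The magnitude features described in the abstract (magacc and magav, derived from the 3D accelerations and 3D angular velocities) can be sketched as follows. This is a minimal NumPy illustration, not the thesis code; the function name, array names, and shapes are assumptions.

```python
import numpy as np

def magnitude_features(acc: np.ndarray, av: np.ndarray) -> np.ndarray:
    """Return an (n_samples, 2) array with magacc and magav columns.

    acc, av: arrays of shape (n_samples, 3) holding 3D accelerations
    and 3D angular velocities, respectively (names are illustrative).
    """
    magacc = np.linalg.norm(acc, axis=1)  # |a| = sqrt(ax^2 + ay^2 + az^2)
    magav = np.linalg.norm(av, axis=1)    # |w| = sqrt(wx^2 + wy^2 + wz^2)
    return np.column_stack([magacc, magav])

# Example: a single sample with acc = (3, 4, 0) gives magacc = 5
feats = magnitude_features(np.array([[3.0, 4.0, 0.0]]),
                           np.array([[0.0, 0.0, 1.0]]))
```

Together with the raw 6-dimensional signal, these 1-dimensional magnitudes give the kind of additional input features the abstract mentions.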



This item appears in the following Collection(s)

  • MS [432]

