dc.description.abstract |
Emotion recognition is an active area of research in the domain of human-computer
interaction. Computers can interact better with humans if they can explicitly
understand human emotions. Most of the existing state-of-the-art methodologies
rely heavily on facial expression analysis, which has its own drawbacks. In this
research work, we present a set of hand-crafted features computed from the
inertial data of a human's natural walk, which can be used to predict human
emotions with high accuracy. We recorded the 6D inertial data (accelerations and
angular velocities) of 40 subjects using the on-board IMU of a smartphone
attached at the chest. The subjects were asked to walk in their natural gait
while assuming themselves to be in one of the six basic emotional states, i.e.,
happiness, sadness, anger, fear, disgust and surprise. The raw inertial signals
were segmented into sequences of steps and strides, and a set of hand-crafted
features from the time, frequency and wavelet domains was computed for each step
and stride. These features were used to train two classifiers, namely Random
Forest and Support Vector Machines. Using a 10-fold cross-validation strategy on
the stride feature set, a classification accuracy of 95% was achieved with two
classes of emotions and 86% with six classes of emotions. The results, computed
on different sets of features, show that the proposed set of hand-crafted
features can recognize emotions reliably and accurately. |
en_US |
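
To make the pipeline described in the abstract concrete, the Python sketch below
shows how features of this kind could be extracted from a segmented stride and
evaluated with Random Forest and SVM under 10-fold cross-validation. It is an
illustrative assumption, not the authors' implementation: the specific statistics,
the wavelet choice ('db4', 3 levels), the channel layout, the classifier settings,
and the names `strides`/`labels` (assumed outputs of the segmentation stage) are
all placeholders, and the libraries (NumPy, SciPy, PyWavelets, scikit-learn) are
assumed rather than taken from the paper.

    # Illustrative sketch only (assumptions, not the authors' code): simple
    # time-, frequency- and wavelet-domain features per stride, evaluated
    # with Random Forest and SVM under 10-fold cross-validation.
    import numpy as np
    import pywt                      # PyWavelets, for wavelet-domain features
    from scipy.fft import rfft
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def stride_features(stride):
        """stride: array of shape (n_samples, 6) -- 3-axis accel + 3-axis gyro."""
        feats = []
        for channel in stride.T:
            # Time domain: basic summary statistics.
            feats += [channel.mean(), channel.std(), channel.min(), channel.max()]
            # Frequency domain: magnitude of the dominant non-DC component.
            feats.append(np.abs(rfft(channel))[1:].max())
            # Wavelet domain: energy of detail coefficients (db4, 3 levels -- assumed).
            coeffs = pywt.wavedec(channel, 'db4', level=3)
            feats += [float(np.sum(c ** 2)) for c in coeffs[1:]]
        return np.array(feats)

    # 'strides' and 'labels' are assumed to come from the step/stride segmentation.
    # X = np.vstack([stride_features(s) for s in strides])
    # y = np.array(labels)
    # for clf in (RandomForestClassifier(n_estimators=100), SVC(kernel='rbf')):
    #     print(type(clf).__name__, cross_val_score(clf, X, y, cv=10).mean())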