Abstract:
In the field of healthcare and wellness, the classification of Activities of Daily Living (ADL) and exercises using smartphone sensor data has emerged as a pivotal research area. The significance of ADL and exercise recognition lies in integrating smartphone built-in sensor data with advanced machine learning methods. This research presents a comprehensive approach in which data are collected through smartphone internal sensors (accelerometer and gyroscope) positioned at various locations such as the bag, belly, hand, and thigh. The dataset encompasses regular postures as well as transitions such as lateral movements, accidental touches, vehicle ingress and egress, and bending positions. Utilizing deep learning algorithms, particularly Long Short-Term Memory (LSTM) networks, alongside other machine learning classifiers, the study aims to classify each ADL class with the highest possible accuracy. The proposed methodology achieved 92.8% accuracy with the LSTM model across diverse ADL categories. The study also explores the broader implications of ADL recognition, highlighting the potential for personalized health interventions and the promotion of active lifestyles.
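To make the described pipeline concrete, the following is a minimal sketch of an LSTM classifier over windowed accelerometer and gyroscope readings. The window length, number of classes, layer sizes, and the use of Keras are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of an LSTM-based ADL classifier (assumed hyperparameters).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # assumed samples per sensor window
N_FEATURES = 6     # 3-axis accelerometer + 3-axis gyroscope
N_CLASSES = 8      # assumed number of ADL classes

model = models.Sequential([
    layers.Input(shape=(WINDOW_LEN, N_FEATURES)),
    layers.LSTM(64),                              # recurrent layer over the sensor window
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy data standing in for windowed smartphone sensor readings.
X = np.random.randn(256, WINDOW_LEN, N_FEATURES).astype("float32")
y = np.random.randint(0, N_CLASSES, size=256)
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
```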