Abstract:
Our final year project, DriveCare, aims to improve road safety by helping drivers stay
focused on the road, thereby preventing accidents caused by distracted driving. To achieve
this, we combine facial expression recognition with detection of physiological changes to
determine the driver's emotional state and whether they are attentive to driving. If the
driver is detected as distracted, an alarm is played to alert them to become more attentive.
Facial expression recognition is achieved by capturing digital images or video of the driver,
which are sent to a hardware system built around a Raspberry Pi module running machine
learning and computer vision algorithms. The system analyzes the image or video stream to
detect the driver's emotions.
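As an illustration of this stage, the following is a minimal sketch of how such a camera-based
pipeline could look on the Raspberry Pi, assuming OpenCV for capture and face detection and a
hypothetical pre-trained Keras classifier (emotion_model.h5); the actual DriveCare models,
emotion labels, and alert logic may differ.

    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

    def trigger_alarm():
        # Placeholder for the actual alert (e.g. a buzzer on a GPIO pin or an audio clip).
        print("ALERT: driver appears distracted")

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    emotion_model = load_model("emotion_model.h5")  # hypothetical pre-trained classifier

    cap = cv2.VideoCapture(0)  # Pi camera or USB webcam facing the driver
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            trigger_alarm()          # no face in view: driver may be looking away
            continue
        x, y, w, h = faces[0]
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = emotion_model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
        if EMOTIONS[int(np.argmax(probs))] in ("angry", "fear", "sad"):
            trigger_alarm()          # emotions treated here as risk indicators
    cap.release()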
Similarly, a physiological parameter, heart rate, is obtained from ECG signals, which are
processed to estimate the driver's emotional state. Whenever either channel indicates
distraction, the alarm is triggered to prompt the driver to refocus on driving.
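Likewise, the heart-rate extraction step can be sketched as below, assuming the ECG has
already been digitized into a NumPy array at a known sampling rate (for example from an
AD8232 front end read through an ADC); the R-peak spacing and threshold used here are
illustrative only.

    import numpy as np
    from scipy.signal import find_peaks

    def heart_rate_bpm(ecg, fs):
        # Estimate heart rate in beats per minute from an ECG segment sampled at fs Hz.
        # R-peaks are the dominant deflections; enforce a minimum spacing of 0.4 s
        # (a 150 bpm ceiling) and a height above the signal's upper quartile.
        peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.quantile(ecg, 0.75))
        if len(peaks) < 2:
            return 0.0
        rr = np.diff(peaks) / fs            # seconds between successive beats
        return 60.0 / float(np.mean(rr))    # average bpm over the segment

    # Example: a synthetic 10 s trace with a sharp peak every ~0.86 s (about 70 bpm).
    fs = 250
    t = np.arange(0, 10, 1 / fs)
    ecg = np.where(t % (60 / 70) < 1 / fs, 1.0, 0.0)
    print(round(heart_rate_bpm(ecg, fs)))   # prints roughly 70

In the full system, these per-segment heart-rate readings would feed the same distraction
logic as the camera pipeline.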