A driver's ability to cope with both the positive and negative demands of driving is influenced by their emotions. Facial expression recognition technology is utilised to track the driver's emotions by detecting the person's face and recognising their facial expressions.
Facial expression analysis is improving human-machine interaction for defensive driving and road safety. The technique, known as facial expression recognition (FER), is used in advanced driver assistance systems (ADAS) to detect driver emotions.
The Global Driver Emotion Recognition System market accounted for $XX Billion in 2022 and is anticipated to reach $XX Billion by 2030, registering a CAGR of XX% from 2023 to 2030.
For the driver's real emotion recognizer (DRER), an automatic system is developed using deep learning. Building on previous studies of driver facial expression design for intelligent products, the emotional values of drivers inside the vehicle are symmetrically mapped to image designs in order to investigate the characteristics of abstract expressions and the principles of expression design.
To identify the driver's varied emotions, transfer learning is applied to the NASNet-Large CNN model. This research project also includes the creation of a unique driver emotion recognition image dataset. It is typical for such subtle alterations in expression to occur as a result of the suppression of true emotions, whether deliberate or accidental.
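The transfer-learning step described above can be sketched as follows. This is a minimal illustration of the idea only, not the authors' actual pipeline: the pretrained backbone (NASNet-Large in the study) is stood in for by a fixed random projection, and the dataset, class count, and hyperparameters are assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 4      # assumed driver-emotion labels, e.g. neutral/happy/angry/surprised
FEATURE_DIM = 32   # backbone output size (NASNet-Large would actually give 4032)
INPUT_DIM = 64     # flattened image stand-in

# "Pretrained" backbone weights: frozen, never updated during training.
W_backbone = rng.normal(size=(INPUT_DIM, FEATURE_DIM))

def extract_features(x):
    """Frozen feature extractor playing the role of the pretrained CNN."""
    return np.maximum(x @ W_backbone, 0.0)  # ReLU features

# Toy labelled dataset standing in for the driver-emotion images.
X = rng.normal(size=(200, INPUT_DIM))
y = rng.integers(0, N_CLASSES, size=200)

# Transfer learning: only this small classification head is trained.
W_head = np.zeros((FEATURE_DIM, N_CLASSES))
feats = extract_features(X)

for _ in range(300):  # simple full-batch gradient descent on cross-entropy
    logits = feats @ W_head
    logits -= logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    probs[np.arange(len(y)), y] -= 1.0      # gradient of softmax cross-entropy
    W_head -= 0.01 * feats.T @ probs / len(y)

train_acc = np.mean((feats @ W_head).argmax(axis=1) == y)
print(f"head-only training accuracy: {train_acc:.2f}")
```

The design point the sketch captures is that the expensive feature extractor stays frozen, so only a small head must be fitted to the new (and, as the article notes, scarce) driver-emotion data.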
Several studies focusing on facial micro-expressions have produced promising techniques for uncovering concealed emotions. Drivers' true emotions can be identified by analysing their micro-expressions in combination with deep learning-based algorithms.
The main obstacles to widespread adoption, however, are a shortage of samples and an imbalanced distribution of those that are available. Ultimately, the goal is to identify drivers who are expressing their emotions genuinely rather than those who have suppressed or concealed them. A significant advance in this area is the use of physiological cues to pinpoint human emotions.
The physiological signals most frequently employed in clinical practice are the electroencephalogram (EEG), electrocardiogram (ECG), photoplethysmogram (PPG), and electrodermal activity (also called electrical skin activity, ESA).
According to a few studies, a combination of physiological signals and facial expressions can be utilised to classify a variety of emotional states accurately.
Building on these trends, this research proposes a deep learning-based DRER that uses sensor fusion of physiological data and driver FER to identify the driver's genuine emotional state while the vehicle is moving.
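The sensor-fusion idea can be sketched as a feature-level (early) fusion: feature vectors from the facial-expression branch and from the physiological-signal branch are concatenated and passed to a single classifier. The dimensions, class labels, and nearest-centroid classifier below are illustrative assumptions standing in for the paper's deep network, not its actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in feature vectors for the two modalities (illustrative sizes).
face_feats = rng.normal(size=(100, 16))   # e.g. FER embedding per time window
physio_feats = rng.normal(size=(100, 8))  # e.g. EEG/ECG/PPG/EDA summary statistics
labels = rng.integers(0, 3, size=100)     # assumed states, e.g. calm/stressed/angry

# Early (feature-level) fusion: concatenate modalities into one vector.
fused = np.concatenate([face_feats, physio_feats], axis=1)

# Nearest-centroid classifier on the fused features (stand-in for the DNN head).
centroids = np.stack([fused[labels == c].mean(axis=0) for c in range(3)])

def predict(x):
    """Assign each fused feature vector to the closest class centroid."""
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

acc = np.mean(predict(fused) == labels)
print(f"fused-feature training accuracy: {acc:.2f}")
```

The point of fusing at the feature level is that physiological signals can contradict a deliberately controlled facial expression, so the classifier sees both sources before a decision is made, rather than averaging two separate per-modality verdicts afterwards.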