Bibliography
The automotive domain is currently experiencing a radical transition towards automation, connectivity, and digitalization, which is driving major changes in human-machine interaction. The research presented here examines (1) company visions of future mobility and (2) users' reactions to the first trials of these visions. The data analyses reveal that implementing the companies' visions for 2040 requires improvements in user acceptance. One way of improving user acceptance is to integrate emotion recognition into manual and automated vehicles. By reacting to users' positive and negative emotions, vehicles can learn to improve their driving behavior and communication and to adjust driver assistance accordingly. Therefore, a roadmap for future research in emotion recognition was developed through interviews with twelve experts in the field. The emotions they judged most relevant to detect include anger, stress, and fear, amongst others. Furthermore, ideas on sensors for emotion recognition, potential countermeasures for the negative effects of emotions, and additional challenges were collected. The research presented is intended to shape further research directions in in-car emotion recognition.
Autonomous, shared, and electric: this is the vision for future transport services that enable both efficient and climate-friendly mobility. The success of such services will crucially depend on their actual use by the population, which is in turn determined by perceptions of their usefulness, ease of use, safety, and attractiveness. These new features also entail new challenges for users. The authors present methods to identify user needs and potential barriers to use early in the process of designing autonomous vehicle systems for public transport, and give examples of user-centered research methods that can be used to incorporate user needs into the development of advanced public transport systems.
Driving is a complex task that concurrently draws on multiple cognitive resources. Yet there is a lack of studies investigating brain-level interactions among different driving subtasks in dual-tasking. This study investigates how visuospatial attentional demands related to increased driving difficulty interact with different working memory load (WML) levels at the brain level. Using multichannel, whole-head, high-density functional near-infrared spectroscopy (fNIRS) brain activation measurements, we aimed to predict driving difficulty level, both separately for each WML level and with a combined model. Participants drove for approximately 60 min on a highway with concurrent traffic in a virtual reality driving simulator. For half of the time, the course led through a construction site with reduced lane width, increasing visuospatial attentional demands. Concurrently, participants performed a modified version of the n-back task with five WML levels (from 0-back up to 4-back), forcing them to continuously update, memorize, and recall the sequence of the previous 'n' speed signs and adjust their speed accordingly. Using multivariate logistic ridge regression, we were able to correctly predict driving difficulty in 75.0% of the signal samples (1.955 Hz sampling rate) across 15 participants in an out-of-sample cross-validation of classifiers trained on fNIRS data separately for each WML level. There was a significant effect of WML level on the driving difficulty prediction accuracies [range 62.2-87.1%; χ2(4) = 19.9, p < 0.001, Kruskal-Wallis H test], with the highest prediction rates at intermediate WML levels. In contrast, training one classifier on fNIRS data across all WML levels severely degraded prediction performance (mean accuracy of 46.8%). Activation changes in the bilateral dorsal frontal (putative BA46), bilateral inferior parietal (putative BA39), and left superior parietal (putative BA7) areas were most predictive of increased driving difficulty. These discriminative patterns diminished at higher WML levels, indicating that visuospatial attentional demands and WML involve interacting underlying brain processes. The changing pattern of driving-difficulty-related brain areas across WML levels could indicate changes in multitasking strategy with the level of WML demand, in line with multiple resource theory.
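The classification scheme described in this abstract can be sketched compactly: an L2-regularized (ridge) logistic regression predicting driving difficulty (normal lane vs. construction site) from per-sample fNIRS features, evaluated out-of-sample via cross-validation, with one classifier per WML level. The data shapes, channel count, and regularization strength below are illustrative assumptions, not the authors' dataset or settings.

```python
# Minimal sketch of per-WML-level driving-difficulty classification, assuming
# synthetic placeholder data in place of real fNIRS recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_channels = 2000, 100              # assumed: samples x fNIRS channels for one WML level
X = rng.normal(size=(n_samples, n_channels))   # placeholder for HbO/HbR activation features
y = rng.integers(0, 2, size=n_samples)         # 0 = normal lane, 1 = construction site

# penalty="l2" makes this a logistic *ridge* regression; C (the inverse penalty
# strength) is an assumed value, as the abstract does not report it.
clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)      # out-of-sample accuracy estimate
print(f"mean CV accuracy: {scores.mean():.3f}")
```

In the study, this procedure would be repeated once per WML level (0-back through 4-back); the abstract's finding is that such level-specific classifiers clearly outperform a single classifier trained across all levels.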
Mental workload is a popular concept in ergonomics, as it provides an intuitive explanation of why excessive cognitive task demands result in decreased task performance and an increased risk of fatal incidents while driving. At the same time, affective states such as frustration also play a role in traffic safety, as they increase the tendency toward speeding and aggressive driving and may even degrade cognitive processing capacities. To reduce accidents caused by the dangerous effects of degraded cognitive processing capacities and affective biases leading to human error, it is necessary to continuously assess multiple user states simultaneously in order to better understand potential interactions. In two previous studies, we measured brain activity with functional near-infrared spectroscopy (fNIRS) for separate brain-based prediction of working memory load (WML) (Unni et al., 2017) and frustration levels (Ihme et al., submitted) while driving. Here, we report results from a study designed to predict simultaneously manipulated WML and frustration levels using data-driven machine learning approaches applied to whole-head fNIRS brain activation measurements.
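The abstract does not specify how the two states are decoded jointly, so the following is only one plausible formulation: fit one classifier per user state (WML level, frustration) on the same fNIRS features and evaluate each output separately. A single classifier over joint WML-by-frustration labels would be equally compatible with the text. All data, label ranges, and model choices below are illustrative assumptions.

```python
# One plausible sketch of simultaneous WML and frustration prediction from shared
# fNIRS features, assuming synthetic placeholder data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1500, 100))               # assumed fNIRS feature matrix (samples x channels)
Y = np.column_stack([
    rng.integers(0, 3, size=1500),             # assumed: low/medium/high WML
    rng.integers(0, 2, size=1500),             # assumed: frustrated vs. non-frustrated
])

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=1)
# MultiOutputClassifier fits one independent logistic regression per state column.
model = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X_tr, Y_tr)
per_state_acc = (model.predict(X_te) == Y_te).mean(axis=0)
print(f"WML accuracy: {per_state_acc[0]:.3f}, frustration accuracy: {per_state_acc[1]:.3f}")
```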
Empathic vehicles are a promising concept for increasing the safety and acceptance of automated vehicles. However, on the way towards empathic vehicles, a great deal of research on automated emotion recognition is still necessary. Successful methods for detecting emotions need to be trained on realistic data that contain the target emotion and come from a setting close to the final application. At the moment, data sets fulfilling these requirements are lacking. Therefore, the goal of this work is to present an experimental paradigm that induces four different emotional states (neutral, positive, frustration, and mild anxiety) in a real-world driving setting using a combination of secondary tasks and conversation-based emotional recall. An evaluation of the paradigm using self-report data, annotation of speech data, and peripheral physiology indicates that the methods used to induce the target emotions were successful. Finally, based on the insights from the experiment, a list of recommendations for inducing emotions in real-world driving settings is given.
Experiencing frustration while driving can harm cognitive processing, result in aggressive behavior, and hence negatively influence driving performance and traffic safety. Being able to automatically detect frustration would allow adaptive driver assistance and automation systems to react adequately to a driver's frustration and mitigate potential negative consequences. To identify reliable and valid indicators of driver frustration, we conducted two driving simulator experiments. In the first experiment, we aimed to reveal facial expressions that indicate frustration in continuous video recordings of the driver's face, taken while driving highly realistic simulator scenarios in which frustrated or non-frustrated emotional states were experienced. An automated analysis of facial expressions combined with multivariate logistic regression classification revealed that frustrated time intervals can be discriminated from non-frustrated ones with an accuracy of 62.0% (mean over 30 participants). A further analysis of the facial expressions revealed that frustrated drivers tend to activate muscles in the mouth region (chin raiser, lip pucker, lip pressor). In the second experiment, we measured cortical activation with almost whole-head functional near-infrared spectroscopy (fNIRS) while participants experienced frustrating and non-frustrating driving simulator scenarios. Multivariate logistic regression applied to the fNIRS measurements allowed us to discriminate between frustrated and non-frustrated driving intervals with a higher accuracy of 78.1% (mean over 12 participants). Frustrated driving intervals were indicated by increased activation in the inferior frontal, putative premotor, and occipito-temporal cortices. Our results show that facial and cortical markers of frustration can be informative for time-resolved driver state identification in complex, realistic driving situations. The markers derived here can potentially be used as input for future adaptive driver assistance and automation systems that detect driver frustration and react adaptively to mitigate it.
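The first experiment's pipeline lends itself to a short sketch: per-interval facial action unit (AU) intensities fed to a logistic regression that separates frustrated from non-frustrated intervals. Upstream AU extraction (e.g., with a tool such as OpenFace) is assumed to have already happened; the three mouth-region AUs follow the abstract's chin raiser / lip pucker / lip pressor finding (AU17, AU18, AU24 in FACS terms), and the data are placeholders.

```python
# Minimal sketch of frustration classification from facial action unit intensities,
# assuming synthetic placeholder features in place of real video-derived AUs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
au_names = ["AU17_chin_raiser", "AU18_lip_pucker", "AU24_lip_pressor"]  # mouth-region AUs from the abstract
X = rng.random(size=(600, len(au_names)))     # assumed mean AU intensities per time interval
y = rng.integers(0, 2, size=600)              # 1 = frustrated interval, 0 = non-frustrated

clf = LogisticRegression(max_iter=1000)
print(f"mean CV accuracy: {cross_val_score(clf, X, y, cv=5).mean():.3f}")
```

The second experiment replaces the AU feature matrix with fNIRS channel activations but keeps the same multivariate logistic regression classifier, which is what enables the direct 62.0% vs. 78.1% accuracy comparison reported above.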