Biblio
Effects of trust in human-automation shared control: A human-in-the-loop driving simulation study. 2021. 2021 IEEE International Intelligent Transportation Systems Conference (ITSC). pp. 1147–1154.

Human-automation shared control has been proposed to reduce the risk of driver disengagement in Level-3 autonomous vehicles. Although previous studies have shown that a shared control strategy is effective for keeping the driver in the loop and improving driving performance, over- and under-trust may affect the cooperation between the driver and the automation system. This study conducted a human-in-the-loop driving simulation experiment to assess the effects of trust on drivers' behavior under shared control. An expert shared control strategy with longitudinal and lateral driving assistance was proposed and implemented on the experiment platform. Based on the experiment (N=24), trust in shared control was evaluated, followed by a correlation analysis of trust and behaviors. The moderating effects of trust on the relationship between gaze focalization and minimum time to collision were then explored. Results showed that self-reported trust in shared control can be evaluated by three subscales: safety, efficiency, and ease of control, all of which correlate more strongly with gaze focalization than with other behaviors. In addition, greater trust in ease of control was associated with a slight decrease in human-machine conflict in mean brake inputs. The moderation analysis shows that trust amplifies the decrease in minimum time to collision as eyes-off-road time increases. These results indicate that over-trust in automation can lead to unsafe behaviors, particularly degraded monitoring behavior. This study contributes to revealing the link between trust and behavior in the context of human-automation shared control, and its findings can be applied to improve the design of shared control and reduce drivers' risky behaviors through further trust calibration.
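The moderation analysis described above can be illustrated with a regression model that includes an interaction term between trust and eyes-off-road time, with minimum time to collision as the outcome. The sketch below is a minimal, hypothetical example: the variable names, synthetic data, and model specification are assumptions for illustration, not the study's actual data or code.

```python
# Hypothetical sketch of a moderation analysis: does self-reported trust
# moderate the relationship between eyes-off-road time and minimum time
# to collision (TTC)? All data here are synthetic stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 24  # the abstract reports N=24 participants

# Standardized (z-scored) predictors keep the interaction term interpretable.
eyes_off_road = rng.normal(size=n)   # standardized eyes-off-road time
trust = rng.normal(size=n)           # standardized self-reported trust
min_ttc = (3.0
           - 0.5 * eyes_off_road
           - 0.3 * eyes_off_road * trust
           + rng.normal(scale=0.5, size=n))

df = pd.DataFrame({"min_ttc": min_ttc,
                   "eyes_off_road": eyes_off_road,
                   "trust": trust})

# Moderation is tested via the interaction coefficient: a significant
# negative estimate on eyes_off_road:trust would mean higher trust
# strengthens the drop in minimum TTC as eyes-off-road time grows.
model = smf.ols("min_ttc ~ eyes_off_road * trust", data=df).fit()
print(model.summary())
```

In this formulation, the interaction coefficient plays the role of the moderating effect reported in the abstract; the main effects and the synthetic coefficients are placeholders only.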