Differences in Trust Between Human and Automated Decision Aids

Title: Differences in Trust Between Human and Automated Decision Aids
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Pearson, Carl J., Welk, Allaire K., Boettcher, William A., Mayer, Roger C., Streck, Sean, Simons-Rudolph, Joseph M., Mayhorn, Christopher B.
Conference Name: Proceedings of the Symposium and Bootcamp on the Science of Security
Date Published: April 2016
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-4277-3
Keywords: Automation, composability, compositionality, decision-making, expandability, Human Behavior, human trust, pubcrawl, reliance, Resiliency, risk, Strain, Trust, Trust Routing, workload
Abstract

Humans can easily find themselves in high-cost situations where they must choose between suggestions made by an automated decision aid and a conflicting human decision aid. Previous research indicates that humans often rely on automation or other humans, but not both simultaneously. Expanding on previous work conducted by Lyons and Stokes (2012), the current experiment measures how trust in automated and human decision aids differs with perceived risk and workload. The simulated task required 126 participants to choose the safest route for a military convoy; they were presented with conflicting information from an automated tool and a human. Results demonstrated that as workload increased, trust in automation decreased. As perceived risk increased, trust in the human decision aid increased. Individual differences in dispositional trust correlated with increased trust in both decision aids. These findings can inform training programs for operators who receive information from both human and automated sources. Examples of this context include air traffic control, aviation, and signals intelligence.

URL: https://dl.acm.org/doi/10.1145/2898375.2898385
DOI: 10.1145/2898375.2898385
Citation Key: pearson_differences_2016