Differences in Trust Between Human and Automated Decision Aids
Title | Differences in Trust Between Human and Automated Decision Aids |
Publication Type | Conference Paper |
Year of Publication | 2016 |
Authors | Pearson, Carl J., Welk, Allaire K., Boettcher, William A., Mayer, Roger C., Streck, Sean, Simons-Rudolph, Joseph M., Mayhorn, Christopher B. |
Conference Name | Proceedings of the Symposium and Bootcamp on the Science of Security |
Date Published | April 2016 |
Publisher | ACM |
Conference Location | New York, NY, USA |
ISBN Number | 978-1-4503-4277-3 |
Keywords | Automation, composability, compositionality, decision-making, expandability, Human Behavior, human trust, pubcrawl, reliance, Resiliency, risk, Strain, Trust, Trust Routing, workload |
Abstract | Humans can easily find themselves in high-cost situations where they must choose between suggestions made by an automated decision aid and a conflicting human decision aid. Previous research indicates that humans often rely on automation or other humans, but not both simultaneously. Expanding on previous work by Lyons and Stokes (2012), the current experiment measures how trust in automated and human decision aids differs as perceived risk and workload vary. The simulated task required 126 participants to choose the safest route for a military convoy; they were presented with conflicting information from an automated tool and a human. Results demonstrated that as workload increased, trust in automation decreased. As perceived risk increased, trust in the human decision aid increased. Individual differences in dispositional trust correlated with increased trust in both decision aids. These findings can be used to inform training programs for operators who may receive information from both human and automated sources. Examples of this context include air traffic control, aviation, and signals intelligence. |
URL | https://dl.acm.org/doi/10.1145/2898375.2898385 |
DOI | 10.1145/2898375.2898385 |
Citation Key | pearson_differences_2016 |