How Much Information?: Effects of Transparency on Trust in an Algorithmic Interface
Title | How Much Information?: Effects of Transparency on Trust in an Algorithmic Interface
Publication Type | Conference Paper
Year of Publication | 2016
Authors | Kizilcec, René F.
Conference Name | Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
Date Published | May 2016
Publisher | ACM
Conference Location | New York, NY, USA
ISBN Number | 978-1-4503-3362-7
Keywords | algorithm awareness, attitude change, Computing Theory, Human Behavior, human trust, interface design, peer assessment, pubcrawl, transparency, Trust
Abstract | The rising prevalence of algorithmic interfaces, such as curated feeds in online news, raises new questions for designers, scholars, and critics of media. This work focuses on how transparent design of algorithmic interfaces can promote awareness and foster trust. A two-stage process of how transparency affects trust was hypothesized drawing on theories of information processing and procedural justice. In an online field experiment, three levels of system transparency were tested in the high-stakes context of peer assessment. Individuals whose expectations were violated (by receiving a lower grade than expected) trusted the system less, unless the grading algorithm was made more transparent through explanation. However, providing too much information eroded this trust. Attitudes of individuals whose expectations were met did not vary with transparency. Results are discussed in terms of a dual process model of attitude change and the depth of justification of perceived inconsistency. Designing for trust requires balanced interface transparency - not too little and not too much.
URL | https://dl.acm.org/doi/10.1145/2858036.2858402 |
DOI | 10.1145/2858036.2858402 |
Citation Key | kizilcec_how_2016 |