How Much Information?: Effects of Transparency on Trust in an Algorithmic Interface

Title: How Much Information?: Effects of Transparency on Trust in an Algorithmic Interface
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Kizilcec, René F.
Conference Name: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
Date Published: May 2016
Publisher: ACM
Conference Location: New York, NY, USA
ISBN Number: 978-1-4503-3362-7
Keywords: algorithm awareness, attitude change, Computing Theory, Human Behavior, human trust, interface design, peer assessment, pubcrawl, transparency, Trust
Abstract

The rising prevalence of algorithmic interfaces, such as curated feeds in online news, raises new questions for designers, scholars, and critics of media. This work focuses on how transparent design of algorithmic interfaces can promote awareness and foster trust. A two-stage process of how transparency affects trust was hypothesized, drawing on theories of information processing and procedural justice. In an online field experiment, three levels of system transparency were tested in the high-stakes context of peer assessment. Individuals whose expectations were violated (by receiving a lower grade than expected) trusted the system less, unless the grading algorithm was made more transparent through explanation. However, providing too much information eroded this trust. Attitudes of individuals whose expectations were met did not vary with transparency. Results are discussed in terms of a dual process model of attitude change and the depth of justification of perceived inconsistency. Designing for trust requires balanced interface transparency: not too little and not too much.

URL: https://dl.acm.org/doi/10.1145/2858036.2858402
DOI: 10.1145/2858036.2858402
Citation Key: kizilcec_how_2016