Care Robot Transparency Isn't Enough for Trust
| Field | Value |
| --- | --- |
| Title | Care Robot Transparency Isn't Enough for Trust |
| Publication Type | Conference Paper |
| Year of Publication | 2018 |
| Authors | Poulsen, A., Burmeister, O. K., Tien, D. |
| Conference Name | 2018 IEEE Region Ten Symposium (Tensymp) |
| Keywords | care robot ethical decision-making, care robot transparency, Cognitive science, decision making, ethical aspects, Ethics, healthcare robotics, Human Behavior, human factors, human-robot interaction, human-robot social interaction, IEEE Regions, machine transparency, medical robotics, Medical services, patient care, patient trust, resilience, robot ethics, Robot sensing systems, Robot Trust, robot-made ethical decisions, robust trust, socially determined behaviour |
| Abstract | A recent study featuring a new kind of care robot indicated that participants expect a robot's ethical decision-making to be transparent before they will develop trust, even though the same kind of 'inspection of thoughts' is not expected of a human carer. At first glance, this might suggest that transparency mechanisms are required for users to trust robot-made ethical decisions. However, the participants desired transparency only when they did not know the specifics of a human-robot social interaction. Humans trust others without observing their thoughts, which implies other means of determining trustworthiness. The study reported here suggests that this means is social interaction and observation, signifying that trust is a social construct and that these 'social determinants of trust' are the transparent elements. This socially determined behaviour draws on notions of virtue ethics: if a caregiver (nurse or robot) consistently provides good, ethical care, then patients can trust that caregiver to continue doing so. The same social determinants may apply to care robots, so it ought to be possible to trust them without the ability to see their thoughts. This study suggests why transparency mechanisms may not be effective in developing trust in care robot ethical decision-making, and that roboticists instead need to build sociable elements into care robots to help patients develop trust in the robot's ethical decision-making. |
| DOI | 10.1109/TENCONSpring.2018.8692047 |
| Citation Key | poulsen_care_2018 |