Biblio
Filters: Keyword is human-robot cooperation
Designing Psychological Conflict Resolution Strategies for Autonomous Service Robots. 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 1146–1148.
2022. As autonomous service robots become increasingly ubiquitous in our daily lives, human-robot conflicts will become more likely as humans and robots share the same spaces and resources. This thesis investigates how robots and humans resolve everyday conflicts in domestic and public contexts. Specifically, the acceptability, trustworthiness, and effectiveness of verbal and non-verbal strategies that the robot can use to resolve a conflict in its favor are evaluated. Based on the Media Equation and the CASA paradigm, which assume that people interact with computers as social actors, robot conflict resolution strategies were derived from social psychology and human-machine interaction. The effectiveness, acceptability, and trustworthiness of these strategies were evaluated in online, virtual reality, and laboratory experiments. Future work includes determining the psychological processes underlying human-robot conflict resolution in further experimental studies.
It Will Not Take Long! Longitudinal Effects of Robot Conflict Resolution Strategies on Compliance, Acceptance and Trust. 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 225–235.
2022. Domestic service robots are becoming increasingly prevalent and autonomous, which will make task priority conflicts more likely. A robot must be able to negotiate effectively and appropriately to gain priority when necessary. In previous human-robot interaction (HRI) studies, imitating human negotiation behavior was effective, but long-term effects have not been studied. Filling this research gap, an interactive online study (N = 103) with two sessions and six trials was conducted. In a conflict scenario, participants repeatedly interacted with a domestic service robot that applied three different conflict resolution strategies: appeal, command, and diminution of the request. The second manipulation was reinforcement (thanking) of compliance behavior (yes/no). This resulted in a 3×2×6 mixed design. User acceptance, trust, user compliance with the robot, and self-reported compliance with a household member were assessed. Diminution of the request combined with positive reinforcement was the most effective strategy, and its perceived trustworthiness increased significantly over time. For this strategy only, self-reported compliance rates with the human and with the robot were similar. Applying this strategy therefore seems to make a robot potentially as effective as a human requester. This paper contributes to the design of acceptable and effective robot conflict resolution strategies for long-term use.