Biblio

Nielsen, C., Mathiesen, M., Nielsen, J., Jensen, L. C. 2019. Changes in Heart Rate and Feeling of Safety When Led by a Rehabilitation Robot. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). :580–581.

Trust is an important topic in medical human-robot interaction, since patients may be more fragile than other groups of people. This paper investigates users' trust when interacting with a rehabilitation robot. In the study, we measure participants' heart rate and perception of safety in a scenario in which their arm is led by the rehabilitation robot through two types of exercises at three different velocities. Each participant's heart rate is measured during each exercise, and participants are asked how safe they feel after each exercise. The results showed that velocity and type of exercise have no significant influence on participants' heart rate, but they do have a significant influence on how safe participants feel. We found that higher velocities and longer exercises negatively influence participants' perception of safety.

Weigelin, B. C., Mathiesen, M., Nielsen, C., Fischer, K., Nielsen, J. 2018. Trust in Medical Human-Robot Interactions based on Kinesthetic guidance. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). :901–908.

In medical human-robot interactions, trust plays an important role, since for patients there may be more at stake than in other kinds of encounters with robots. In the current study, we address issues of trust in interaction with a prototype of a therapeutic robot, the Universal RoboTrainer, in which the therapist records patient-specific tasks by means of kinesthetic guidance of the patient's arm, which is connected to the robot. We carried out a user study with twelve pairs of participants who collaborated on recording a training program on the robot. We examine a) the degree to which participants identify the situation as uncomfortable or distressing, b) participants' own strategies to mitigate that stress, c) the degree to which the robot is held responsible for the problems that occur and the amount of agency ascribed to it, and d) what effect usability issues, when they arise, have on participants' trust. We find signs of distress mostly in contexts with usability issues, as well as many verbal and kinesthetic mitigation strategies intuitively employed by the participants. Recommendations for increasing users' trust in kinesthetic interactions include the timely production of verbal cues that continuously confirm that everything is all right, as well as increased contingency in presenting strategies for recovering from usability issues that arise.