Context-dependent control of smart artificial hands through enhanced touch perception and mechatronic reflexes

Abstract:

One grand challenge facing the nascent field of cyber-physical systems is the human-machine interface. Complex systems such as teleoperated space robots, telesurgery robots, and wheelchair-mounted assistive robots involve various levels of human intervention and proximity of the human operator to the artificial system. As with any human-in-the-loop system, each must address critical issues of bidirectional information flow (sensing and control), mixed authority, and context awareness. Couched in the specifics of neuroprosthetic hands, our work seeks to develop transformative methods of human-machine communication and control to enhance the capabilities of currently limited physical resources. In the grand vision of cyber-physical systems, these advancements translate into (i) communication of tactile sensation from a remote end-effector to a human user, (ii) division of control based on the spatial and temporal capabilities of the system's agents, and (iii) smooth, context-dependent transfer of control between the agents. The objective of this research is to address challenges posed by the human-machine interface. Our approach analyzes control-switching issues in brain-computer interfaces. A nonhuman primate will perform a manual task while movement- and touch-related brain signals are recorded. As a robotic hand replays the movements, electronic signals will be recorded from tactile sensors on the robot's fingers, mapped to touch-based brain signals, and used to give the subject tactile sensation via direct cortical stimulation.
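The sensing-to-stimulation pathway described above can be illustrated with a minimal sketch: a fingertip force measured on the robot hand is mapped into a cortical stimulation parameter (here, pulse frequency). The function name, sensor range, and stimulation band below are illustrative assumptions, not the project's actual calibration.

```python
def force_to_stim_freq(force_n,
                       force_range=(0.0, 5.0),     # assumed sensor range, newtons
                       freq_range=(20.0, 300.0)):  # assumed pulse-rate band, Hz
    """Linearly map a fingertip force reading to a stimulation pulse frequency."""
    lo_f, hi_f = force_range
    lo_s, hi_s = freq_range
    # Clamp to the sensor's range, then interpolate into the stimulation band.
    f = min(max(force_n, lo_f), hi_f)
    frac = (f - lo_f) / (hi_f - lo_f)
    return lo_s + frac * (hi_s - lo_s)

print(force_to_stim_freq(0.0))   # lightest touch  -> 20.0 Hz
print(force_to_stim_freq(2.5))   # mid-range press -> 160.0 Hz
print(force_to_stim_freq(5.0))   # full-range press -> 300.0 Hz
```

In practice such a mapping would be learned from the recorded touch-related brain signals rather than fixed by hand; the linear form here only makes the pipeline's structure concrete.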

  • Research Goal 1 will establish a mapping between a biological and artificial system (a nonhuman primate as a proxy for a human subject, and a robot) for tactile sensing and control during a manual task.
  • Research Goal 2 will use artificial tactile sensors and cortical stimulation to provide a conscious perception of tactile feedback.
  • Research Goal 3 will simulate the neuroprosthetic reality in which only visual feedback and artificial touch (via cortical stimulation) are available for controlling a robot hand. Smooth, context-dependent transfers of authority between the biological and artificial agents will be accomplished using finite-state and behavior-based controllers to integrate low-level reflexes with voluntary high-level control.
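The authority transfer in Research Goal 3 can be sketched as a small finite-state controller: voluntary high-level commands drive the hand until a slip-like tactile event triggers a low-level grip reflex, which holds authority until the grip is stable again. The state names, thresholds, and event signals below are illustrative assumptions, not the project's actual controller.

```python
VOLUNTARY, REFLEX = "voluntary", "reflex"

class AuthorityController:
    """Two-state sketch of context-dependent authority transfer."""

    def __init__(self, slip_threshold=0.8, stable_threshold=0.2):
        self.state = VOLUNTARY
        self.slip_threshold = slip_threshold      # slip level that triggers the reflex
        self.stable_threshold = stable_threshold  # slip level at which control returns

    def step(self, slip_signal, user_command, reflex_command):
        """Return the command that holds authority this control cycle."""
        if self.state == VOLUNTARY and slip_signal > self.slip_threshold:
            self.state = REFLEX          # machine takes over: fast grip reflex
        elif self.state == REFLEX and slip_signal < self.stable_threshold:
            self.state = VOLUNTARY       # grip stabilized: return control to user
        return reflex_command if self.state == REFLEX else user_command

ctrl = AuthorityController()
print(ctrl.step(0.1, "open", "squeeze"))   # stable         -> user's command
print(ctrl.step(0.9, "open", "squeeze"))   # slip detected  -> reflex command
print(ctrl.step(0.5, "open", "squeeze"))   # still reflexive (hysteresis)
print(ctrl.step(0.1, "open", "squeeze"))   # stable again   -> user's command
```

The gap between the two thresholds gives the switch hysteresis, so authority does not chatter between agents when the slip signal hovers near a single boundary.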

This research will advance methods for providing tactile feedback from a remote manipulator, dividing control appropriate to human and machine capabilities, and transferring authority in a smooth, context-dependent manner. The principles established from such work can be applied to any cyber-physical system requiring robustness in the face of control delays or limited information flow at the human-machine interface. The resulting transformative methods of human-machine communication and control will have applications for robotics (space, underwater, military, rescue, surgery, assistive, prosthetic), haptics, biomechanics, and neuroscience.

License: Creative Commons 2.5