CPS: Synergy: A Novel Biomechatronic Interface Based on Wearable Dynamic Imaging Sensors

The problem of controlling biomechatronic systems, such as multiarticulating prosthetic hands, poses unique challenges in the science and engineering of Cyber-Physical Systems (CPS), requiring integration between computational systems for recognizing human functional activity and intent and for controlling prosthetic devices that interact with the physical world. Research on this problem has been limited by the difficulty of noninvasively acquiring robust biosignals that allow intuitive and reliable control of multiple degrees of freedom (DoF). Traditionally, myoelectric signals based on surface electromyography (sEMG) recordings from the skin are used as control signals, but these signals suffer from a poor signal-to-noise ratio and limited specificity for deeper muscles. The objective of this research is to investigate a new sensing paradigm based on ultrasonic imaging of dynamic muscle activity. Our approach integrates novel imaging technologies, new computational methods for activity recognition and learning, and high-performance embedded computing to enable robust and intuitive control of dexterous prosthetic hands with multiple DoF.

In the first year, our focus has been the development of spatio-temporal image analysis and pattern recognition algorithms that learn and predict different dexterous tasks from sonographic patterns of muscle activity. A prototype interactive test environment was developed using a Sonix RP ultrasound system with a 5-14 MHz linear array transducer streaming ultrasound image data to a computer via a research interface. Dynamic ultrasound images of the forearm muscles were acquired from able-bodied volunteers while they performed different digit movements and grasps. Image analysis algorithms processed the data to extract muscle activity patterns encoding the spatial changes in echogenicity of different muscle compartments as the movements were performed, and a library of activity patterns corresponding to different movements was constructed for each subject. During testing, the subjects' movements were decoded with a k-nearest-neighbor (k-NN) classification algorithm that selects the best match from the library. The correlation coefficient between the observed activity pattern and the best-matched library pattern served as a graded signal to control a virtual prosthetic limb in real time; illustrative sketches of these steps appear below.

The system was evaluated with six able-bodied volunteers. Individual digit movements could be decoded with 97% accuracy, and the system could distinguish among 15 different movements, including complex grasps such as pinch and power grasp as well as wrist pronation and supination, with 87% accuracy. Real-time classification among four different movements was performed with 92% accuracy. The signal-to-noise ratio was 43 dB, significantly higher than that of sEMG signals. A strong linear relationship (R² = 0.9) between individual digit joint angles and changes in the ultrasound-derived control signals demonstrated the feasibility of graded control, and a preliminary evaluation on a transradial amputee further demonstrated the feasibility of ultrasound-based graded control. These results demonstrate the potential of ultrasound imaging as a sensing strategy for upper extremity prostheses that provides intuitive, graded control.
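To make the activity-pattern encoding concrete, the following is a minimal sketch in which each movement is summarized by the mean change in echogenicity of a grid of image blocks (standing in for muscle compartments) relative to a resting baseline frame. The function name, the block-grid compartment model, and all parameters are illustrative assumptions; the project's actual spatio-temporal analysis is not specified in this summary.

```python
import numpy as np

def compute_activity_pattern(frames, rest_frame, n_rows=8, n_cols=8):
    """Encode muscle activity as per-block echogenicity change.

    frames: (T, H, W) array of B-mode intensity frames for one movement.
    rest_frame: (H, W) baseline frame acquired at rest.
    Returns a flattened (n_rows * n_cols,) activity vector.
    """
    mean_frame = frames.mean(axis=0)            # average over the movement
    diff = np.abs(mean_frame - rest_frame)      # echogenicity change vs. rest
    H, W = diff.shape
    bh, bw = H // n_rows, W // n_cols
    pattern = np.empty(n_rows * n_cols)
    for i in range(n_rows):
        for j in range(n_cols):
            block = diff[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            pattern[i * n_cols + j] = block.mean()  # mean change per "compartment"
    return pattern
```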
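The decoding step can likewise be sketched as nearest-neighbor matching against the per-subject library, here simplified to k = 1 with the Pearson correlation coefficient as the similarity measure, so that the winning correlation doubles as the graded control signal described above. The function name and the choice of k are assumptions for illustration.

```python
import numpy as np

def classify_and_grade(pattern, library):
    """Match an activity pattern against a per-subject library.

    library: dict mapping movement label -> stored template vector.
    Returns (best_label, correlation); the correlation serves as the
    graded signal driving the virtual prosthetic limb.
    """
    best_label, best_r = None, -1.0
    for label, template in library.items():
        r = np.corrcoef(pattern, template)[0, 1]  # Pearson correlation
        if r > best_r:
            best_label, best_r = label, r
    return best_label, best_r
```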
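Finally, the two reported figures of merit can be computed with standard formulas: the signal-to-noise ratio in decibels from RMS amplitudes, and R² from a least-squares linear fit of joint angle against the control signal. This is a generic sketch of the metrics, not the project's evaluation code.

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in dB from RMS amplitudes (reported: ~43 dB)."""
    rms_s = np.sqrt(np.mean(np.square(signal)))
    rms_n = np.sqrt(np.mean(np.square(noise)))
    return 20.0 * np.log10(rms_s / rms_n)

def linear_fit_r2(control_signal, joint_angle):
    """Fit angle ~ a * signal + b and return (a, b, R^2)."""
    a, b = np.polyfit(control_signal, joint_angle, 1)
    pred = a * np.asarray(control_signal) + b
    ss_res = float(np.sum((np.asarray(joint_angle) - pred) ** 2))
    ss_tot = float(np.sum((np.asarray(joint_angle) - np.mean(joint_angle)) ** 2))
    return a, b, 1.0 - ss_res / ss_tot
```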
Ongoing activities involve the development of a wearable image-based biosignal sensing system that integrates multiple ultrasound imaging sensors with a low-power heterogeneous multicore embedded processor, and the development of more sophisticated pattern recognition and machine learning algorithms that can overcome movement artifacts and minimize the need for recalibration when the prosthesis is donned and doffed.
