A Novel Human Centric CPS to Improve Motor/Cognitive Assessment and Enable Adaptive Rehabilitation
Abstract:
Cerebral Palsy (CP) is the most common motor disorder of central origin in childhood, affecting at least 2 children per 1,000 live births every year. This project will research new methods and tools for motor/cognitive assessment of young children (5-8 years old) with Cerebral Palsy. It will develop a multimodal adaptive game system called CPLAY that integrates multiple views of cyber and physical components and provides both rehabilitation and an assessment of rehabilitation progress through game activity monitoring.

In the proposed framework, low-level sensor data are processed into events that are fed into the Event Processing/Identification Module. Unusual behaviors associated with the game are reported to the human expert/operator, who closes the loop by issuing commands for changes to the actuator controller. The actuator controller implements game changes that may involve new types of sensors, game strategies and metrics, audio or camera input, robot activation, and so on. The sensory input acquired after these changes is sent to the behavior recognition module. The goal is to make the framework applicable to a wide range of rehabilitation therapy settings, including range of motion, therapeutic exercises, strengthening/weight-bearing activities, dexterity practice (pincer grasp, finger isolation), and functional activities.

As part of the project, we have developed several games that can be used for CP rehabilitation therapy. In ongoing and future work, we are investigating the use of digital gloves as input devices to obtain precise measurements of finger movement. We also designed and developed interactive games using the Kinect RGB-D sensor, which allows real-time 3D skeletal tracking and gesture recognition; game activities can be customized to encourage therapeutic exercises as prescribed by the therapist. We will also use humanoid robots to hold the user's attention during the game. In addition, we developed a low-cost 3D point-of-gaze human-computer interface: head-mounted eye tracking devices that map the user's gaze vector to 3D points in space. The device uses an RGB-D scene camera, with which objects of interest are clustered and identified, so that user attention can be utilized as a data modality in interactive games. We also created a publicly available dataset for 3D eye gaze interaction to facilitate eye tracking algorithm development.

We have made progress toward a novel human-robot interaction system, RoDiCA, which will be used in the future as an early diagnostic and treatment tool for children with CP and ASD. The technological hypothesis is that a robot that is life-like in both appearance and modality of interaction, with increased interactivity via real-time visual and force feedback, will motivate children to engage in motor activities, leading to more effective treatment options. Pediatric occupational therapists and doctors are invited to use and evaluate the system, and we have introduced it at Cook Children's Hospital to parents of children with motor and cognitive challenges. We will also collect fNIR data while the children play the games and correlate it with data collected in the game for assessment and evaluation.
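To make the sensing-to-actuation loop described above concrete, the following sketch shows how low-level readings might be collapsed into events, how unusual behavior could be flagged for the human expert, and how the expert's command would be applied by an actuator/game controller. This is a simplified illustration only; every name and threshold is hypothetical and does not come from the CPLAY codebase.

```python
# Minimal sketch (not the CPLAY implementation) of the sensing-to-actuation
# loop: raw readings -> events -> unusual-behavior check -> human expert ->
# actuator/game controller. All names, thresholds, and commands are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Event:
    kind: str        # e.g. "reach_completed", "no_movement"
    value: float     # a summary measurement for the event

def to_events(readings, idle_threshold=0.02):
    """Event Processing/Identification: collapse raw readings into events."""
    moved = max(readings) - min(readings)
    kind = "no_movement" if moved < idle_threshold else "reach_completed"
    return [Event(kind, moved)]

def is_unusual(event):
    """Behavior recognition: flag events the operator should review."""
    return event.kind == "no_movement"

def operator_decision(event):
    """Human expert closes the loop; stands in for a real operator UI."""
    return {"command": "lower_difficulty", "reason": event.kind}

def actuator_controller(command, game_state):
    """Apply the requested change to the running game."""
    if command["command"] == "lower_difficulty":
        game_state["difficulty"] = max(1, game_state["difficulty"] - 1)
    return game_state

game = {"difficulty": 3}
for event in to_events([0.101, 0.105, 0.103]):   # a nearly idle sensor window
    if is_unusual(event):
        game = actuator_controller(operator_decision(event), game)
print(game)   # difficulty reduced after the operator's command
```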
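The Kinect-based games rely on real-time 3D skeletal tracking to monitor prescribed exercises. As an example of the kind of computation involved, the sketch below measures an elbow angle from three tracked joint positions and reports how much of a range-of-motion target was reached; the joint names, coordinates, and target angle are placeholders, not values from the deployed games.

```python
# Minimal sketch: scoring one range-of-motion repetition from Kinect-style
# 3D joint positions. Joints are assumed to arrive as (x, y, z) tuples in
# meters from the skeletal tracker; the 150-degree target is illustrative.
import math

def angle_deg(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    cos = max(-1.0, min(1.0, dot / norm)) if norm else 1.0
    return math.degrees(math.acos(cos))

def rom_score(shoulder, elbow, wrist, target_deg=150.0):
    """Fraction of the prescribed elbow-extension target reached (0..1)."""
    reached = angle_deg(shoulder, elbow, wrist)
    return min(1.0, reached / target_deg)

# Example frame: a partially extended arm.
print(rom_score((0.0, 0.4, 2.0), (0.1, 0.1, 2.0), (0.35, 0.0, 2.0)))
```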
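The head-mounted eye tracker maps the user's gaze to 3D points with the help of the RGB-D scene camera. One common way to do this, shown below as an assumption rather than a description of the actual device, is to back-project the 2D gaze estimate through a pinhole camera model using the depth reading at that pixel; the intrinsic parameters in the sketch are placeholders.

```python
# Minimal sketch, assuming a pinhole model for the RGB-D scene camera:
# back-project the eye tracker's 2D gaze estimate (pixel coordinates in the
# scene image) to a 3D point of gaze using the depth value at that pixel.
# The intrinsics below are placeholders, not those of the actual headset.
def gaze_to_3d(u, v, depth_m, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Return the 3D point (meters, camera frame) seen at pixel (u, v)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Gaze falling on an object 1.2 m away, slightly right of image center.
print(gaze_to_3d(400, 250, 1.2))
```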