Active Heterogeneous Sensing for Fall Detection and Fall Risk Assessment

Abstract:

The interdisciplinary eldertech team at the University of Missouri is dedicated to developing and evaluating technology that keeps older adults functioning at high levels and living independently. We are leveraging ongoing research at a unique local eldercare facility (TigerPlace) to study active sensing and fusion using vision and acoustic sensors for the continuous assessment of a resident's risk of falling, as well as the reliable detection of falls in the home environment. This work is part of a larger effort to identify and assess health problems at a very early stage so that early interventions can be offered before the problems become catastrophic.

Project Objectives:

  • Investigate adaptive, active, anonymized vision sensing for monitoring elders in a home setting
  • Investigate adaptive acoustic sensing for monitoring elders in a home setting
  • Investigate adaptive sensor fusion and intelligent decision making using heterogeneous sensor data collected at varying time scales, including both quantitative and qualitative data, and incorporating risk factors
  • Evaluate the effectiveness of the monitoring system in a realistic physical environment with variable conditions

The project seeks to advance the state of the art in (1) active vision sensing for activity recognition in dynamic and unpredictable environments, (2) acoustic sensing in unstructured environments, (3) adaptive sensor fusion and decision making using heterogeneous sensor data in dynamic and unpredictable environments, and (4) automatic fall detection and fall risk assessment using non-wearable sensors. The project offers an example of a cyber-physical system in which we study the interplay between anomaly detection (falls) and the risk factors affecting the likelihood of the anomalous event.

We have been addressing objective four by evaluating the effectiveness of the monitoring system in the apartments of ten elderly TigerPlace residents. Webcam and Kinect sensing systems have been installed and operate 24 hours a day, seven days a week, in unstructured, dynamic environments. Silhouettes are extracted from the webcam data and the Kinect depth data to form 3D models. From the many walking paths observed, the system looks for purposeful walking sequences that are good candidates for capturing gait parameters. Walking speed, stride time, and stride length are extracted from the models to represent fall risk and are tracked over time to observe changes (the first sketch below illustrates this computation).

In the past year, we have focused on the Kinect depth images for fall detection and in-home gait analysis. We now have algorithms that operate in noisy, cluttered environments with variable lighting and multiple residents and visitors. An individual resident's gait parameters are identified by looking for clusters in the feature space; visitors appear as outliers to the cluster centers and can thus be discarded (see the second sketch below). The average in-home walking speed captured from the depth images has been shown to correlate well with standard fall risk assessment instruments administered monthly to the participating residents.

We have also implemented live fall detection in TigerPlace apartments using the depth images. Alerts are sent automatically to the clinical staff and include a link to a short video clip of the depth images, which can be used to check for injuries or filter out false alarms.

Using the Microsoft Kinect as a platform, we have also investigated objective three for fall detection by fusing depth images with acoustic signals from the four embedded microphones (see the third sketch below). Data have been collected in TigerPlace apartments using stunt actors, with challenging scenarios included to test robustness to noise and dynamic conditions. The fused method yields a significant improvement over using the acoustic data alone.
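To make the gait computation concrete, the first sketch below shows one way walking speed, stride time, and stride length can be derived from a single purposeful walking sequence. The function name, the input interface, and the heel-strike input are hypothetical illustrations; the actual system extracts these parameters from 3D silhouette models rather than from a pre-tracked trajectory.

```python
import numpy as np

def gait_parameters(times, positions, heel_strikes):
    """Estimate gait parameters from one purposeful walking sequence.

    times        : (N,) timestamps in seconds
    positions    : (N, 2) ground-plane centroid positions in meters
    heel_strikes : timestamps of successive heel strikes of the same foot

    All inputs are assumed to come from an upstream tracker; this
    interface is illustrative, not the project's actual one.
    """
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)

    # Walking speed: total path length divided by elapsed time.
    path = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
    speed = path / (times[-1] - times[0])

    # Stride time: mean interval between heel strikes of the same foot.
    stride_time = float(np.mean(np.diff(heel_strikes)))

    # For steady walking, stride length follows from speed and stride time.
    stride_length = speed * stride_time
    return speed, stride_time, stride_length
```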
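The resident-versus-visitor separation can be pictured as outlier rejection around the dominant cluster center in gait-feature space. The second sketch below uses a simple iterative distance criterion with an assumed 2.5-sigma cutoff; the project's actual clustering method may differ.

```python
import numpy as np

def resident_samples(features, max_std=2.5, iters=5):
    """Keep gait-feature vectors near the dominant cluster center.

    features : (N, D) array, one row per walking sequence
               (e.g., [speed, stride_time, stride_length]).
    Rows far from the center are treated as visitors and discarded.
    The 2.5-sigma cutoff and iteration count are illustrative.
    """
    keep = np.ones(len(features), dtype=bool)
    for _ in range(iters):
        # Re-estimate the resident's cluster center from the kept rows.
        center = features[keep].mean(axis=0)
        dist = np.linalg.norm(features - center, axis=1)
        # Discard rows that sit far outside the current cluster.
        scale = dist[keep].std() or 1.0
        keep = dist <= max_std * scale
    return features[keep]
```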
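Finally, the depth-plus-acoustic fusion can be sketched as late fusion of per-modality confidence scores. The weighted combination and the threshold in the third sketch below are assumptions for illustration; the source does not specify the project's actual fusion rule.

```python
def fused_fall_score(depth_score, acoustic_score, w_depth=0.7):
    """Late fusion of per-modality fall confidences in [0, 1].

    depth_score    : confidence from the depth-image fall detector
    acoustic_score : confidence from the microphone-array detector
    w_depth        : illustrative weight; a real system would learn it

    Returns a combined confidence; an alert fires when it crosses a
    threshold tuned to balance missed falls against false alarms.
    """
    return w_depth * depth_score + (1.0 - w_depth) * acoustic_score

# Example: strong depth evidence combined with weak acoustic evidence.
if fused_fall_score(0.9, 0.4) > 0.6:
    print("send alert with link to depth-video clip")
```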

License: 
Creative Commons 2.5
