CPS: Medium: Collaborative Research: Human-on-the-Loop Control for Smart Ultrasound Imaging

Project Details
Lead PI: Mostafa Fatemi
Co-PI(s): Azra Alizad
Performance Period: 10/01/18 - 09/30/21
Institution(s): Mayo Clinic Rochester
Sponsor(s): National Science Foundation
Award Number: 1837572
Abstract: Owing to its low operating cost and patient safety, ultrasound is widely accepted as one of the best forms of medical imaging compared to similar technologies such as Computed Tomography (CT) scans or Magnetic Resonance Imaging (MRI). Still, there can be large variability in the image quality obtained by different experts imaging the same patient, which can affect successful diagnosis and patient treatment. This problem becomes even more pronounced across patients. Consequently, to decrease this variability, this project will develop imaging techniques that are not passive but are based on real-time ultrasound beam control and adaptation, while facilitating the best use of operator expertise to obtain the most informative images. Such new active ultrasound systems, where expert users with varying levels of training interact with a smart ultrasound device to improve medical imaging and facilitate diagnosis, will provide significant performance gains over present systems that are only manually controlled. This project will also have a significant societal impact through accurate, safe, and cost-effective diagnosis of many medical conditions, such as cancers or liver fibrosis. For instance, the use of such systems for breast cancer diagnosis will significantly reduce the number of unnecessary biopsies, which currently cost more than $1 billion annually in the US alone. At the same time, this technology can enable a variety of other imaging applications that rely on different forms of ultrasound, such as mapping of the heart chambers using Doppler ultrasound or identifying the mechanical properties of materials in structures for failure prognosis.

Specifically, the goal of this project is the development of an active ultrasound system in which user expertise is employed to refine the control process, while autonomous elasticity (or viscoelasticity) mapping improves image quality and allows the human operator to best use their skills for both optimization and diagnosis. The project's research products include: (i) data fusion techniques for ultrasound elastography; (ii) methods for interactive ultrasound elastography; and (iii) a framework for safe and efficient device implementation. The ultrasound system will be validated on a test-bed based on suitable laboratory phantoms and real-time control of existing ultrasound devices. The investigators will focus on the unique aspects of this novel paradigm that, compared to existing methods, include: (1) new active, user-machine imaging techniques that improve the characterization of the mechanical properties of tissue; and (2) the systematic transition of algorithms and user interfaces to embedded computers for safe execution by the device. This requires overcoming intellectual challenges related to the integration of viscoelastography mapping and human-on-the-loop ultrasound control, as well as the synthesis of new theoretical results drawing from computational mechanics, controls and estimation, and embedded systems design.

The project also has extensive education and outreach components, including curriculum development focused on the design of safety-critical medical cyber-physical systems that exhibit highly dynamic behaviors, plant uncertainty, human interactions, and the need for real-time implementation. The outreach component of this project will also improve pre-college students' awareness of the potential and attractiveness of a research and engineering career.
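To make the elasticity-mapping idea concrete, the sketch below is a minimal Python illustration, not the project's implementation: it converts a shear wave speed map to a Young's modulus map using the standard nearly-incompressible-tissue relation E ≈ 3ρc², and shows one hypothetical human-on-the-loop step in which the device flags low-confidence regions for the operator. The density value, SNR threshold, and all function names are illustrative assumptions.

```python
import numpy as np

# Assumption: soft tissue is nearly incompressible with density ~1000 kg/m^3,
# so Young's modulus can be approximated as E ~= 3 * rho * c_s**2 from the
# local shear wave speed c_s. This is the standard elastography relation,
# not the project's specific estimator.
TISSUE_DENSITY = 1000.0  # kg/m^3


def youngs_modulus_kpa(shear_wave_speed_ms: np.ndarray) -> np.ndarray:
    """Map shear wave speeds (m/s) to Young's modulus (kPa)."""
    return 3.0 * TISSUE_DENSITY * shear_wave_speed_ms**2 / 1e3


def human_on_the_loop_step(speed_map: np.ndarray, snr_map: np.ndarray,
                           snr_threshold_db: float = 10.0) -> dict:
    """One hypothetical supervisory step: compute the elasticity map and
    flag low-SNR regions so the operator can re-steer the beam there,
    rather than re-imaging the whole field of view."""
    elasticity = youngs_modulus_kpa(speed_map)
    low_confidence = snr_map < snr_threshold_db
    return {
        "elasticity_kpa": elasticity,
        "fraction_flagged_for_operator": float(low_confidence.mean()),
    }


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    speeds = rng.uniform(1.0, 4.0, size=(64, 64))  # m/s, plausible soft tissue
    snr = rng.uniform(5.0, 30.0, size=(64, 64))    # dB, synthetic values
    result = human_on_the_loop_step(speeds, snr)
    print(f"median stiffness: {np.median(result['elasticity_kpa']):.1f} kPa")
    print(f"flagged fraction: {result['fraction_flagged_for_operator']:.2f}")
```

In an actual human-on-the-loop system the flagged regions would be presented to the operator, whose adjustments feed back into the beam control loop; the sketch only shows the autonomous estimation side of that interaction.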