CPS: Medium: Collaborative Research: Augmented reality for control of reservation-based intersections with mixed flows

Project Details
Lead PI: Linda Boyle
Performance Period: 10/01/18 - 09/30/20
Institution(s): University of Washington
Sponsor(s): National Science Foundation
Award Number: 1739085
Abstract: In urban environments, signalized intersections are a major cause of congestion because their effective capacity is low. Autonomous vehicles offer a possible leap forward: by receiving coordinated guidance from the intersection system itself, these vehicles could navigate through intersections with minimal speed reduction or wait time, resulting in far more efficient intersections. These smart intersections can reduce wait times by orders of magnitude, but they only work if all vehicles are autonomous: the presence of even one percent of non-autonomous vehicles would negate almost all of the benefits. This project investigates augmented reality technology as a scalable means of improving flow through these smart intersections by coordinating human-driven vehicles with autonomous vehicles, maximizing intersection throughput while minimizing collision risk. This research will benefit the U.S. economy by providing an inexpensive, scalable way of reducing congestion without banning human-powered forms of transport (pedestrians, bicycles), and without the cost of requiring that every vehicle be autonomous.

This research sits at the interface of several disciplines, including transportation engineering, control theory, and human factors. Guiding human-driven vehicles is critical to safely improving the capacity of future smart intersections. While these intersections show considerable potential benefit in a fully automated world, their performance degrades sharply if even a few vehicles are human-driven. Given a high penetration of augmented reality devices (smart glasses) and measurement data from human-driven and autonomous vehicles, including the predicted paths of the autonomous vehicles, can human-driven vehicles be guided through a smart intersection as quickly and safely as possible? Answering this question requires simultaneously solving real-time estimation and control problems in a dynamic environment, with actuation made uncertain by the variability of human driver performance. The project develops efficient algorithms to learn the expected performance of each driver. The routing of vehicles in a reservation-based intersection system takes into account human behavior and the physical limitations of vehicles. Strategies are developed to effectively communicate guidance information to drivers in a mixed-reality setting. These results will be validated on an experimental setup involving vehicles driven by humans and equipped with augmented reality devices. This project is jointly supported with the Department of Transportation.
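To make the reservation-based intersection idea concrete, the sketch below shows a minimal space-time tile reservation check in Python. It is purely illustrative and is not the project's algorithm: the grid size, time step, trajectory format, and class name `ReservationManager` are assumptions chosen for this example.

```python
# Minimal illustration of a reservation-based intersection manager.
# The intersection is discretized into a grid of tiles; a vehicle's
# requested crossing is a sequence of (time_step, tile) occupancies.
# A reservation is granted only if none of those tiles are already
# reserved at the same time step. All details here are assumptions
# made for the sketch, not the project's actual design.

from typing import Iterable, Tuple

TimeTile = Tuple[int, Tuple[int, int]]  # (time_step, (row, col))


class ReservationManager:
    def __init__(self) -> None:
        # Maps (time_step, tile) -> vehicle id holding the reservation.
        self._reserved = {}

    def request(self, vehicle_id: str, trajectory: Iterable[TimeTile]) -> bool:
        """Grant the reservation if every requested (time, tile) slot is free."""
        slots = list(trajectory)
        if any(slot in self._reserved for slot in slots):
            return False  # Conflict: the vehicle must adjust speed and retry.
        for slot in slots:
            self._reserved[slot] = vehicle_id
        return True


if __name__ == "__main__":
    manager = ReservationManager()
    # An autonomous vehicle crossing straight along row 2 of a 4x4 grid.
    av_path = [(t, (2, t)) for t in range(4)]
    print(manager.request("AV-1", av_path))   # True: slots were free.
    # A human-driven vehicle whose predicted path occupies the same tile
    # at the same time step is denied the reservation.
    hv_path = [(2, (2, 2)), (3, (1, 2))]
    print(manager.request("HV-7", hv_path))   # False: conflict at t=2.
```

In practice, reservation-based schemes use much finer space-time tiles and full velocity profiles; the point of the sketch is only that a denied request must be turned into guidance for the driver, for example a target speed displayed through the augmented reality device, rather than a command executed by an autonomous controller.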