2020 Benchmark Proposal: F1/10 Car with LIDAR by Radoslav Ivanov
In this benchmark, we consider a 1/10-scale autonomous racing car navigating a hallway environment. The car has a neural network controller that takes a lidar scan of the environment as input and produces a steering command for the car (assuming constant throttle). The verification problem is to verify that, starting from a set of initial positions, the car will not crash for some amount of time (e.g., 7s). In the simplest version of the problem, there are 21 lidar rays as input to the controller. The car has a standard continuous-time bicycle model. The lidar observation model contains a number of trigonometric functions and is hybrid, since there is a different equation for each wall the lidar can reach, depending on the car's state. The observation model is sampled at the controller sampling rate (10 Hz). A simulator, along with training code and a number of trained controllers, can be found online at the following github repository:
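For reference, the continuous-time bicycle model mentioned above can be sketched as follows. This is a minimal kinematic version with made-up parameters (wheelbase, integration step); the repo's actual model may differ, e.g. by including slip or throttle dynamics.

```python
import math

def bicycle_step(x, y, theta, v, delta, wheelbase=0.32, dt=0.001):
    """One Euler step of a kinematic bicycle model.

    x, y     -- position of the rear axle
    theta    -- heading angle
    v        -- speed (constant throttle in the benchmark)
    delta    -- steering angle from the neural network controller
    wheelbase, dt -- hypothetical values, not taken from the repo
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(delta) * dt
    return x, y, theta
```

Between controller samples (10 Hz), the steering input is held constant and the continuous dynamics are integrated; the sketch above just uses explicit Euler for clarity.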
Is there a version with ReLU activations instead of Tanh? Is there a simple plant version that's continuous instead of hybrid? Is it possible to get the networks in ONNX format, along with a concrete simulation trace (something like: start state s1 results in observation o1 at time t1, which is fed into the network and gives command c1, which results in state s2 at the next time step)?
Hi Stanley,
1) Yes, it is possible to train ReLU controllers. We have provided training code for two different reinforcement learning algorithms in the github repo. If we decide to use this benchmark, we can train some controllers on our side to make it easier for folks to use.
2) The plant dynamics model is continuous (non-hybrid). However, the lidar model will always be hybrid by definition. Of course, it is possible to simulate the car without lidar and use state-feedback control instead. The car simulator is also provided in the repo.
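To illustrate why the observation model is hybrid even though the plant is not, here is a hedged sketch of a single lidar ray: the measurement is the minimum over per-wall intersection formulas, and which wall's formula is active depends on the car's pose. The wall geometry here is invented for illustration and is not the repo's track.

```python
import math

def ray_wall_distance(px, py, angle, walls):
    """Distance from (px, py) along `angle` to the nearest wall segment.

    Each wall is a segment ((x1, y1), (x2, y2)). Taking the min over
    walls is the source of the hybrid behavior: a different closed-form
    expression applies depending on which wall the ray reaches.
    """
    dx, dy = math.cos(angle), math.sin(angle)
    best = math.inf
    for (x1, y1), (x2, y2) in walls:
        ex, ey = x2 - x1, y2 - y1
        denom = dx * ey - dy * ex
        if abs(denom) < 1e-12:
            continue  # ray parallel to this wall
        # Solve p + t*d = w1 + s*(w2 - w1) for t >= 0, 0 <= s <= 1
        t = ((x1 - px) * ey - (y1 - py) * ex) / denom
        s = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t >= 0 and 0 <= s <= 1:
            best = min(best, t)
    return best
```

A full scan evaluates this for each of the 21 ray angles relative to the car's heading, which is where the trigonometric functions in the observation model come from.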
3) We don't have a general ONNX converter as that seemed quite involved. But we have scripts that can convert between various formats (Keras, mat, yaml). If you let us know what is convenient for you, I'm sure we can write another converter for that.
4) Yes, we do have an example of how to use the simulator. It is in the sim_trace.py script in the simulator folder in the repo. Once again, that simulator uses a neural network controller with lidar observations, but one can change that to feedback control if desired.
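The trace question above can be sketched as a simple closed loop. Everything here (infinite-hallway geometry, the stand-in controller, speed and wheelbase values) is hypothetical and only illustrates the data flow s1 → o1 → c1 → s2 that sim_trace.py produces with the real network and track.

```python
import math

MAX_RANGE = 5.0

def lidar_scan(x, y, theta, n_rays=21, fov=math.radians(270)):
    """Idealized 21-ray scan in an infinite hallway with walls at
    y = +/-0.75 (made-up geometry; the repo's track differs)."""
    scan = []
    for i in range(n_rays):
        a = theta - fov / 2 + i * fov / (n_rays - 1)
        s = math.sin(a)
        if s > 1e-9:
            d = (0.75 - y) / s       # ray hits the top wall
        elif s < -1e-9:
            d = (-0.75 - y) / s      # ray hits the bottom wall
        else:
            d = MAX_RANGE            # ray parallel to the walls
        scan.append(min(d, MAX_RANGE))
    return scan

def controller(scan):
    """Stand-in for the trained network: steer toward more clearance."""
    return 0.05 * (scan[-1] - scan[0])

def step(state, delta, v=1.6, wheelbase=0.32, dt=0.1):
    """Euler step of a kinematic bicycle model at the 10 Hz rate."""
    x, y, th = state
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + (v / wheelbase) * math.tan(delta) * dt)

state = (0.0, 0.1, 0.0)           # s1
for t in range(3):
    obs = lidar_scan(*state)       # observation o_t at time t
    cmd = controller(obs)          # steering command c_t from the network
    state = step(state, cmd)       # next state s_{t+1}
    print(t, round(cmd, 4), tuple(round(v, 4) for v in state))
```

Swapping `controller` for a loaded network and `lidar_scan`/`step` for the repo's observation model and plant gives exactly the kind of trace sim_trace.py records.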
Rado
Hi everyone,
We have looked at this benchmark, and I would like to suggest excluding it from this year due to its complexity, particularly the model transformations involved. In our opinion, it is unlikely that any tool other than Verisig would be able to analyze it, given the various code changes, etc. required to support it. It would be a good candidate in the future, but it seems too complicated for this year given our time constraints.