Thesis Defense: Guocheng He (Electrical and Systems Engineering) – “Robustness of Trajectory Prediction Neural Network Models”

April 18, 2024
10:00 am - 11:00 am
Rodin Auditorium (Danforth Campus, Green Hall LL 0120)

“Robustness of Trajectory Prediction Neural Network Models”

Thesis lab: Ioannis (Yiannis) Kantaros (WashU Electrical & Systems Engineering)

Abstract: Deploying autonomous vehicles in the real world relies on trajectory prediction models built on perception and observation of the surrounding scene. Deep neural network (DNN) models have been widely shown to deliver stable, strong performance across a variety of scenarios. Many formal approaches are used to verify the predictions of DNN models; conformal prediction is one such approach, providing statistically guaranteed safety regions for DNN outputs. However, no research so far has shown that conformal prediction remains robust against purposeful adversarial attacks. In this thesis, I propose an adversarial attack against trajectory prediction models that use conformal prediction to verify deep neural network predictions. While still satisfying the assumptions of conformal prediction, my approach can drive the deep neural network to generate erroneous results that follow initial expectations without manually introducing a specifically designed target. With my crafted loss function, the attack's effectiveness can meet user-defined objectives, addressing a wide range of practical requirements. I will also present cases in which the proposed adversarial attack leads to unsafe scenarios in multi-agent coordination problems. To my knowledge, this is the first adversarial attack against deep neural networks equipped with conformal prediction.
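To make the "statistically guaranteed safety region" idea concrete, here is a minimal sketch of split conformal prediction around a point predictor. This is not the thesis's method; the predictor, data, and all names are illustrative stand-ins, and the guarantee shown is the standard exchangeability-based coverage bound.

```python
# Hypothetical sketch of split conformal prediction, assuming a pretrained
# point predictor and a held-out calibration set; all data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

def predict(x):
    # Stand-in for a DNN trajectory predictor (identity map for illustration).
    return x

# Calibration set: model inputs and noisy ground-truth targets.
x_cal = rng.normal(size=500)
y_cal = x_cal + rng.normal(scale=0.1, size=500)

# Nonconformity score: magnitude of the prediction error.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for 1 - alpha coverage, with the finite-sample correction.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction region for a new input: an interval of radius q around the
# prediction. Under exchangeability, the true value lands in this region
# with probability at least 1 - alpha.
x_new = 0.3
region = (predict(x_new) - q, predict(x_new) + q)
```

An adversarial attack of the kind described in the abstract would perturb the inputs so that the model's outputs become harmful while the calibration assumptions, and hence the nominal coverage guarantee, still appear to hold.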


For inquiries contact Aaron Beagle at