Optimising Network Architectures for Provable Adversarial Robustness

Title: Optimising Network Architectures for Provable Adversarial Robustness
Publication Type: Conference Paper
Year of Publication: 2020
Authors: Gouk, Henry; Hospedales, Timothy M.
Conference Name: 2020 Sensor Signal Processing for Defence Conference (SSPD)
Date Published: September 2020
Keywords: artificial neural network, compositionality, Computational modeling, Computer vision, network architecture, Neural networks, Perturbation methods, Predictive Metrics, Predictive models, provable security, pubcrawl, Resiliency, Robustness, Training
Abstract: Existing Lipschitz-based provable defences against adversarial examples only cover the L2 threat model. We introduce the first bound that uses Lipschitz continuity to provide a more general guarantee for threat models based on any Lp norm. Additionally, we propose a new strategy for designing network architectures that exhibit superior provable adversarial robustness over conventional convolutional neural networks. Experiments are conducted to validate our theoretical contributions, show that the assumptions made in the design of our novel architecture hold in practice, and quantify the empirical robustness of several Lipschitz-based adversarial defence methods.
DOI: 10.1109/SSPD47486.2020.9272169
Citation Key: gouk_optimising_2020
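
For reference, Lipschitz-based certificates of the kind discussed in the abstract bound how far the network's logits can move under a bounded input perturbation. The sketch below (a hypothetical helper, `certified_l2_radius`, assuming a known per-logit Lipschitz constant with respect to the L2 norm) illustrates the standard L2 margin certificate that such defences build on; it is not the paper's generalised Lp bound.

```python
import numpy as np

def certified_l2_radius(logits, lipschitz_const):
    """Largest L2 perturbation radius certified by a Lipschitz margin bound.

    Assumes every logit of the network is `lipschitz_const`-Lipschitz with
    respect to the L2 norm of the input. A perturbation of norm eps can then
    change each logit by at most lipschitz_const * eps, so the prediction
    cannot flip while the top-two margin exceeds 2 * lipschitz_const * eps.
    """
    logits = np.sort(np.asarray(logits, dtype=float))
    margin = logits[-1] - logits[-2]  # top logit minus runner-up
    return margin / (2.0 * lipschitz_const)

# A margin of 3.2 with a per-logit Lipschitz constant of 5.0 certifies
# robustness to any L2 perturbation of norm at most 0.32.
print(certified_l2_radius([0.2, 3.4, -1.0], lipschitz_const=5.0))
```

The paper's contribution, per the abstract, is to extend this style of Lipschitz-based guarantee from the L2 threat model to threat models based on any Lp norm, and to design architectures whose certified radii are larger than those of conventional convolutional networks.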