
Title: Resiliency of SNN on Black-Box Adversarial Attacks
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Paudel, Bijay Raj; Itani, Aashish; Tragoudas, Spyros
Conference Name: 2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)
Keywords: adversarial attacks, black-box attacks, Conferences, Deep Neural Network, Hardware, machine learning, neural network resiliency, Neural networks, Neuromorphics, pubcrawl, resilience, Resiliency, Robustness, Software, spiking neural network, SpiNNaker
Abstract: Existing works, which test against only a few attack models, indicate that Spiking Neural Networks (SNNs) are resilient to adversarial attacks. This paper studies adversarial attacks on SNNs using additional attack models and shows that SNNs are not inherently robust against many few-pixel L0 black-box attacks. Additionally, a method to defend against such attacks in SNNs is presented. The SNNs and the effects of adversarial attacks are evaluated both on software simulators and on SpiNNaker neuromorphic hardware.
DOI: 10.1109/ICMLA52953.2021.00132
Citation Key: paudel_resiliency_2021
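
For illustration only, and not the specific attack models evaluated in the paper: the sketch below shows the general class of few-pixel L0 black-box attacks that the abstract refers to, implemented as a simple random search. The predict function, image shape (H, W, C), and pixel range [0, 1] are assumptions for the example, not details taken from the paper.

import numpy as np

def few_pixel_attack(predict, image, true_label, n_pixels=3, n_queries=500, seed=None):
    # Black-box: only model outputs (class probabilities) are queried,
    # no gradients are used. The L0 budget limits how many pixels change.
    rng = np.random.default_rng(seed)
    h, w, c = image.shape
    for _ in range(n_queries):
        candidate = image.copy()
        ys = rng.integers(0, h, size=n_pixels)
        xs = rng.integers(0, w, size=n_pixels)
        # Overwrite the chosen pixels with random values in [0, 1].
        candidate[ys, xs] = rng.random((n_pixels, c))
        # Keep the first perturbation that flips the predicted label.
        if predict(candidate).argmax() != true_label:
            return candidate
    return None  # no misclassification found within the query budget

Because the search only needs output labels, the same sketch applies whether predict wraps a conventional DNN, a software-simulated SNN, or a rate-decoded readout from neuromorphic hardware.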