Biblio

Filters: Author is Shi, Yu-Bo
Cheng, Xin, Wang, Mei-Qi, Shi, Yu-Bo, Lin, Jun, Wang, Zhong-Feng.  2022.  Magical-Decomposition: Winning Both Adversarial Robustness and Efficiency on Hardware. 2022 International Conference on Machine Learning and Cybernetics (ICMLC). :61–66.
Model compression is one of the most preferred techniques for efficiently deploying deep neural networks (DNNs) on resource-constrained Internet of Things (IoT) platforms. However, a naively compressed model is often vulnerable to adversarial attacks, creating a conflict between robustness and efficiency, especially for IoT devices exposed to complex real-world scenarios. We address this problem for the first time by developing a novel framework, dubbed Magical-Decomposition, that simultaneously enhances both robustness and hardware efficiency. By leveraging a hardware-friendly model compression method, singular value decomposition, the defending algorithm can be supported by most existing DNN hardware accelerators. Going a step further, a recently developed DNN interpretation tool is used to clearly highlight the underlying mechanism by which adversarial accuracy is increased in the compressed model. Ablation studies and extensive experiments under various attacks, models, and datasets consistently validate the effectiveness and scalability of the proposed framework.
ISSN: 2160-1348
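
The abstract's compression primitive is singular value decomposition applied to DNN weight matrices. As a rough illustration only (not the paper's implementation), the sketch below shows rank-truncated SVD compression of a dense layer; the rank parameter `r` is a hypothetical hyperparameter trading accuracy against parameter count.

```python
# Minimal sketch, assuming a plain NumPy setting: factor a dense (m x n)
# weight matrix W into two low-rank factors A (m x r) and B (r x n), so the
# layer stores (m + n) * r parameters instead of m * n.
import numpy as np

def svd_compress(weight: np.ndarray, r: int):
    """Rank-r truncated SVD factorization of a weight matrix."""
    U, S, Vt = np.linalg.svd(weight, full_matrices=False)
    # Keep only the top-r singular values and vectors.
    A = U[:, :r] * S[:r]   # shape (m, r); singular values folded into A
    B = Vt[:r, :]          # shape (r, n)
    return A, B            # W is approximated by A @ B

# Example: compress a 512x512 layer to rank 64 (~4x fewer parameters).
W = np.random.randn(512, 512).astype(np.float32)
A, B = svd_compress(W, r=64)
rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"relative reconstruction error: {rel_err:.3f}")
```

In practice the two factors map onto two back-to-back matrix multiplications, which is why such a factorization can run on existing DNN accelerators without hardware changes.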