Title | TAaMR: Targeted Adversarial Attack against Multimedia Recommender Systems |
Publication Type | Conference Paper |
Year of Publication | 2020 |
Authors | Di Noia, Tommaso, Malitesta, Daniele, Merra, Felice Antonio |
Conference Name | 2020 50th Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W) |
Keywords | Adversarial Machine Learning, feature extraction, human factors, Measurement, Multimedia systems, Perturbation methods, pubcrawl, recommender systems, Resiliency, Scalability, Task Analysis, visualization |
Abstract | Deep learning classifiers are highly vulnerable to adversarial examples, and their existence has raised cybersecurity concerns in many tasks, most notably malware detection, computer vision, and speech recognition. While considerable effort has been devoted to investigating attack and defense strategies in these tasks, only limited work explores the influence of targeted attacks on the input data (e.g., images, textual descriptions, audio) used by multimedia recommender systems (MRs). In this work, we examine the consequences of applying targeted adversarial attacks against the product images of a visual-based MR. We propose a novel adversarial attack approach, called Targeted Adversarial Attack against Multimedia Recommender Systems (TAaMR), to investigate how an MR's behavior changes when the images of a category of rarely recommended products (e.g., socks) are perturbed so that a deep neural classifier misclassifies them as a class of more frequently recommended products (e.g., running shoes), while the alterations remain barely perceptible to humans (a minimal, illustrative sketch of this targeted perturbation step is given below the record). We study the TAaMR approach by evaluating the effect of two targeted adversarial attacks (i.e., FGSM and PGD) against the input images of two state-of-the-art MRs (i.e., VBPR and AMR). Extensive experiments on two real-world fashion recommendation datasets confirm the effectiveness of TAaMR in altering the recommendation lists while preserving the original human judgment of the perturbed images. |
DOI | 10.1109/DSN-W50199.2020.00011 |
Citation Key | di_noia_taamr_2020 |
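For illustration only, the following is a minimal sketch of the one-step targeted FGSM perturbation the abstract refers to, assuming a PyTorch image classifier with inputs in [0, 1]. It is not the authors' implementation; PGD iterates essentially the same step with projection onto an epsilon-ball. The names `targeted_fgsm`, `classifier`, `product_img`, and the target class index are hypothetical.

```python
import torch
import torch.nn.functional as F

def targeted_fgsm(model, image, target_class, epsilon=0.01):
    """
    One-step targeted FGSM (illustrative sketch, not the paper's code).
    Nudges `image` so that `model` is pushed towards predicting
    `target_class` (e.g., a highly recommended product category).
    """
    image = image.clone().detach().requires_grad_(True)
    logits = model(image)                      # shape: (1, num_classes)
    # Targeted variant: minimise the loss w.r.t. the *target* class,
    # so we step against the gradient sign (untargeted FGSM adds it).
    loss = F.cross_entropy(logits, target_class)
    loss.backward()
    adv_image = image - epsilon * image.grad.sign()
    return adv_image.clamp(0.0, 1.0).detach()

# Hypothetical usage: `classifier` is any torchvision-style model and
# `product_img` a (1, 3, H, W) tensor in [0, 1]; the class index below
# merely stands in for the target (more recommended) category.
# adv_img = targeted_fgsm(classifier, product_img, torch.tensor([770]))
```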