Adversarial Machine Learning Security Problems for 6G: mmWave Beam Prediction Use-Case

Title: Adversarial Machine Learning Security Problems for 6G: mmWave Beam Prediction Use-Case
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Catak, Evren; Catak, Ferhat Ozgur; Moldsvor, Arild
Conference Name: 2021 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom)
Date Published: May
Keywords: 6G mobile communication, Adversarial Machine Learning, AI, beam-forming, composability, Deep Learning, machine learning, machine learning algorithms, Millimeter wave technology, millimeter-wave, Prediction algorithms, Predictive models, privacy, pubcrawl, resilience, Resiliency, Transportation
Abstract: 6G is the next generation of communication systems. In recent years, machine learning algorithms have been applied widely in fields such as healthcare, transportation, and autonomous vehicles, and predictive algorithms will likewise be used for 6G problems. With the rapid development of deep learning techniques, it is critical to take security into account when applying these algorithms: while machine learning offers significant advantages for 6G, the security of AI models is usually ignored, even though their many real-world applications make it vital. This paper proposes a mitigation method, based on adversarial learning, against adversarial attacks on 6G machine learning models for millimeter-wave (mmWave) beam prediction. The main idea behind adversarial attacks on machine learning models is to produce faulty results by feeding manipulated inputs to trained deep learning models, here those used for mmWave beam prediction in 6G applications. We also evaluate the performance of the adversarial learning mitigation method under the fast gradient sign method (FGSM) attack. The mean squared errors of the defended model under attack are very close to those of the undefended model without attack.
DOI: 10.1109/BlackSeaCom52164.2021.9527756
Citation Key: catak_adversarial_2021
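
The abstract above describes an FGSM attack on a deep-learning beam-prediction model and an adversarial-learning (adversarial training) mitigation evaluated by mean squared error. The sketch below is a minimal illustration of that workflow, not the authors' exact implementation: it assumes a TensorFlow/Keras regression model, and the function names, epsilon value, and loss weighting are illustrative assumptions.

```python
# Hypothetical sketch: FGSM attack and adversarial-training mitigation for an
# mmWave beam-prediction regressor trained with mean squared error (MSE).
# Model architecture, epsilon, and data pipeline are assumptions for
# illustration, not the paper's exact setup.
import tensorflow as tf


def fgsm_perturb(model, x, y, epsilon=0.05):
    """Fast Gradient Sign Method: shift inputs in the direction that
    increases the MSE loss of the trained model."""
    x = tf.convert_to_tensor(x, dtype=tf.float32)
    y = tf.convert_to_tensor(y, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(x)
        loss = tf.reduce_mean(tf.square(model(x, training=False) - y))
    grad = tape.gradient(loss, x)
    return x + epsilon * tf.sign(grad)


def adversarial_training_step(model, optimizer, x, y, epsilon=0.05):
    """One mitigation step: update the model on both clean and
    FGSM-perturbed inputs (equal weighting assumed here)."""
    x_adv = fgsm_perturb(model, x, y, epsilon)
    y = tf.convert_to_tensor(y, dtype=tf.float32)
    with tf.GradientTape() as tape:
        loss_clean = tf.reduce_mean(tf.square(model(x, training=True) - y))
        loss_adv = tf.reduce_mean(tf.square(model(x_adv, training=True) - y))
        loss = 0.5 * (loss_clean + loss_adv)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return float(loss)
```

Under this sketch, the evaluation the abstract reports would compare test-set MSE for (a) the undefended model on clean inputs, (b) the undefended model on FGSM-perturbed inputs, and (c) the adversarially trained model on FGSM-perturbed inputs; the paper's finding is that (c) stays very close to (a).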