Bibliography
With the development of networks, network security has become a topic of increasing concern. In recent years, machine learning has become an effective means of network intrusion detection. However, machine learning requires large amounts of training data, and that data often contains private information, which carries a substantial risk of privacy leakage. At present, there is little research on data privacy protection in the field of intrusion detection. To address this privacy issue, we combine differential privacy with machine learning algorithms, including the One-Class Support Vector Machine (OCSVM) and the Local Outlier Factor (LOF), to propose a hybrid intrusion detection system (IDS) with privacy protection. We add Laplace noise to the original network intrusion detection dataset to obtain differentially private datasets with different privacy budgets, and we propose a hybrid machine-learning-based IDS model to verify their utility. Experiments show that, while protecting data privacy, the hybrid IDS achieves detection accuracy comparable to that of traditional machine learning algorithms.
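A minimal sketch of the Laplace perturbation step this abstract describes, under the assumption that the numeric features are scaled to [0, 1] so the per-record sensitivity of each feature is 1 (the paper's exact preprocessing and sensitivity analysis are not specified here):

```python
import numpy as np

def laplace_perturb(X, epsilon, sensitivity=1.0, rng=None):
    """Add Laplace noise to each numeric feature of X.

    Assumes features are scaled to [0, 1], so sensitivity = 1.0;
    the noise scale for epsilon-differential privacy is sensitivity / epsilon.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return X + rng.laplace(loc=0.0, scale=scale, size=X.shape)

# Produce perturbed copies of the dataset for several privacy budgets,
# as the abstract describes (X here is a placeholder feature matrix).
X = np.random.default_rng(0).random((1000, 41))
private_versions = {eps: laplace_perturb(X, eps) for eps in (0.1, 0.5, 1.0, 5.0)}
```

Smaller budgets yield noisier (more private, less useful) datasets, which is the privacy-utility trade-off the hybrid IDS is then used to evaluate.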
Keystroke dynamics is a behavioural biometric form of authentication based on an individual's inherent typing behaviour. As this technique gains traction, protecting users' privacy is of utmost importance. Fully Homomorphic Encryption (FHE) is a technique that allows computation on encrypted data, enabling the processing of sensitive data in an untrusted environment. FHE is also considered "future-proof": it is a lattice-based cryptosystem regarded as quantum-safe, and it has seen significant performance improvements over the years alongside increasingly developer-friendly tooling. We propose a neural network for keystroke analysis that is trained using differential privacy to speed up training while preserving privacy, and that predicts on encrypted data using FHE to keep users' privacy intact while offering sufficient usability.
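A sketch of what FHE inference on keystroke features can look like, using the TenSEAL library with the CKKS scheme (the library choice, the toy weights, and the feature values are assumptions; the paper's FHE stack and network architecture are not specified here). CKKS supports additions and multiplications, so a square activation stands in for non-polynomial activations like ReLU:

```python
# pip install tenseal  -- TenSEAL is an illustrative choice, not the paper's stack.
import tenseal as ts

# Client side: create an FHE context and encrypt one keystroke feature vector.
ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                 poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()                    # needed for vector-matrix products

features = [0.12, 0.34, 0.56, 0.78]           # hypothetical timing features
enc_x = ts.ckks_vector(ctx, features)

# Server side: one dense layer followed by a square activation,
# computed entirely on ciphertexts (toy 4x2 weights and bias).
W = [[0.5, -0.1], [0.3, 0.2], [-0.4, 0.1], [0.2, 0.6]]
b = [0.1, -0.2]
enc_h = enc_x.matmul(W) + b
enc_h = enc_h * enc_h                         # x^2 as an FHE-friendly activation

# Client side: only the secret-key holder can decrypt the scores.
print(enc_h.decrypt())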
Pedestrian dead reckoning (PDR) is a widely used approach to estimating locations and trajectories. Accessing location-based services with trajectory data can bring convenience, but it may also raise privacy concerns that need to be addressed. In this paper, we propose a privacy-preserving pedestrian dead reckoning framework that protects a user's trajectory privacy based on differential privacy. We introduce two metrics to quantify trajectory privacy and data utility. Our proposed privacy-preserving trajectory extraction algorithm consists of three mechanisms, for the initial locations, stride lengths, and directions. In addition, we design an adversary model based on particle filtering to evaluate the framework's performance, and we demonstrate its effectiveness on a sensor-reading dataset we collected.
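A sketch of the three-mechanism idea under simple assumptions: the privacy budget is split evenly across the initial location, the stride lengths, and the headings, and each component receives Laplace noise with an illustrative sensitivity (the paper's actual mechanisms, budget allocation, and sensitivities may differ):

```python
import numpy as np

rng = np.random.default_rng()

def perturb_pdr(x0, y0, strides, headings, eps,
                s_init=1.0, s_len=0.1, s_dir=0.26):
    """Perturb the three PDR components with Laplace noise.

    Sensitivities (metres for location and stride length, radians for
    heading) are illustrative assumptions; eps is split evenly.
    """
    e_init, e_len, e_dir = eps / 3, eps / 3, eps / 3
    nx0 = x0 + rng.laplace(scale=s_init / e_init)
    ny0 = y0 + rng.laplace(scale=s_init / e_init)
    n_strides = strides + rng.laplace(scale=s_len / e_len, size=len(strides))
    n_headings = headings + rng.laplace(scale=s_dir / e_dir, size=len(headings))
    # Re-run dead reckoning on the noisy components to get the released trajectory.
    xs = nx0 + np.cumsum(n_strides * np.cos(n_headings))
    ys = ny0 + np.cumsum(n_strides * np.sin(n_headings))
    return xs, ys

strides = np.full(50, 0.7)                      # hypothetical 0.7 m strides
headings = np.linspace(0.0, np.pi / 2, 50)      # hypothetical gentle turn
xs, ys = perturb_pdr(0.0, 0.0, strides, headings, eps=1.0)
```

An adversary such as the particle filter mentioned in the abstract would then try to recover the true trajectory from (xs, ys), which is what the privacy metric measures.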
Differential privacy is a concept for quantifying the disclosure of private information, controlled by the privacy parameter ε. However, an intuitive interpretation of ε is needed to explain the privacy loss to data engineers and data subjects. In this paper, we conduct a worst-case study of differential privacy risks. We generalize an existing model and reduce its complexity to provide more understandable statements on the privacy loss. To this end, we analyze the impact of the parameters and introduce the notions of a global privacy risk and a global privacy leak.
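The paper's own risk definitions are not reproduced here, but the kind of worst-case statement such analyses build on is the standard Bayesian bound: an ε-DP mechanism constrains the likelihood ratio of any output to at most e^ε, so an adversary's posterior belief that a target record is present is bounded as sketched below:

```python
import numpy as np

def worst_case_posterior(prior, eps):
    """Upper bound on an adversary's posterior belief that a target
    record is in the data, given a prior and an eps-DP output:
    the likelihood ratio of any single output is at most e^eps.
    """
    return prior * np.exp(eps) / (prior * np.exp(eps) + (1 - prior))

# With an uninformed prior of 0.5, the worst-case posterior is e^eps / (1 + e^eps).
for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"eps={eps:>4}: posterior <= {worst_case_posterior(0.5, eps):.3f}")
```

For example, ε = 1 caps the posterior at about 0.73 from a 0.5 prior, while ε = 5 allows it to reach about 0.99, which illustrates why an intuitive reading of ε matters to data subjects.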