Biblio

Filters: Author is He, Jianping
Zheng, Chengxu, Wang, Xiaopeng, Luo, Xiaoyu, Fang, Chongrong, He, Jianping.  2022.  An OpenPLC-based Active Real-time Anomaly Detection Framework for Industrial Control Systems. 2022 China Automation Congress (CAC). :5899–5904.
In recent years, the design of anomaly detectors has attracted a tremendous surge of interest due to security issues in industrial control systems (ICS). Restricted by hardware resources, most anomaly detectors can only be deployed at remote monitoring ends, far away from the control sites, which poses potential threats to anomaly detection. In this paper, we propose an active real-time anomaly detection framework deployed in the controller of OpenPLC, a standardized open-source PLC with high scalability. Specifically, we add adaptive active noise to the control signals, then identify a linear dynamic system model of the plant offline and implement it in the controller. Finally, we design two filters to process the estimated residuals based on the obtained model and use a χ2 detector for anomaly detection. Extensive experiments are conducted on an industrial control virtual platform to show the effectiveness of the proposed detection framework.
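The abstract describes a standard residual-based χ2 test; a minimal Python sketch of that test is given below (not the authors' OpenPLC implementation; the output matrix, residual covariance, and false-alarm level are placeholder assumptions):

# Minimal sketch of a residual-based chi-squared anomaly detector.
# C, the residual covariance Sigma, and the alarm probability are
# illustrative placeholders, not values from the paper.
import numpy as np
from scipy.stats import chi2

C = np.array([[1.0, 0.0]])                 # assumed output matrix of the identified model
Sigma = np.array([[0.05]])                 # assumed residual covariance
Sigma_inv = np.linalg.inv(Sigma)
threshold = chi2.ppf(0.99, df=C.shape[0])  # 1% false-alarm rate

def chi2_detect(y_meas, x_est):
    """Return (is_anomaly, test_statistic) for one measurement sample."""
    r = y_meas - C @ x_est                 # output residual
    g = float(r @ Sigma_inv @ r)           # normalized chi-squared statistic
    return g > threshold, g

x_est = np.array([1.0, 0.5])
print(chi2_detect(C @ x_est + 0.01, x_est))  # nominal residual -> no alarm
print(chi2_detect(C @ x_est + 1.0, x_est))   # injected bias -> alarm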
Qin, Shuying, Fang, Chongrong, He, Jianping.  2022.  Towards Characterization of General Conditions for Correlated Differential Privacy. 2022 IEEE 19th International Conference on Mobile Ad Hoc and Smart Systems (MASS). :364–372.
Differential privacy is a widely used metric that provides rigorous privacy definitions and strong privacy guarantees. Most existing studies on differential privacy are based on datasets whose tuples are independent, and thus are not suitable for correlated data protection. In this paper, we focus on correlated differential privacy, taking into account the data correlations and the prior knowledge of the initial data. The data correlations are modeled by Bayesian conditional probabilities, and the prior knowledge refers to the exact values of the data. We propose general correlated differential privacy conditions for discrete and continuous random noise-adding mechanisms, respectively. When these conditions are inaccurate due to insufficient prior knowledge, we introduce tuple dependence based on rough set theory to improve them. The obtained theoretical results reveal the relationship between the correlations and the privacy parameters. Moreover, the improved privacy condition helps strengthen the utility of the mechanism. Finally, evaluations are conducted over a micro-grid system to verify the privacy protection levels and the utility guaranteed by correlated differentially private mechanisms.
ISSN: 2155-6814
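For context, the conditions above are stated for standard random noise-adding mechanisms; a generic Laplace noise-adding release can be sketched as follows (a minimal Python illustration with a hypothetical query, sensitivity, and ε; the paper's correlated-privacy conditions are not reproduced here):

# Minimal sketch of a Laplace noise-adding mechanism for epsilon-DP.
# The query, sensitivity, and epsilon are illustrative assumptions; the
# paper's correlated-privacy conditions would further constrain such choices.
import numpy as np

rng = np.random.default_rng(0)

def laplace_release(true_answer, sensitivity, epsilon):
    """Release a query answer with Laplace noise of scale sensitivity/epsilon."""
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: releasing the mean load of a toy micro-grid dataset.
loads = np.array([3.2, 2.8, 4.1, 3.6])  # toy data, not real measurements
sensitivity = 5.0 / len(loads)          # assumed per-record range divided by n
print(laplace_release(loads.mean(), sensitivity, epsilon=0.5))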
Sun, Mingjing, Zhao, Chengcheng, He, Jianping.  2020.  Privacy-Preserving Correlated Data Publication with a Noise Adding Mechanism. 2020 IEEE 16th International Conference on Control Automation (ICCA). :494–499.
The privacy issue in data publication is critical and has been extensively studied. However, most existing works assume that the data to be published are independent, i.e., the correlation among data is neglected. Such correlation is unavoidable in data publication, since data universally exhibit intrinsic correlations owing to social, behavioral, and genetic relationships. In this paper, we investigate the privacy concern of data publication where deterministic and probabilistic correlations are considered, respectively. Specifically, (ε,δ)-multi-dimensional data privacy (MDDP) is proposed to quantify correlated data privacy: it characterizes the probability that the published data are disclosed, within a given accuracy, by a joint estimation that exploits the correlation. Then, we explore the effects of deterministic correlations on privacy disclosure and show that the successful disclosure rate is higher when the correlation is known than when it is not. Meanwhile, a closed-form solution of the optimal disclosure probability and a strict bound on the privacy disclosure gain are derived. Extensive simulations on a real dataset verify our analytical results.
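As a rough illustration of the disclosure effect described above (a toy Monte Carlo in Python under assumed parameters, not the paper's MDDP analysis): if two published noisy values are linked by a known deterministic relation such as x2 = 2*x1, an adversary can fuse both releases and recover x1 within a fixed accuracy more often than from the first release alone.

# Toy Monte Carlo: a known deterministic correlation (x2 = 2*x1) lets an
# adversary fuse two noisy releases and disclose x1 more often.
# All parameters (x1, noise scale, tolerance) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
x1, scale, tol, trials = 10.0, 1.0, 0.5, 100_000

y1 = x1 + rng.laplace(0.0, scale, trials)        # noisy release of x1
y2 = 2.0 * x1 + rng.laplace(0.0, scale, trials)  # noisy release of x2 = 2*x1

est_without = y1                  # adversary ignores the correlation
est_with = (y1 + y2 / 2.0) / 2.0  # adversary fuses both releases via x1 = x2 / 2

rate_without = np.mean(np.abs(est_without - x1) < tol)
rate_with = np.mean(np.abs(est_with - x1) < tol)
print(f"disclosure rate without correlation: {rate_without:.3f}")
print(f"disclosure rate with correlation:    {rate_with:.3f}")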