Biblio

Filters: Author is Li, Xianxian
2023-05-12
Wei, Yuecen, Fu, Xingcheng, Sun, Qingyun, Peng, Hao, Wu, Jia, Wang, Jinyan, Li, Xianxian.  2022.  Heterogeneous Graph Neural Network for Privacy-Preserving Recommendation. 2022 IEEE International Conference on Data Mining (ICDM). :528–537.
With advances in deep learning, social networks are increasingly modeled as heterogeneous graphs and learned with heterogeneous graph neural networks (HGNNs). Compared with models trained on homogeneous data, HGNNs absorb many different aspects of information about individuals during training, which means more information, especially sensitive information, is embedded in the learned model. However, existing privacy-preserving methods for homogeneous graphs protect only a single type of node attribute or relationship and cannot work effectively on heterogeneous graphs because of their complexity. To address this issue, we propose HeteDP, a privacy-preserving method for heterogeneous graph neural networks based on a differential privacy mechanism that provides a double guarantee on graph features and topology. In particular, we first define a new attack scheme to reveal privacy leakage in heterogeneous graphs. We then design a two-stage pipeline framework consisting of a privacy-preserving feature encoder and a heterogeneous link reconstructor with differentially private gradient perturbation, which tolerates data diversity and defends against the attack. To better control the noise and improve model performance, we use a bi-level optimization pattern to allocate a suitable privacy budget to the two modules. Experiments on four public benchmarks show that HeteDP resists heterogeneous graph privacy leakage while maintaining strong model generalization.
ISSN: 2374-8486
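The entry above mentions differentially private gradient perturbation as one of HeteDP's building blocks. Below is a minimal, framework-agnostic sketch of the generic clip-and-noise idea (in the style of DP-SGD), not the authors' HeteDP implementation; the clipping bound, noise multiplier, and the toy linear model are illustrative assumptions only.

```python
# Sketch of differentially private gradient perturbation: clip each
# per-sample gradient, average, and add Gaussian noise. Parameters here are
# illustrative, not those of the HeteDP paper.
import numpy as np

def dp_perturbed_gradient(per_sample_grads, clip_norm=1.0,
                          noise_multiplier=1.1, rng=None):
    """Clip each per-sample gradient to `clip_norm`, average, and add
    Gaussian noise with std noise_multiplier * clip_norm / batch_size."""
    rng = rng or np.random.default_rng()
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_sample_grads),
                       size=avg.shape)
    return avg + noise

# Toy usage: one noisy SGD step for a 3-parameter linear model with squared loss.
X = np.random.default_rng(1).normal(size=(8, 3))
y = X @ np.array([0.5, -1.0, 2.0])
w = np.zeros(3)
grads = [2 * (x @ w - t) * x for x, t in zip(X, y)]   # per-sample gradients
w = w - 0.1 * dp_perturbed_gradient(grads)            # noisy update
```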
2022-07-29
Li, Xianxian, Fu, Xuemei, Yu, Feng, Shi, Zhenkui, Li, Jie, Yang, Junhao.  2021.  A Private Statistic Query Scheme for Encrypted Electronic Medical Record System. 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD). :1033–1039.
In this paper, we propose a scheme that supports statistical queries and authorized access control on an Encrypted Electronic Medical Record Database (EMDB). Unlike other schemes, it is based on Differential Privacy (DP), which protects the privacy of patients. By deploying an improved Multi-Authority Attribute-Based Encryption (MA-ABE) scheme, authorities can distribute their search capability to clients under different authorities without additional negotiation. To the best of our knowledge, there are few studies of statistical queries over encrypted data; in this work, we support differentially private statistical queries. To improve search efficiency, we use a Bloom Filter (BF) to judge whether the keywords queried by a user exist. Finally, we verify and evaluate the feasibility of the proposed scheme through experiments.
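The abstract above names two generic building blocks: a Bloom filter for keyword-existence checks and differentially private statistical queries. The sketch below illustrates both in isolation, under assumed hash functions, filter size, and epsilon; it is not the paper's scheme, and the MA-ABE encryption layer is omitted entirely.

```python
# Minimal sketch: Bloom-filter keyword check plus a Laplace-noised count query.
# All parameters and the toy keywords are illustrative assumptions.
import hashlib
import numpy as np

class BloomFilter:
    def __init__(self, size=1024, num_hashes=3):
        self.size, self.num_hashes = size, num_hashes
        self.bits = [False] * size

    def _positions(self, keyword):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{keyword}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, keyword):
        for pos in self._positions(keyword):
            self.bits[pos] = True

    def might_contain(self, keyword):
        # False positives are possible, false negatives are not.
        return all(self.bits[pos] for pos in self._positions(keyword))

def dp_count(true_count, epsilon=1.0, rng=None):
    """Laplace mechanism for a count query (sensitivity 1)."""
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(0.0, 1.0 / epsilon)

bf = BloomFilter()
for kw in ["diabetes", "hypertension"]:
    bf.add(kw)
if bf.might_contain("diabetes"):
    print(dp_count(true_count=42, epsilon=0.5))  # noisy number of matching records
```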
2020-01-06
Li, Xianxian, Luo, Chunfeng, Liu, Peng, Wang, Li-e.  2019.  Information Entropy Differential Privacy: A Differential Privacy Protection Data Method Based on Rough Set Theory. 2019 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech). :918–923.
Data have become an important asset for analysis and behavioral prediction, and the correlations between data are especially valuable. Because data involve large amounts of personal sensitive information, privacy protection has drawn academic and social concern. However, existing work assumes that records are independent of each other, which is unsuitable for correlated data: many studies either fail to achieve privacy protection or lose excessive information when data correlations are exploited. Differential privacy achieves privacy protection by injecting random noise into statistical results, but correlations in the data increase the background knowledge available to adversaries. Therefore, this paper proposes an information entropy differential privacy solution for correlated data based on rough set theory. In this solution, we use rough set theory to measure the degree of association between attributes and information entropy to quantify each attribute's sensitivity. Information entropy differential privacy is achieved by clustering attributes according to their correlation and adding personalized noise to each cluster while preserving the correlations between data. Experiments show that our algorithm effectively preserves the correlations between attributes while protecting privacy.
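The abstract above describes quantifying attribute sensitivity with information entropy and adding personalized noise per attribute cluster. The sketch below shows one simple way such an entropy-weighted budget split could look; the clustering step, the budget-allocation rule, and the toy data are assumptions, and the paper's rough-set correlation measure is not reproduced here.

```python
# Sketch: Shannon entropy as an attribute-sensitivity proxy, with the total
# privacy budget split across clusters in proportion to entropy before adding
# Laplace noise to per-cluster value counts. Illustrative assumptions only.
import numpy as np
from collections import Counter

def shannon_entropy(values):
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def personalized_noisy_counts(clusters, total_epsilon=1.0, rng=None):
    """Release Laplace-noised value counts per cluster, with each cluster's
    epsilon weighted by its entropy (higher entropy -> larger budget)."""
    rng = rng or np.random.default_rng()
    entropies = {name: shannon_entropy(vals) for name, vals in clusters.items()}
    total_entropy = sum(entropies.values())
    released = {}
    for name, vals in clusters.items():
        eps = total_epsilon * entropies[name] / total_entropy
        released[name] = {v: c + rng.laplace(0.0, 1.0 / eps)
                          for v, c in Counter(vals).items()}
    return released

# Toy example: two attribute clusters from a small record set.
clusters = {"diagnosis": ["flu", "flu", "cold", "covid"],
            "age_band": ["20-29", "30-39", "30-39", "40-49"]}
print(personalized_noisy_counts(clusters, total_epsilon=2.0))
```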