
Filters: Keyword is Control Theory and Privacy
2020-09-21
Lan, Jian, Gou, Shuai, Gu, Jiayi, Li, Gang, Li, Qin.  2019.  IoT Trajectory Data Privacy Protection Based on Enhanced Mix-zone. 2019 IEEE 3rd Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC). :942–946.
Trajectory data in the Internet of Things contains rich behavioral information about users, and the Mix-zone method can be used to break the association among a user's movement trajectories. In this paper, a weighted undirected graph is used to build a mathematical model of the Mix-zone, and a user-flow-based algorithm is proposed to estimate the probability of migration between nodes in the graph. In response to attacks based on the migration probability, the traditional Mix-zone is improved. Finally, an algorithm for adaptively building enhanced Mix-zones is proposed, and simulation on real data sets shows the superiority of the algorithm.
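The abstract's user-flow-based estimation of migration probabilities can be illustrated with a minimal sketch (not the authors' code): edge weights of the undirected graph are observed user flows between zones, and the migration probability from a node is each incident flow normalized by the node's total flow. The zone names and flow counts are illustrative assumptions.

```python
# Sketch: estimating migration probabilities between Mix-zone nodes
# from observed user flows on a weighted undirected graph.
from collections import defaultdict

def migration_probabilities(flows):
    """flows: dict mapping (node_a, node_b) -> observed user flow (edge weight).
    Returns P[a][b], the estimated probability that a user at node a migrates to b."""
    weight = defaultdict(dict)
    for (a, b), w in flows.items():
        # undirected graph: each flow contributes to both directions
        weight[a][b] = weight[a].get(b, 0) + w
        weight[b][a] = weight[b].get(a, 0) + w
    probs = {}
    for a, nbrs in weight.items():
        total = sum(nbrs.values())  # total flow through node a
        probs[a] = {b: w / total for b, w in nbrs.items()}
    return probs

flows = {("z1", "z2"): 30, ("z1", "z3"): 10, ("z2", "z3"): 60}
P = migration_probabilities(flows)
print(P["z1"]["z2"])  # 0.75
```

An attacker who can estimate these probabilities can rank likely exit zones for a user entering the Mix-zone, which is exactly the attack the enhanced Mix-zone is designed to resist.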
Vasile, Mario, Groza, Bogdan.  2019.  DeMetrA - Decentralized Metering with user Anonymity and layered privacy on Blockchain. 2019 23rd International Conference on System Theory, Control and Computing (ICSTCC). :560–565.
Wear and tear are essential in establishing the market value of an asset. From shutter counters on DSLRs to odometers inside cars, specific counters that encode the degree of wear exist on most products, but malicious modification of the information they report has always been a concern. Our work explores a solution to this problem using blockchain technology, a layered encoding of product attributes, and identity-based cryptography. Merging these technologies is essential since blockchains facilitate the construction of a distributed database that is resilient to adversarial modifications, while identity-based signatures make room for a more convenient way to check the correctness of the reported values based on the name of the product and the pseudonym of the owner alone. We further reinforce security by using ownership cards built around NFC tokens. Since odometer fraud is still a major practical concern, we discuss a practical scenario centered on vehicles, but the framework can easily be extended to many other assets.
Akbay, Abdullah Basar, Wang, Weina, Zhang, Junshan.  2019.  Data Collection from Privacy-Aware Users in the Presence of Social Learning. 2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton). :679–686.
We study a model where a data collector obtains data from users through a payment mechanism to learn the underlying state from the elicited data. The private signal of each user represents her individual knowledge about the state. Through social interactions, each user can also learn noisy versions of her friends' signals, called group signals. Based on both her private signal and her group signals, each user makes strategic decisions to report a privacy-preserved version of her data to the data collector. We develop a Bayesian game-theoretic framework to study the impact of social learning on users' data-reporting strategies and devise the payment mechanism for the data collector accordingly. Our findings reveal that the Bayesian-Nash equilibrium can take the form of either a symmetric randomized response (SR) strategy or an informative non-disclosive (ND) strategy. Each user applies a generalized majority voting rule to her noisy group signals to determine which strategy to follow. When a user plays the ND strategy, she reports privacy-preserving data based entirely on her group signals, independent of her private signal, which indicates that her privacy cost is zero. Both the data collector and the users can benefit from social learning, which drives down the privacy costs and helps improve the state estimation under a given payment budget. We derive bounds on the minimum total payment required to achieve a given level of state-estimation accuracy.
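The two equilibrium ingredients named above can be sketched in a few lines: a symmetric randomized response on a binary private signal, and a majority vote over noisy binary group signals. This is a generic illustration of the mechanisms, not the paper's exact parameterization; the truth probability and signal encoding are assumptions.

```python
# Sketch of a symmetric randomized response (SR) strategy and a majority
# vote over group signals, the two building blocks named in the abstract.
import random

def randomized_response(signal: int, p_truth: float) -> int:
    """Report the binary signal truthfully with probability p_truth, else flip it."""
    return signal if random.random() < p_truth else 1 - signal

def majority_vote(group_signals) -> int:
    """Generalized majority vote over noisy binary group signals."""
    return int(sum(group_signals) > len(group_signals) / 2)

random.seed(0)
reports = [randomized_response(1, 0.9) for _ in range(1000)]
print(sum(reports) / 1000)   # fraction of truthful reports, close to 0.9
print(majority_vote([1, 1, 0]))
```

Under the ND strategy described above, a user would report `majority_vote(group_signals)` directly, never touching her private signal, which is why her privacy cost is zero.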
Arrieta, Miguel, Esnaola, Iñaki, Effros, Michelle.  2019.  Universal Privacy Guarantees for Smart Meters. 2019 IEEE International Symposium on Information Theory (ISIT). :2154–2158.
Smart meters enable improvements in electricity distribution system efficiency at some cost in customer privacy. Users with home batteries can mitigate this privacy loss by applying charging policies that mask their underlying energy use. A battery charging policy is proposed and shown to provide universal privacy guarantees subject to a constraint on energy cost. The guarantee bounds our strategy's maximal information leakage from the user to the utility provider under general stochastic models of user energy consumption. The policy construction adapts coding strategies for non-probabilistic permuting channels to this privacy problem.
Pedram, Ali Reza, Tanaka, Takashi, Hale, Matthew.  2019.  Bidirectional Information Flow and the Roles of Privacy Masks in Cloud-Based Control. 2019 IEEE Information Theory Workshop (ITW). :1–5.
We consider a cloud-based control architecture for a linear plant with Gaussian process noise, where the state of the plant contains a client's sensitive information. We assume that the cloud tries to estimate the state while executing a designated control algorithm. The mutual information between the client's actual state and the cloud's estimate is adopted as a measure of privacy loss. We discuss the necessity of uplink and downlink privacy masks. After observing that privacy is not necessarily a monotone function of the noise levels of privacy masks, we discuss the joint design procedure for uplink and downlink privacy masks. Finally, the trade-off between privacy and control performance is explored.
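For intuition on the mutual-information privacy measure used above, consider the standard scalar special case: a Gaussian state observed through an additive Gaussian privacy mask. For a single mask the leakage is monotone in the noise level, which makes the paper's observation, that monotonicity can fail once uplink and downlink masks interact in the control loop, notable. The formula below is the textbook Gaussian-channel result, not the paper's joint design.

```python
# Scalar Gaussian illustration of mutual-information privacy loss:
# for state x ~ N(0, var_x) and mask noise n ~ N(0, var_n),
# I(x; x + n) = 0.5 * log(1 + var_x / var_n).
import math

def gaussian_leakage(var_x: float, var_n: float) -> float:
    """Mutual information (in nats) between the state and its masked observation."""
    return 0.5 * math.log(1.0 + var_x / var_n)

print(gaussian_leakage(1.0, 1.0))  # ≈ 0.347 nats
print(gaussian_leakage(1.0, 9.0))  # ≈ 0.053 nats: more mask noise, less leakage
```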
Sultangazin, Alimzhan, Tabuada, Paulo.  2019.  Symmetries and privacy in control over the cloud: uncertainty sets and side knowledge*. 2019 IEEE 58th Conference on Decision and Control (CDC). :7209–7214.
Control algorithms, like model predictive control, can be computationally expensive and may benefit from being executed over the cloud. This is especially the case for nodes at the edge of a network since they tend to have reduced computational capabilities. However, control over the cloud requires transmission of sensitive data (e.g., system dynamics, measurements) which undermines privacy of these nodes. When choosing a method to protect the privacy of these data, efficiency must be considered to the same extent as privacy guarantees to ensure adequate control performance. In this paper, we review a transformation-based method for protecting privacy, previously introduced by the authors, and quantify the level of privacy it provides. Moreover, we also consider the case of adversaries with side knowledge and quantify how much privacy is lost as a function of the side knowledge of the adversary.
Zhang, Xuejun, Chen, Qian, Peng, Xiaohui, Jiang, Xinlong.  2019.  Differential Privacy-Based Indoor Localization Privacy Protection in Edge Computing. 2019 IEEE SmartWorld, Ubiquitous Intelligence Computing, Advanced Trusted Computing, Scalable Computing Communications, Cloud Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI). :491–496.

With the popularity of smart devices and the widespread use of Wi-Fi-based indoor localization, edge computing is becoming the mainstream paradigm for processing massive sensing data to deliver indoor localization services. However, the data conveyed to train the localization model unintentionally contain sensitive information about users and devices, and releasing them without any protection may cause serious privacy leakage. To solve this issue, we propose a lightweight differential privacy-preserving mechanism for the edge computing environment. We extend ε-differential privacy theory to a mature machine-learning localization technique to achieve privacy protection while training the localization model. Experimental results on multiple real-world datasets show that, compared with the original localization technique without privacy preservation, our proposed scheme can achieve high indoor-localization accuracy while providing a differential privacy guarantee. By regulating the value of ε, the data-quality loss of our method can be kept within 8.9% and the time consumption is almost negligible. Therefore, our scheme can be efficiently applied in edge networks and provides guidance on indoor localization privacy protection in edge computing.
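The ε-differential privacy building block that schemes of this kind typically apply to training data is the Laplace mechanism. The sketch below shows the mechanism in its standard form; applying it to a Wi-Fi signal-strength reading, and the sensitivity value used, are illustrative assumptions rather than the paper's exact pipeline.

```python
# Laplace mechanism sketch: adding Laplace(0, sensitivity/epsilon) noise
# to a numeric value satisfies epsilon-differential privacy.
import random

def laplace_mechanism(value: float, sensitivity: float, epsilon: float) -> float:
    scale = sensitivity / epsilon
    # the difference of two Exp(1) variables, scaled by b, is Laplace(0, b)
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return value + noise

random.seed(42)
rss = -68.0  # a Wi-Fi RSS reading in dBm (illustrative)
noisy_rss = laplace_mechanism(rss, sensitivity=2.0, epsilon=1.0)
```

Smaller ε means a larger noise scale and stronger privacy at the cost of data quality, which is the ε trade-off the experiments above regulate.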

Zhang, Xianzhen, Chen, Zhanfang, Gong, Yue, Liu, Wen.  2019.  A Access Control Model of Associated Data Sets Based on Game Theory. 2019 International Conference on Machine Learning, Big Data and Business Intelligence (MLBDBI). :1–4.
With the popularity and rapid development of Internet applications, the processes of using and sharing data may lead to the divulgence of sensitive information. To deal with the privacy-protection issue more effectively, in this paper we propose an associated-data-sets protection model based on game theory: when an access that could expose private information is about to happen, the model quantifies the extent to which the visitor would gain sensitive information, compares it against the tolerance of the sensitive information's owner, and finally decides whether to allow the visitor's access request.
Ding, Hongfa, Peng, Changgen, Tian, Youliang, Xiang, Shuwen.  2019.  A Game Theoretical Analysis of Risk Adaptive Access Control for Privacy Preserving. 2019 International Conference on Networking and Network Applications (NaNA). :253–258.

More and more security and privacy issues arise as new technologies, such as big data and cloud computing, are widely applied nowadays. To decrease privacy breaches in access control systems under open and cross-domain environments, in this paper we suggest a game- and risk-based access model for privacy preservation, employing Shannon information and game theory. After defining the notions of Privacy Risk and Privacy Violation Access, a high-level framework of game-theoretic risk-based access control is proposed. Further, we present formulas for estimating the risk value of an access request and of a user, and we construct and analyze the game model of the proposed access control using a multi-stage two-player game. A sub-game perfect Nash equilibrium exists at each stage of the risk-based access control game, and it is suitable for protecting privacy by limiting privacy-violating access requests.
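A Shannon-information-flavored risk score of the kind the framework above estimates can be sketched as follows: the rarer (more informative) the requested attribute values, the higher the risk of the access request. The scoring itself is an illustrative assumption, not the paper's exact formula.

```python
# Sketch: self-information based privacy risk of an access request.
# Each requested attribute value with population probability p contributes
# -log2(p) bits; rarer values reveal more and thus carry more risk.
import math

def privacy_risk(attribute_probs) -> float:
    """Total self-information (bits) of the requested attribute values."""
    return sum(-math.log2(p) for p in attribute_probs)

print(privacy_risk([0.5, 0.5]))  # 2.0 bits
print(privacy_risk([0.01]))      # ≈ 6.64 bits: one rare value is riskier
```

An access-control monitor would then compare this score against the owner's tolerance threshold before granting the request.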

2020-04-20
Takbiri, Nazanin, Shao, Xiaozhe, Gao, Lixin, Pishro-Nik, Hossein.  2019.  Improving Privacy in Graphs Through Node Addition. 2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton). :487–494.

The rapid growth of computer systems that generate graph data necessitates privacy-preserving mechanisms to protect users' identities. Since structure-based de-anonymization attacks can reveal users' identities even when the graph is anonymized by naïve ID removal, k-anonymity has recently been proposed to secure users' privacy against structure-based attacks. Most prior work ensures graph privacy using fake edges; however, in some applications edge addition or deletion might significantly change key properties of the graph. Motivated by this fact, in this paper we introduce a novel method that ensures privacy by adding fake nodes to the graph. First, we present a novel model that provides k-anonymity against one of the strongest attacks, the seed-based attack, in which the adversary knows a partial mapping between the main graph and the graph generated by the privacy-preserving mechanism. We show that even if the adversary knows the mapping of all of the nodes except one, the last node can still have k-anonymity privacy. Then, we turn our attention to the privacy of graphs generated by inter-domain routing against degree attacks, in which the degree sequence of the graph is known to the adversary. To ensure the privacy of networks against this attack, we propose a novel method that adds fake nodes so that the degrees of all nodes have the same expected value.
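The property a degree-attack defense must establish can be checked with a short sketch (illustrative, not the authors' algorithm): a degree sequence is k-anonymous when every degree value is shared by at least k nodes, so no node can be singled out by its degree alone.

```python
# Sketch: checking k-anonymity of a graph's degree sequence, the property
# that defeats a degree attack where the adversary knows all node degrees.
from collections import Counter

def is_degree_k_anonymous(degrees, k: int) -> bool:
    """True if every degree value occurs at least k times in the sequence."""
    return all(count >= k for count in Counter(degrees).values())

print(is_degree_k_anonymous([2, 2, 3, 3, 3], 2))  # True
print(is_degree_k_anonymous([2, 2, 3, 3, 4], 2))  # False: degree 4 occurs once
```

A node-addition defense would add fake nodes (and their incident edges) until a check like this passes, rather than perturbing the original edges.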

2019-12-16
Hou, Ming, Li, Dequan, Wu, Xiongjun, Shen, Xiuyu.  2019.  Differential Privacy of Online Distributed Optimization under Adversarial Nodes. 2019 Chinese Control Conference (CCC). :2172–2177.

Nowadays, many applications involve big data, and big data analysis methods appear in many fields. As a preliminary attempt to address the challenges of big data analysis, this paper presents a distributed online learning algorithm based on differential privacy. Since online learning can effectively process sensitive data, we introduce the concept of differential privacy into distributed online learning algorithms, with the aim of ensuring data privacy during online learning and preventing adversarial nodes from inferring any important data information. In particular, for different adversary models, we consider different types of graphs to tolerate a limited number of adversaries near each regular node or a globally limited number of adversaries.
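The core idea described above, each regular node taking an online gradient step on its local loss and perturbing the value it shares with neighbors so adversarial nodes cannot infer its data, can be sketched as follows. The quadratic loss, step size, and Laplace noise scale are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: one differentially private online-gradient update, where the node
# keeps its true state locally and broadcasts only a noise-perturbed copy.
import random

def private_online_step(x: float, data_point: float, step: float, noise_scale: float):
    """Update for the local loss f(x) = 0.5 * (x - data_point)^2.
    Returns (new local state, noisy value shared with neighbors)."""
    grad = x - data_point
    x_new = x - step * grad
    # Laplace noise via the difference of two Exp(1) variables
    noise = noise_scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return x_new, x_new + noise

random.seed(1)
x = 0.0
for d in [1.0, 1.2, 0.8, 1.1]:   # streaming local data points
    x, shared = private_online_step(x, d, step=0.5, noise_scale=0.1)
```

Only `shared` ever leaves the node, so an adversarial neighbor observes the learning trajectory through Laplace noise while the local state `x` still tracks the data.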