
SoS Newsletter- Advanced Book Block

 

Privacy Models, 2014

 

Privacy issues have emerged as a major area of interest and research. As with so much in the Science of Security, efforts to chart the scope of the problem and to develop models for visualizing privacy are of prime interest. The articles cited here appeared in 2014.


 

Hermans, J.; Peeters, R.; Preneel, B., "Proper RFID Privacy: Model and Protocols," Mobile Computing, IEEE Transactions on, vol. 13, no. 12, pp. 2888-2902, 1 Dec. 2014. doi: 10.1109/TMC.2014.2314127
Abstract: We approach RFID privacy from both a modelling and a protocol point of view. Our privacy model avoids the drawbacks of several proposed RFID privacy models that either suffer from insufficient generality or put forward unrealistic assumptions regarding the adversary's ability to corrupt tags. Furthermore, our model can handle multiple readers and introduces two new privacy notions to capture the recently discovered insider attackers. We analyse multiple existing RFID protocols, demonstrating the easy applicability of our model, and propose a new wide-forward-insider private RFID authentication protocol. This protocol provides sufficient privacy guarantees for most practical applications and is the most efficient of its kind, requiring only two scalar-EC point multiplications.
Keywords: protocols; radiofrequency identification; telecommunication security; RFID authentication protocol; RFID privacy models; insider attackers; privacy notions; proper RFID privacy; Authentication; Computational modeling; Privacy; Protocols; Radiofrequency identification; Computer security; RFID tags; authentication; cryptography; privacy (ID#: 15-5225)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6779604&isnumber=6939756
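
The protocol summarized above is notable for requiring only two scalar elliptic-curve point multiplications on the tag. The sketch below illustrates, over a toy curve, how a randomized EC-based tag response of exactly that cost can work (an ElGamal-style blinding of the tag identity); it is a generic illustration under assumed keys and curve parameters, not the wide-forward-insider protocol of Hermans et al.

```python
import random

# Toy curve y^2 = x^3 + 2x + 2 over GF(17); G = (5, 1) is a small textbook generator.
P_MOD, A = 17, 2
G = (5, 1)

def ec_add(p, q):
    """Affine point addition; None represents the point at infinity."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def ec_mul(k, p):
    """Double-and-add scalar multiplication."""
    result, addend = None, p
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

# Reader key pair and a tag identity point (all values illustrative).
reader_secret = 7
reader_pub = ec_mul(reader_secret, G)
tag_id_point = ec_mul(11, G)

# Tag response: exactly two scalar multiplications (r*G and r*Y), producing a
# randomized, unlinkable encryption of the tag identity.
r = random.randrange(1, 19)
c1, blind = ec_mul(r, G), ec_mul(r, reader_pub)
c2 = ec_add(tag_id_point, blind)

# Reader recovers the identity point as c2 - reader_secret*c1.
shared = ec_mul(reader_secret, c1)
recovered = ec_add(c2, (shared[0], (-shared[1]) % P_MOD))
assert recovered == tag_id_point
```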

 

Al-Jaberi, M.F.; Zainal, A., "Data Integrity and Privacy Model in Cloud Computing," Biometrics and Security Technologies (ISBAST), 2014 International Symposium on, pp. 280-284, 26-27 Aug. 2014. doi: 10.1109/ISBAST.2014.7013135
Abstract: Cloud computing is the future of the computing industry and is believed to be the next generation of computing technology. Among the major concerns in cloud computing are data integrity and privacy: clients require their data to be safe and private from any tampering or unauthorized access. This paper proposes an architecture-based model that provides data integrity verification and privacy preservation in cloud computing. Various algorithms and protocols (MD5, AES, and RSA-based PHE) are implemented by the components of this model to provide the maximum levels of integrity management and privacy preservation for data stored in a public cloud such as Amazon S3. The impact of the algorithms and protocols used to ensure data integrity and privacy is studied to test the performance of the proposed model. The prototype system showed that data integrity and privacy are ensured against unauthorized parties. The model reduces the burden of checking the integrity of data stored in the cloud by utilizing a third-party integrity checking service, and applies security mechanisms that ensure the privacy and confidentiality of the stored data.
Keywords: authorisation; cloud computing; data integrity; data privacy; cloud computing; data integrity; data privacy; unauthorized access; Cloud computing; Computational modeling; Data models; Data privacy; Encryption; Amazon S3; Cloud computing; data integrity; data privacy (ID#: 15-5226)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7013135&isnumber=7013076
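
The core integrity-checking idea in this abstract (a digest registered with a third-party checker and re-verified later) can be illustrated with a minimal sketch. It uses MD5 only because the abstract names it; the storage/auditor structures and function names are illustrative, not the authors' implementation.

```python
import hashlib

def digest(data: bytes) -> str:
    """Fingerprint of the object, computed before upload (MD5, as named in the abstract)."""
    return hashlib.md5(data).hexdigest()

def upload(storage: dict, key: str, data: bytes, audit_log: dict) -> None:
    """Client stores the object and registers its digest with a third-party checker."""
    storage[key] = data
    audit_log[key] = digest(data)

def verify(storage: dict, key: str, audit_log: dict) -> bool:
    """Third-party integrity checking service: recompute and compare digests."""
    return digest(storage.get(key, b"")) == audit_log.get(key)

# Tampering with the stored object is detected on verification.
cloud, auditor = {}, {}
upload(cloud, "report.txt", b"quarterly figures", auditor)
assert verify(cloud, "report.txt", auditor)
cloud["report.txt"] = b"altered figures"
assert not verify(cloud, "report.txt", auditor)
```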

 

Tbahriti, S.-E.; Ghedira, C.; Medjahed, B.; Mrissa, M., "Privacy-Enhanced Web Service Composition," Services Computing, IEEE Transactions on, vol. 7, no. 2, pp. 210-222, April-June 2014. doi: 10.1109/TSC.2013.18
Abstract: Data as a Service (DaaS) builds on service-oriented technologies to enable fast access to data resources on the Web. However, this paradigm raises several new privacy concerns that traditional privacy models do not handle. In addition, DaaS composition may reveal privacy-sensitive information. In this paper, we propose a formal privacy model in order to extend DaaS descriptions with privacy capabilities. The privacy model allows a service to define a privacy policy and a set of privacy requirements. We also propose a privacy-preserving DaaS composition approach that verifies the compatibility between privacy requirements and policies in a DaaS composition. We propose a negotiation mechanism that makes it possible to dynamically reconcile the privacy capabilities of services when incompatibilities arise in a composition. We validate the applicability of our proposal through a prototype implementation and a set of experiments.
Keywords: Web services; cloud computing; data privacy; Data as a Service; negotiation mechanism; privacy model; privacy policy; privacy requirements; privacy-enhanced Web service composition; privacy-preserving DaaS composition; privacy-sensitive information; DNA; Data models; Data privacy; Government; Phase change materials; Privacy; Web services; DaaS services; Service composition; negotiation; privacy (ID#: 15-5227)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6475932&isnumber=6828820
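
The compatibility-and-negotiation idea described above can be sketched as a simple attribute-level check: a composition step is admissible only if the consumer's policy stays within the producer's requirements, and negotiation shrinks the policy otherwise. The fields ("purposes", "retention_days") and the naive reconciliation rule are illustrative assumptions, not the paper's formal model.

```python
from dataclasses import dataclass

@dataclass
class Policy:            # what the consuming service intends to do with the data
    purposes: set
    retention_days: int

@dataclass
class Requirement:       # what the data-providing service demands
    allowed_purposes: set
    max_retention_days: int

def compatible(req: Requirement, pol: Policy) -> bool:
    """The policy must not exceed what the requirement allows."""
    return (pol.purposes <= req.allowed_purposes
            and pol.retention_days <= req.max_retention_days)

def negotiate(req: Requirement, pol: Policy) -> Policy:
    """Naive reconciliation: shrink the policy until it fits the requirement."""
    return Policy(purposes=pol.purposes & req.allowed_purposes,
                  retention_days=min(pol.retention_days, req.max_retention_days))

req = Requirement(allowed_purposes={"billing"}, max_retention_days=30)
pol = Policy(purposes={"billing", "marketing"}, retention_days=90)
print(compatible(req, pol))                   # False: triggers negotiation
print(compatible(req, negotiate(req, pol)))   # True after reconciliation
```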

 

Xichen Wu; Guangzhong Sun, "A Novel Dummy-Based Mechanism to Protect Privacy on Trajectories," Data Mining Workshop (ICDMW), 2014 IEEE International Conference on, pp. 1120-1125, 14 Dec. 2014. doi: 10.1109/ICDMW.2014.122
Abstract: In recent years, wireless communication technologies and accurate positioning devices enable us to enjoy various types of location-based services. However, revealing users' location information to potentially untrusted LBS providers is one of the most significant privacy threats in location-based services. The dummy-based privacy-preserving approach is a popular technique that can protect real trajectories from exposure to attackers. Moreover, it does not need a trusted third party, while still guaranteeing the quality of service. When a user requests a service, dummy trajectories anonymize the real trajectory to satisfy privacy-preserving requirements. In this paper, we propose a new privacy model that includes three reasonable privacy metrics. We also design a new algorithm, named the adaptive dummy trajectories generation algorithm (ADTGA), to derive uniformly distributed dummy trajectories. Dummy trajectories generated by our algorithm can meet stricter privacy-preserving requirements under our privacy model. The experimental results show that our proposed algorithm uses fewer dummy trajectories to satisfy the same privacy-preserving requirement than existing algorithms, and that the distribution of dummy trajectories is more uniform.
Keywords: data privacy; ADTGA; adaptive dummy trajectories generation algorithm; distributed dummy trajectories; dummy-based mechanism; dummy-based privacy-preserving approach; location-based services; privacy metrics; privacy model; privacy protection; privacy-preserving requirements; quality of service; untrusted LBS providers; user requests; users location information; wireless communication technologies; Adaptation models; Algorithm design and analysis; Educational institutions; Measurement; Privacy; Trajectory; Dummy-based anonymization; Location-based services; Trajectory privacy (ID#: 15-5228)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7022721&isnumber=7022545
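
The general dummy-trajectory idea (the LBS provider receives several plausible trajectories and cannot tell which one is real) can be illustrated with a toy generator: each dummy is the real trajectory displaced by a random offset plus small per-point jitter. This is not ADTGA and makes no attempt at its uniformity or metric guarantees; all parameters are illustrative.

```python
import random

def make_dummies(real_traj, k=4, spread=0.01, jitter=0.001):
    """Return k dummy trajectories shaped like the real one but displaced."""
    dummies = []
    for _ in range(k):
        dx = random.uniform(-spread, spread)
        dy = random.uniform(-spread, spread)
        dummy = [(x + dx + random.uniform(-jitter, jitter),
                  y + dy + random.uniform(-jitter, jitter)) for x, y in real_traj]
        dummies.append(dummy)
    return dummies

real = [(48.8566, 2.3522), (48.8570, 2.3530), (48.8581, 2.3541)]  # (lat, lon) samples
report = make_dummies(real) + [real]
random.shuffle(report)   # the request sent to the LBS contains all k+1 trajectories
print(len(report), "trajectories submitted")
```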

 

Abdolmaleki, B.; Baghery, K.; Akhbari, B.; Aref, M.R., "Attacks and Improvements on Two New-Found RFID Authentication Protocols," Telecommunications (IST), 2014 7th International Symposium on, pp. 895-900, 9-11 Sept. 2014. doi: 10.1109/ISTEL.2014.7000830
Abstract: In recent years, different RFID authentication protocols have been proposed in order to provide secure communication between Radio Frequency Identification (RFID) users. In this paper, we investigate weaknesses of two newly proposed RFID authentication protocols, put forward by Shi et al. and Liu et al. in 2014. The Ouafi-Phan privacy model is used for the privacy analysis. We show that these two protocols have weaknesses and cannot provide the security and privacy of RFID users. Furthermore, two improved protocols are proposed that eliminate the existing weaknesses in Shi et al.'s and Liu et al.'s protocols.
Keywords: cryptographic protocols; data privacy; radiofrequency identification; Ouafi-Phan privacy model; RFID authentication protocols; privacy analysis; radiofrequency identification users; secure communication; Authentication; Games; Privacy; Protocols; Radiofrequency identification; Servers; CRC; Hash function; NTRU; RFID authentication protocols; public-key; security and privacy (ID#: 15-5229)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7000830&isnumber=7000650

 

Sohrabi-Bonab, Z.; Alagheband, M.R.; Aref, M.R., "Formal Cryptanalysis of a CRC-Based RFID Authentication Protocol," Electrical Engineering (ICEE), 2014 22nd Iranian Conference on, pp. 1642-1647, 20-22 May 2014. doi: 10.1109/IranianCEE.2014.6999801
Abstract: Recently, Pang et al. proposed a secure and efficient lightweight mutual authentication protocol [1]. Their scheme is EPC Class 1 Generation 2 compatible and based on both cyclic redundancy codes (CRC) and a pseudo-random number generator (PRNG). Although the authors claimed that the proposed protocol is secure against all attacks, in this paper we use Vaudenay's privacy model to prove that the scheme supports only the lowest privacy level and is traceable as well. Furthermore, an improved scheme with higher privacy is proposed. The privacy of our proposed protocol is also proved in a formal model.
Keywords: cryptographic protocols; cyclic redundancy check codes; radiofrequency identification; random number generation; telecommunication security; CRC-based RFID lightweight mutual authentication protocol formal cryptanalysis; EPC Class 1 Generation 2; PRNG; Vaudenay privacy model; cyclic redundancy code; pseudorandom number generator; radiofrequency identification; Authentication; Cryptography; Polynomials; Privacy; Protocols; Radiofrequency identification; Standards; CRC function; Privacy; RFID authentication (ID#: 15-5230)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6999801&isnumber=6999486
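
The structural weakness behind most CRC-based authentication schemes is that a CRC is a linear function over GF(2), so an eavesdropper can combine observed messages and predict the checksum of the combination without knowing any secret. The sketch below demonstrates the property crc(a XOR b) = crc(a) XOR crc(b) for a zero-initialized CRC-16; it illustrates the generic property only, not the specific attack on Pang et al.'s protocol.

```python
import os

def crc16(data: bytes, poly: int = 0x1021, init: int = 0x0000) -> int:
    """Bitwise CRC-16 (CCITT polynomial), zero initial value, no final XOR."""
    crc = init
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Linearity over GF(2): for equal-length messages the CRC of the XOR equals the
# XOR of the CRCs, which is what makes CRC unsuitable as a keyed primitive.
a, b = os.urandom(8), os.urandom(8)
assert crc16(xor_bytes(a, b)) == crc16(a) ^ crc16(b)
print("CRC is linear: related checksums can be forged")
```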

 

Nagendrakumar, S.; Aparna, R.; Ramesh, S., "A Non-Grouping Anonymity Model for Preserving Privacy in Health Data Publishing," Science Engineering and Management Research (ICSEMR), 2014 International Conference on, pp. 1-6, 27-29 Nov. 2014. doi: 10.1109/ICSEMR.2014.7043554
Abstract: Publishing health data may lead to privacy breaches, since such data contain sensitive information about individuals. Privacy-preserving data publishing (PPDP) addresses the problem of revealing sensitive data while extracting useful data. Existing privacy models are group-based anonymity models; hence, they consider the privacy of the individual only in a group-based manner, and those groups are the hunting ground for adversaries. All data re-identification attacks are based on groups of records. The observation behind our approach is that the k-anonymity problem can be viewed as a clustering problem: although k-anonymity does not insist on a particular number of clusters, it requires that each group contain at least k records. We propose a Non-Grouping Anonymity model; this gives a basic level of anonymization that prevents an individual from being re-identified from their published data.
Keywords: data privacy; electronic publishing; medical information systems; pattern clustering; security of data; PPDP; anonymization; clustering approach; data re-identification attacks; group based anonymity model; health data publishing privacy; k-anonymity problem; nongrouping anonymity model; privacy breaches; privacy model; privacy preserving data publishing; sensitive data; sensitive information; Data models; Data privacy; Loss measurement; Privacy; Publishing; Taxonomy; Vegetation; Anonymity; Privacy in Data Publishing; data Privacy; data Utility (ID#: 15-5231)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7043554&isnumber=7043537
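
For context, the conventional group-based check that this paper argues against is easy to state: a released table is k-anonymous if every combination of quasi-identifier values is shared by at least k records. The sketch below shows only that baseline check (the paper's non-grouping model is not reproduced here), with illustrative column names.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """records: list of dicts; quasi_identifiers: column names visible to an adversary."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

table = [
    {"age": "30-39", "zip": "752**", "diagnosis": "flu"},
    {"age": "30-39", "zip": "752**", "diagnosis": "asthma"},
    {"age": "40-49", "zip": "753**", "diagnosis": "diabetes"},
]
print(is_k_anonymous(table, ["age", "zip"], k=2))   # False: one group holds a single record
```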

 

Zhang, X.; Dou, W.; Pei, J.; Nepal, S.; Yang, C.; Liu, C.; Chen, J., "Proximity-Aware Local-Recoding Anonymization with MapReduce for Scalable Big Data Privacy Preservation in Cloud," Computers, IEEE Transactions on, vol. PP, no. 99, pp. 1-1, 26 September 2014. doi: 10.1109/TC.2014.2360516
Abstract: Cloud computing provides a promising, scalable IT infrastructure to support the processing of a variety of big data applications in sectors such as healthcare and business. Data sets like electronic health records in such applications often contain privacy-sensitive information, which potentially raises privacy concerns if the information is released or shared with third parties in the cloud. A practical and widely adopted technique for data privacy preservation is to anonymize data via generalization to satisfy a given privacy model. However, most existing privacy-preserving approaches, tailored to small-scale data sets, often fall short when encountering big data, due to their insufficiency or poor scalability. In this paper, we investigate the local-recoding problem for big data anonymization against proximity privacy breaches and attempt to identify a scalable solution to this problem. Specifically, we present a proximity privacy model that allows semantic proximity of sensitive values and multiple sensitive attributes, and we model the problem of local recoding as a proximity-aware clustering problem. A scalable two-phase clustering approach, consisting of a t-ancestors clustering algorithm (similar to k-means) and a proximity-aware agglomerative clustering algorithm, is proposed to address this problem. We design the algorithms with MapReduce to gain high scalability by performing data-parallel computation in the cloud. Extensive experiments on real-life data sets demonstrate that our approach significantly improves the capability of defending against proximity privacy breaches, as well as the scalability and time-efficiency of local-recoding anonymization, over existing approaches.
Keywords: Big data; Couplings; Data models; Data privacy; Numerical models; Privacy; Scalability; Big Data; Cloud Computing; Data Anonymization; MapReduce; Proximity Privacy (ID#: 15-5232)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6911981&isnumber=4358213
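
The abstract describes a k-means-like clustering phase expressed as MapReduce jobs. The plain-Python sketch below imitates one such round (a map step that assigns each record to its nearest centroid and a reduce step that averages each group); it is a generic MapReduce-style k-means step, not the authors' t-ancestors or proximity-aware agglomerative algorithms.

```python
from math import dist   # Euclidean distance, Python 3.8+

def map_phase(point, centroids):
    """Emit (nearest_centroid_index, (point, 1)) for one record."""
    idx = min(range(len(centroids)), key=lambda i: dist(point, centroids[i]))
    return idx, (point, 1)

def reduce_phase(values):
    """Average the points assigned to one centroid to get its new position."""
    dims = len(values[0][0])
    total, count = [0.0] * dims, 0
    for point, c in values:
        total = [t + p for t, p in zip(total, point)]
        count += c
    return tuple(t / count for t in total)

points = [(1.0, 1.0), (1.2, 0.9), (8.0, 8.1), (7.9, 8.3)]
centroids = [(0.0, 0.0), (10.0, 10.0)]

grouped = {}                      # shuffle stage: group map output by key
for p in points:
    key, value = map_phase(p, centroids)
    grouped.setdefault(key, []).append(value)

new_centroids = [reduce_phase(vals) for _, vals in sorted(grouped.items())]
print(new_centroids)              # centroids move toward the two point clusters
```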

 

Zhou, J.; Lin, X.; Dong, X.; Cao, Z., "PSMPA: Patient Self-controllable and Multi-level Privacy-preserving Cooperative Authentication in Distributed m-Healthcare Cloud Computing System," Parallel and Distributed Systems, IEEE Transactions on, vol. PP, no. 99, pp. 1-1, 27 March 2014. doi: 10.1109/TPDS.2014.2314119
Abstract: A distributed m-healthcare cloud computing system significantly facilitates efficient patient treatment for medical consultation by sharing personal health information among healthcare providers. However, it brings about the challenge of keeping both data confidentiality and patients' identity privacy simultaneously. Many existing access control and anonymous authentication schemes cannot be straightforwardly applied. To solve the problem, in this paper, a novel authorized accessible privacy model (AAPM) is established. Patients can authorize physicians by setting an access tree supporting flexible threshold predicates. Then, based on this model and a newly devised attribute-based designated verifier signature technique, a patient self-controllable multi-level privacy-preserving cooperative authentication scheme (PSMPA) is proposed, realizing three levels of security and privacy requirements in the distributed m-healthcare cloud computing system. The directly authorized physicians, the indirectly authorized physicians, and the unauthorized persons in medical consultation can respectively decipher the personal health information and/or verify patients' identities by satisfying the access tree with their own attribute sets. Finally, the formal security proof and simulation results illustrate that our scheme can resist various kinds of attacks and far outperforms previous ones in terms of computational, communication, and storage overhead.
Keywords: Authentication; Cloud computing; Computational modeling; Medical services; Privacy; Public key (ID#: 15-5233)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6779640&isnumber=4359390
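
The access structure described above ("an access tree supporting flexible threshold predicates") has a simple Boolean core: an internal node with threshold t is satisfied when at least t of its children are satisfied, and a leaf is satisfied when the verifier holds that attribute. The sketch below evaluates only this predicate structure with made-up attribute names; the attribute-based cryptography that enforces it in PSMPA is not reproduced.

```python
def satisfies(node, attributes):
    """node is either a leaf attribute name (str) or a (threshold, children) pair."""
    if isinstance(node, str):
        return node in attributes
    threshold, children = node
    return sum(satisfies(child, attributes) for child in children) >= threshold

# Example policy: "physician" AND ("cardiology" OR "attending_of_record")
access_tree = (2, ["physician", (1, ["cardiology", "attending_of_record"])])

print(satisfies(access_tree, {"physician", "cardiology"}))   # True
print(satisfies(access_tree, {"physician"}))                 # False
print(satisfies(access_tree, {"nurse", "cardiology"}))       # False
```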

 

Yilin Shen; Hongxia Jin, "Privacy-Preserving Personalized Recommendation: An Instance-Based Approach via Differential Privacy," Data Mining (ICDM), 2014 IEEE International Conference on, pp. 540-549, 14-17 Dec. 2014. doi: 10.1109/ICDM.2014.140
Abstract: Recommender systems have become increasingly popular and widely applied nowadays. The release of users' private data is required to provide users with accurate recommendations, yet this has been shown to put users at risk. Unfortunately, existing privacy-preserving methods are either developed under trusted-server settings with impractical private recommender systems or lack strong privacy guarantees. In this paper, we develop the first lightweight and provably private solution for personalized recommendation under untrusted-server settings. In this novel setting, users' private data is obfuscated before leaving their private devices, giving users greater control over their data and service providers less responsibility for privacy protection. More importantly, our approach enables existing recommender systems (with no changes needed) to directly use perturbed data, rendering our solution very desirable in practice. We build our data perturbation approach on differential privacy, the state-of-the-art privacy model with lightweight computation and strong, provable privacy guarantees. In order to achieve useful and feasible perturbations, we first design a novel relaxed admissible mechanism enabling the injection of flexible instance-based noise. Using this mechanism, our data perturbation approach, incorporating noise calibration and learning techniques, obtains perturbed user data with both theoretical privacy and utility guarantees. Our empirical evaluation on large-scale real-world datasets not only shows high recommendation accuracy but also illustrates the negligible computational overhead on both personal computers and smart phones. As such, we are able to meet two contradictory goals: privacy preservation and recommendation accuracy. This practical technology helps to gain user adoption through strong privacy protection and benefits companies with high-quality personalized services on perturbed user data.
Keywords: calibration; data privacy; personal computing; recommender systems; trusted computing; computational overhead; data perturbation; differential privacy; high quality personalized services; noise calibration; perturbed user data; privacy preservation; privacy protections; privacy-preserving methods; privacy-preserving personalized recommendation; private recommender systems; provable privacy guarantees; recommendation accuracy; smart phones; strong privacy protection; theoretical privacy; untrusted server settings; user adoption; user private data; utility guarantees; Aggregates; Data privacy; Noise; Privacy; Sensitivity; Servers; Vectors; Data Perturbation; Differential Privacy; Learning and Optimization; Probabilistic Analysis; Recommender System (ID#: 15-5234)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7023371&isnumber=7023305
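
The baseline building block behind this kind of on-device perturbation is the Laplace mechanism: add noise with scale sensitivity/epsilon to each value before it leaves the device. The sketch below shows only that textbook baseline with illustrative epsilon and sensitivity values; the paper's relaxed admissible, instance-based mechanism is more refined and is not reproduced here.

```python
import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale), sampled as the difference of two exponential variates."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def perturb_ratings(ratings, epsilon=1.0, sensitivity=1.0):
    """Add noise with scale = sensitivity / epsilon to each rating, on the device."""
    scale = sensitivity / epsilon
    return [r + laplace_noise(scale) for r in ratings]

private_ratings = [4.0, 2.0, 5.0]           # never leaves the device in the clear
print(perturb_ratings(private_ratings))     # only this noisy version is sent out
```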

 

Abbasi, Khurrum Mustafa; ul Haq, Irfan; Malik, Ahmad Kamran; Khalid, Shehzad; Fazil, Saba; Durad, Hanif, "On Access Control of Cloud Service Chains," Multi-Topic Conference (INMIC), 2014 IEEE 17th International, pp. 293-298, 8-10 Dec. 2014. doi: 10.1109/INMIC.2014.7097354
Abstract: Service-oriented architecture may be regarded as an incubator for small resource entrepreneurs to bid and work on bigger projects. It also helps large enterprises trade their resources at various levels, which has opened new gateways for renting out resources. Sometimes a single service is sold at different levels, making the Cloud service a supply chain of added value. This supply chain, built on the same resources but with varying claims of ownership, poses novel challenges related to the security, trust, and privacy of data. There is still no widely accepted governing body or system that can glue together the participating stakeholders through mutual trust and organizational policies. A governing mechanism that can address stakeholders' privacy concerns and resolve their conflicts throughout the emerging service chains is also non-existent. In this paper we introduce an access control mechanism for such Cloud service chains. Building on our previous work on an SLA-based privacy model, we discuss the realization of Role-Based Access Control (RBAC) for services of a federated cloud. The main advantage of RBAC is that it provides efficient control over resources and data access. We also provide a preliminary analysis of this ongoing research.
Keywords: Access control; Automation; Engines; Mathematical model; Privacy; Service-oriented architecture; Supply chains (ID#: 15-5235)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7097354&isnumber=7096896
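
RBAC itself, which the abstract applies to federated-cloud service chains, reduces to two mappings: users hold roles and roles grant permissions. The sketch below shows that generic check with invented users, roles, and permission names; the paper's SLA-based privacy model and chain-specific policies are not reproduced.

```python
# role -> permissions granted along the service chain (names are illustrative)
ROLE_PERMISSIONS = {
    "provider_admin": {"provision", "read_usage", "read_customer_data"},
    "reseller":       {"provision", "read_usage"},
    "end_customer":   {"read_usage"},
}

# user -> roles assigned by the organization operating each link of the chain
USER_ROLES = {
    "alice@broker.example": {"reseller"},
    "bob@tenant.example":   {"end_customer"},
}

def has_permission(user: str, permission: str) -> bool:
    """A user holds a permission if any of their roles grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(has_permission("alice@broker.example", "provision"))           # True
print(has_permission("alice@broker.example", "read_customer_data"))  # False
```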

 

Lo, N.-W.; Yeh, K.-H.; Fan, C.-Y., "Leakage Detection and Risk Assessment on Privacy for Android Applications: LRPdroid," Systems Journal, IEEE, vol. PP, no. 99, pp. 1-9, 18 December 2014. doi: 10.1109/JSYST.2014.2364202
Abstract: How to identify and manage information leakage of user privacy is a crucial and sensitive topic for handheld mobile device manufacturers, telecommunication companies, and mobile device users. As the success of a financial fraud usually requires possessing a victim's private information, new types of personal identity theft and private-information acquisition attacks are developed and deployed along with various Apps in order to steal personal private information from mobile device users. With more than 50% of the smartphone market share, Android-based mobile phone vendors and Internet service providers have to face the new challenge of user privacy management. In this paper, we present LRPdroid, a user privacy analysis framework for the Android platform. The goals of LRPdroid are to achieve information leakage detection, user privacy disclosure evaluation, and privacy risk assessment for Apps installed on Android-based mobile devices. With a formally defined user privacy model, LRPdroid can effectively support mobile users in managing their own privacy risks for targeted Apps. In addition, new privacy analysis viewpoints such as user perception and leakage awareness are introduced in LRPdroid. Two general App usage scenarios are evaluated with our system prototype to show the feasibility and practicability of the LRPdroid framework for user privacy management.
Keywords: Androids; Data privacy; Humanoid robots; Mobile communication; Privacy; Smart phones; Android; information leakage; privacy disclosure; risk assessment; security (ID#: 15-5236)
URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6985559&isnumber=4357939
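
The risk-assessment stage described in this abstract can be imagined as turning observed leakage events into a per-App score weighted by how sensitive each leaked data category is. The sketch below is a purely illustrative toy: the categories, weights, and scoring formula are assumptions and do not reflect LRPdroid's actual privacy model.

```python
# Sensitivity weights per data category (illustrative assumptions only).
SENSITIVITY = {"contacts": 0.8, "location": 0.9, "device_id": 0.5, "sms": 1.0}

def app_risk(leakage_events):
    """leakage_events: list of (data_category, times_sent_off_device) pairs."""
    raw = sum(SENSITIVITY.get(cat, 0.3) * count for cat, count in leakage_events)
    return min(1.0, raw / 10.0)     # clamp to [0, 1] for reporting to the user

observed = [("location", 5), ("device_id", 2)]   # e.g., output of a leakage detector
print(f"privacy risk score: {app_risk(observed):.2f}")
```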


Note:

Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to news@scienceofsecurity.net for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.