Biblio

Filters: Keyword is Privacy Policies
2021-10-12
Liao, Guocheng, Chen, Xu, Huang, Jianwei.  2020.  Privacy Policy in Online Social Network with Targeted Advertising Business. IEEE INFOCOM 2020 - IEEE Conference on Computer Communications. :934–943.
In an online social network, users exhibit personal information to enjoy social interaction. The social network provider (SNP) exploits users' information for revenue generation through targeted advertising. The SNP can present ads to the right users efficiently, so an advertiser is more willing to pay for targeted advertising. However, over-exploitation of users' information would invade users' privacy, which would negatively impact users' social activeness. Motivated by this, we study the optimal privacy policy of the SNP with a targeted advertising business. We characterize the privacy policy in terms of the fraction of users' information that the provider should exploit, and formulate the interactions among users, advertiser, and SNP as a three-stage Stackelberg game. By carefully leveraging the supermodularity property, we reveal from the equilibrium analysis that higher information exploitation discourages users from exhibiting information, lowering the overall amount of exploited information and harming advertising revenue. We further characterize the optimal privacy policy based on the connection between users' information levels and the privacy policy. Numerical results show that the optimal policy balances users' trade-off between social benefit and privacy loss.
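As a rough numeric illustration of the equilibrium effect this abstract describes, the toy model below lets the SNP choose an exploitation fraction, lets users best-respond with an information level, and searches for the revenue-maximizing policy. All functional forms and parameter values are assumptions for illustration, not the paper's model.

```python
# Toy sketch (not the paper's model): SNP picks exploitation fraction e in
# [0, 1]; each user best-responds with an information level x that maximizes
# social benefit minus privacy loss; ad revenue is proportional to the
# exploited information e * x. Past some point, raising e makes users share
# so much less that revenue falls -- the qualitative insight above.

def user_best_response(e, benefit=1.0, privacy_cost=2.0):
    """User maximizes benefit*x - privacy_cost*(e*x)**2 over x in [0, 1]."""
    # d/dx = benefit - 2*privacy_cost*e^2*x = 0  ->  x = benefit/(2*privacy_cost*e^2)
    if e == 0:
        return 1.0
    return min(1.0, benefit / (2 * privacy_cost * e * e))

def snp_revenue(e):
    """Revenue proxy: amount of exploited information."""
    return e * user_best_response(e)

def optimal_policy(steps=1000):
    """Grid search for the revenue-maximizing exploitation fraction."""
    grid = [i / steps for i in range(steps + 1)]
    return max(grid, key=snp_revenue)
```

With these assumed parameters, revenue rises with e up to e = 0.5 and then declines as users withdraw information, so over-exploitation is self-defeating.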
Faurie, Pascal, Moldovan, Arghir-Nicolae, Tal, Irina.  2020.  Privacy Policy – "I Agree"?! – Do Alternatives to Text-Based Policies Increase the Awareness of the Users? 2020 International Conference on Cyber Security and Protection of Digital Services (Cyber Security). :1–6.
Since GDPR was introduced, there is a reinforcement of the fact that users must give their consent before their personal data can be managed by any website. However, many studies have demonstrated that users often skip these policies and click the "I agree" button to continue browsing, unaware of what the consent they gave was about, hence defeating the purpose of GDPR. This paper investigates whether different ways of presenting the privacy policy to users can change this behaviour and lead to increased user awareness of what the user agrees with. Three different types of policies were used in the study: a full-text policy, a so-called usable policy, and a video-based policy. Results demonstrated that the type of policy has a direct influence on user awareness and user satisfaction. The two alternatives to the text-based policy led to a significant increase in user awareness of the content of the policy and a significant increase in user satisfaction with the usability of the policy.
Farooq, Emmen, Nawaz Ul Ghani, M. Ahmad, Naseer, Zuhaib, Iqbal, Shaukat.  2020.  Privacy Policies' Readability Analysis of Contemporary Free Healthcare Apps. 2020 14th International Conference on Open Source Systems and Technologies (ICOSST). :1–7.
mHealth apps play a vital role in facilitating human health management. Users have to enter sensitive health-related information in these apps to fully utilize their functionality. Unauthorized sharing of sensitive health information is undesirable to users. mHealth apps also collect data beyond that required for their functionality, such as a user's surfing behavior or hardware details of the devices used. mHealth software and their developers also share such data with third parties for reasons other than providing medical support to the user, such as advertisements for medicine and health insurance plans. The existence of a comprehensive and easy-to-understand data privacy policy on user data acquisition, sharing, and management is a salient requirement of modern user privacy protection demands. Readability is one parameter by which the ease of understanding of a privacy policy is determined. In this research, the privacy policies of 27 free Android medical apps are analyzed. Apps having a user rating of 4.0 and 1 million or more downloads are included in the data set of this research. Flesch-Kincaid Reading Grade Level (RGL), SMOG, Gunning Fog, Word Count, and Flesch Reading Ease of the privacy policies are calculated. The average Reading Grade Level of the privacy policies is 8.5, slightly greater than the average adult RGL in the US. Free mHealth apps also have a large number of users in less educated parts of the world, where privacy policies with an average RGL of 8.5 may be difficult to comprehend.
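Two of the metrics this study computes have standard closed-form definitions, sketched below. The functions take word, sentence, and syllable counts as inputs; real readability tools estimate syllable counts with heuristics, which this sketch deliberately leaves out.

```python
# Standard readability formulas used in the study above.

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Reading Grade Level: US school grade of the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease: higher scores (0-100) mean easier text."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)
```

For example, a policy with 100 words, 5 sentences, and 150 syllables scores a grade level of about 9.9, close to the 8.5 average the study reports.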
Al Omar, Abdullah, Jamil, Abu Kaisar, Nur, Md. Shakhawath Hossain, Hasan, Md Mahamudul, Bosri, Rabeya, Bhuiyan, Md Zakirul Alam, Rahman, Mohammad Shahriar.  2020.  Towards A Transparent and Privacy-Preserving Healthcare Platform with Blockchain for Smart Cities. 2020 IEEE 19th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom). :1291–1296.
In smart cities, data privacy and security issues of Electronic Health Records (EHR) are gaining importance day by day, as cyber attackers have identified the weaknesses of EHR platforms. Besides, health insurance companies interacting with the EHRs play a vital role in covering the whole or a part of the financial risks of a patient. Insurance companies have specific policies for which patients have to pay. Sometimes the insurance policies can be altered by fraudulent entities. Another problem that patients face in smart cities is that when they interact with a health organization, insurance company, or others, they have to prove their identity to each of the organizations/companies separately. Health organizations and insurance companies have to ensure they know with whom they are interacting. To build a platform where a patient's personal information and insurance policy are handled securely, we introduce an application of blockchain to solve the above-mentioned issues. In this paper, we present a solution for the healthcare system that provides patient privacy and transparency towards insurance policies by incorporating blockchain. Privacy of patient information is provided using cryptographic tools.
Martiny, Karsten, Denker, Grit.  2020.  Partial Decision Overrides in a Declarative Policy Framework. 2020 IEEE 14th International Conference on Semantic Computing (ICSC). :271–278.
The ability to specify various policies with different overriding criteria allows for complex sets of sharing policies. This is particularly useful in situations in which data privacy depends on various properties of the data, and complex policies are needed to express the conditions under which data is protected. However, if overriding policy decisions constrain the affected data, decisions from overridden policies should not be suppressed completely, because they can still apply to subsets of the affected data. This article describes how a privacy policy framework can be extended with a mechanism to partially override decisions based on specified constraints. Our solution automatically generates complementary sets of decisions for both the overridden and the complementary, non-overridden subsets of the data, and thus provides a means to specify complex policies tailored to specific properties of the protected data.
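The core partitioning step described in this abstract can be sketched in a few lines: the overriding policy's constraint carves the affected records into an overridden subset (governed by the overriding decision) and a complementary subset that remains under the overridden policy. The record fields below are illustrative assumptions, not the framework's schema.

```python
# Minimal sketch of a partial decision override: split the affected records
# by the overriding policy's constraint, so each subset gets its own decision.

def partial_override(records, constraint):
    overridden = [r for r in records if constraint(r)]       # overriding decision applies
    complementary = [r for r in records if not constraint(r)]  # overridden policy still applies
    return overridden, complementary

records = [
    {"field": "location", "precision": "exact"},
    {"field": "location", "precision": "city"},
    {"field": "heart_rate", "precision": "exact"},
]
# Hypothetical overriding policy: deny only exact-precision location data.
deny, allow = partial_override(
    records, lambda r: r["field"] == "location" and r["precision"] == "exact")
```

Here the overriding deny decision suppresses only one record, while the other two still receive the overridden policy's (allow) decision.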
Zaeem, Razieh Nokhbeh, Anya, Safa, Issa, Alex, Nimergood, Jake, Rogers, Isabelle, Shah, Vinay, Srivastava, Ayush, Barber, K. Suzanne.  2020.  PrivacyCheck's Machine Learning to Digest Privacy Policies: Competitor Analysis and Usage Patterns. 2020 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT). :291–298.
Online privacy policies are lengthy and hard to comprehend. To address this problem, researchers have utilized machine learning (ML) to devise tools that automatically summarize online privacy policies for web users. One such tool is our free and publicly available browser extension, PrivacyCheck. In this paper, we enhance PrivacyCheck by adding a competitor analysis component: a part of PrivacyCheck that recommends other organizations in the same market sector with better privacy policies. We also monitored the usage patterns of about a thousand actual PrivacyCheck users, the first work to track the usage and traffic of an ML-based privacy analysis tool. Results show: (1) there is a good number of privacy policy URLs checked repeatedly by the user base; (2) the users are particularly interested in privacy policies of software services; and (3) PrivacyCheck increased the number of times a user consults privacy policies by 80%. Our work demonstrates the potential of ML-based privacy analysis tools and also sheds light on how these tools are used in practice to give users actionable knowledge they can use to proactively protect their privacy.
Tavakolan, Mona, Faridi, Ismaeel A..  2020.  Applying Privacy-Aware Policies in IoT Devices Using Privacy Metrics. 2020 International Conference on Communications, Computing, Cybersecurity, and Informatics (CCCI). :1–5.
In recent years, users' privacy has become an important aspect in the development of Internet of Things (IoT) devices. However, there has been comparatively little research so far that aims at understanding users' privacy in connection with IoT. Many users are worried about protecting their personal information, which may be gathered by IoT devices. In this paper, we present a new method for applying the user's preferences within the privacy-aware policies in IoT devices. Users can prioritize a set of extendable privacy policies based on their preferences. This is achieved by assigning weights to these policies to form ranking criteria. A privacy-aware index is then calculated based on these rankings. In addition, IoT devices can be clustered based on their privacy-aware index value.
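The weighted ranking described above can be sketched as a weight-normalized score: the user assigns weights to extendable privacy policies, and a device's privacy-aware index is the fraction of total weight covered by the policies it satisfies. The policy names and the normalization choice are illustrative assumptions, not the paper's exact formula.

```python
# Hedged sketch of a privacy-aware index from user-assigned policy weights.

def privacy_aware_index(user_weights, satisfied_policies):
    """user_weights: {policy_name: weight}; satisfied_policies: set of names.
    Returns the weight-normalized score in [0, 1]."""
    total = sum(user_weights.values())
    score = sum(w for p, w in user_weights.items() if p in satisfied_policies)
    return score / total if total else 0.0

# Hypothetical preference weights and a device satisfying two of three policies.
weights = {"no_third_party_sharing": 5, "local_processing": 3, "opt_out": 2}
index = privacy_aware_index(weights, {"no_third_party_sharing", "opt_out"})
```

Devices could then be clustered by thresholding or grouping these index values, as the abstract suggests.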
Onu, Emmanuel, Mireku Kwakye, Michael, Barker, Ken.  2020.  Contextual Privacy Policy Modeling in IoT. 2020 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech). :94–102.
The Internet of Things (IoT) has been one of the biggest revelations of the last decade. These cyber-physical systems seamlessly integrate and improve the activities in our daily lives. Hence, creating a wide application for it in several domains, such as smart buildings and cities. However, the integration of IoT also comes with privacy challenges. The privacy challenges result from the ability of these devices to pervasively collect personal data about individuals through sensors in ways that could be unknown to them. A number of research efforts have evaluated privacy policy awareness and enforcement as key components for addressing these privacy challenges. This paper provides a framework for understanding contextualized privacy policy within the IoT domain. This will enable IoT privacy researchers to better understand IoT privacy policies and their modeling.
2021-06-02
Gohari, Parham, Hale, Matthew, Topcu, Ufuk.  2020.  Privacy-Preserving Policy Synthesis in Markov Decision Processes. 2020 59th IEEE Conference on Decision and Control (CDC). :6266—6271.
In decision-making problems, the actions of an agent may reveal sensitive information that drives its decisions. For instance, a corporation's investment decisions may reveal its sensitive knowledge about market dynamics. To prevent this type of information leakage, we introduce a policy synthesis algorithm that protects the privacy of the transition probabilities in a Markov decision process. We use differential privacy as the mathematical definition of privacy. The algorithm first perturbs the transition probabilities using a mechanism that provides differential privacy. Then, based on the privatized transition probabilities, we synthesize a policy using dynamic programming. Our main contribution is to bound the "cost of privacy," i.e., the difference between the expected total rewards with privacy and the expected total rewards without privacy. We also show that computing the cost of privacy has time complexity that is polynomial in the parameters of the problem. Moreover, we establish that the cost of privacy increases with the strength of differential privacy protections, and we quantify this increase. Finally, numerical experiments on two example environments validate the established relationship between the cost of privacy and the strength of data privacy protections.
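The two-step pipeline in this abstract, privatizing transition probabilities and then planning on the privatized model, can be sketched as below. The Dirichlet-style perturbation and the concentration parameter are illustrative assumptions standing in for the paper's differentially private mechanism; the planning step is plain value iteration.

```python
import random

# Illustrative sketch (not the paper's exact mechanism): perturb each row of
# the transition matrix by sampling a probability vector centered on it via
# Gamma draws (a Dirichlet-style mechanism), then run dynamic programming on
# the privatized MDP. Assumption: larger k means weaker privacy and smaller
# perturbation, mirroring the privacy/accuracy trade-off above.

def privatize_row(row, k=50.0, rng=random):
    draws = [rng.gammavariate(k * p + 1e-9, 1.0) for p in row]
    s = sum(draws)
    return [d / s for d in draws]  # a valid probability vector near `row`

def value_iteration(P, R, gamma=0.9, iters=200):
    """P[a][s][t]: transition probabilities; R[s]: reward. Returns values V[s]."""
    n = len(R)
    V = [0.0] * n
    for _ in range(iters):
        V = [R[s] + gamma * max(sum(P[a][s][t] * V[t] for t in range(n))
                                for a in range(len(P)))
             for s in range(n)]
    return V
```

The "cost of privacy" in the paper's sense would be the gap between the values computed on the true transitions and on the privatized ones.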
2021-01-28
Fan, M., Yu, L., Chen, S., Zhou, H., Luo, X., Li, S., Liu, Y., Liu, J., Liu, T..  2020.  An Empirical Evaluation of GDPR Compliance Violations in Android mHealth Apps. 2020 IEEE 31st International Symposium on Software Reliability Engineering (ISSRE). :253—264.

The purpose of the General Data Protection Regulation (GDPR) is to provide improved privacy protection. If an app controls personal data from users, it needs to be compliant with GDPR. However, GDPR lists general rules rather than exact step-by-step guidelines about how to develop an app that fulfills the requirements. Therefore, there may exist GDPR compliance violations in existing apps, which would pose severe privacy threats to app users. In this paper, we take mobile health applications (mHealth apps) as a peephole to examine the status quo of GDPR compliance in Android apps. We first propose an automated system, named HPDROID, to bridge the semantic gap between the general rules of GDPR and the app implementations by identifying the data practices declared in the app privacy policy and the data relevant behaviors in the app code. Then, based on HPDROID, we detect three kinds of GDPR compliance violations, including the incompleteness of privacy policy, the inconsistency of data collections, and the insecurity of data transmission. We perform an empirical evaluation of 796 mHealth apps. The results reveal that 189 (23.7%) of them do not provide complete privacy policies. Moreover, 59 apps collect sensitive data through different measures, but 46 (77.9%) of them contain at least one inconsistent collection behavior. Even worse, among the 59 apps, only 8 apps try to ensure the transmission security of collected data. However, all of them contain at least one encryption or SSL misuse. Our work exposes severe privacy issues to raise awareness of privacy protection for app users and developers.

2020-04-03
Fawaz, Kassem, Linden, Thomas, Harkous, Hamza.  2019.  Invited Paper: The Applications of Machine Learning in Privacy Notice and Choice. 2019 11th International Conference on Communication Systems Networks (COMSNETS). :118—124.
For more than two decades since the rise of the World Wide Web, the “Notice and Choice” framework has been the governing practice for the disclosure of online privacy practices. The emergence of new forms of user interactions, such as voice, and the enforcement of new regulations, such as the EU's recent General Data Protection Regulation (GDPR), promise to change this privacy landscape drastically. This paper discusses the challenges of providing privacy stakeholders with privacy awareness and control in this changing landscape. We also present our recent research on utilizing machine learning to analyze privacy policies and settings.
Sadique, Farhan, Bakhshaliyev, Khalid, Springer, Jeff, Sengupta, Shamik.  2019.  A System Architecture of Cybersecurity Information Exchange with Privacy (CYBEX-P). 2019 IEEE 9th Annual Computing and Communication Workshop and Conference (CCWC). :0493—0498.
Rapid evolution of cyber threats and recent trends in the increasing number of cyber-attacks call for adopting robust and agile cybersecurity techniques. Cybersecurity information sharing is expected to play an effective role in detecting and defending against new attacks. However, reservations and organizational policies centering on the privacy of shared data have become major setbacks to large-scale collaboration in cyber defense. The situation is worsened by the fact that the benefits of cyber-information exchange are not realized unless many actors participate. In this paper, we argue that privacy preservation of shared threat data will motivate entities to share threat data. Accordingly, we propose a framework called CYBersecurity information EXchange with Privacy (CYBEX-P) to achieve this. CYBEX-P is a structured information sharing platform with integrated privacy-preserving mechanisms. We propose a complete system architecture for CYBEX-P that guarantees maximum security and privacy of data. CYBEX-P outlines the details of a cybersecurity information sharing platform. The adoption of blind processing, privacy preservation, and trusted computing paradigms makes CYBEX-P a versatile and secure information exchange platform.
Lachner, Clemens, Rausch, Thomas, Dustdar, Schahram.  2019.  Context-Aware Enforcement of Privacy Policies in Edge Computing. 2019 IEEE International Congress on Big Data (BigDataCongress). :1—6.
Privacy is a fundamental concern that confronts systems dealing with sensitive data. The lack of robust solutions for defining and enforcing privacy measures continues to hinder the general acceptance and adoption of these systems. Edge computing has been recognized as a key enabler for privacy enhanced applications, and has opened new opportunities. In this paper, we propose a novel privacy model based on context-aware edge computing. Our model leverages the context of data to make decisions about how these data need to be processed and managed to achieve privacy. Based on a scenario from the eHealth domain, we show how our generalized model can be used to implement and enact complex domain-specific privacy policies. We illustrate our approach by constructing real-world use cases involving a mobile Electronic Health Record that interacts with, and operates in, different environments.
Renjan, Arya, Narayanan, Sandeep Nair, Joshi, Karuna Pande.  2019.  A Policy Based Framework for Privacy-Respecting Deep Packet Inspection of High Velocity Network Traffic. 2019 IEEE 5th Intl Conference on Big Data Security on Cloud (BigDataSecurity), IEEE Intl Conference on High Performance and Smart Computing, (HPSC) and IEEE Intl Conference on Intelligent Data and Security (IDS). :47—52.

Deep Packet Inspection (DPI) is instrumental in investigating the presence of malicious activity in network traffic, and most existing DPI tools work on unencrypted payloads. As the internet is moving towards fully encrypted data-transfer, there is a critical requirement for privacy-aware techniques to efficiently decrypt network payloads. Until recently, passive proxying using certain aspects of TLS 1.2 was used to perform decryption and further DPI analysis. With the introduction of the TLS 1.3 standard, which only supports protocols with Perfect Forward Secrecy (PFS), many such techniques will become ineffective. Several security solutions will be forced to adopt active proxying, which will become a big-data problem considering the velocity and veracity of the network traffic involved. We have developed an ABAC (Attribute Based Access Control) framework that efficiently supports existing DPI tools while respecting users' privacy requirements and organizational policies. It gives the user the ability to accept or decline an access decision based on their privileges. Our solution evaluates various observed and derived attributes of network connections against user access privileges using policies described with semantic technologies. In this paper, we describe our framework and demonstrate the efficacy of our technique with the help of use-case scenarios to identify network connections that are candidates for Deep Packet Inspection. Since our technique makes selective identification of connections based on policies, both processing and memory load at the gateway will be reduced significantly.

Garigipati, Nagababu, Krishna, Reddy V.  2019.  A Study on Data Security and Query privacy in Cloud. 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI). :337—341.

Many organizations need effective solutions to record and evaluate today's enormous volumes of information. Cloud computing offers scalable resources and notable economic benefits through reduced operational expenditure, but this model also raises a wide set of security and privacy problems that must be taken into consideration. Multi-tenancy, loss of control, and trust are the key issues in cloud computing environments. This paper surveys present technologies and a comprehensive assortment of both earlier and state-of-the-art work on cloud security and privacy. The paradigm shift that accompanies the adoption of cloud computing increasingly demands attention to the security and privacy considerations linked with its different facets, such as multi-tenancy, trust, loss of control, and accountability. Cloud platforms that handle big data containing sensitive information therefore need to apply technical methods and organizational safeguards to avoid data protection failures that might lead to vast and costly harm.

Gerl, Armin, Becher, Stefan.  2019.  Policy-Based De-Identification Test Framework. 2019 IEEE World Congress on Services (SERVICES). 2642-939X:356—357.
Protecting the privacy of individuals is a basic right, which has to be considered in our data-centered society in which new technologies emerge rapidly. To preserve the privacy of individuals, de-identifying technologies have been developed, including pseudonymization, personal privacy anonymization, and privacy models. Each has several variations with different properties and contexts, which poses the challenge of properly selecting and applying de-identification methods. We tackle this challenge by proposing a policy-based de-identification test framework for a systematic approach to experimenting with and evaluating various combinations of methods and their interplay. Evaluation of the experimental results regarding performance and utility is considered within the framework. We propose a domain-specific language expressing the required complex configuration options, including data set, policy generator, and various de-identification methods.
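One of the de-identification methods such a framework would exercise, pseudonymization, can be sketched with a keyed hash: a secret key maps an identifier to a stable pseudonym that cannot be reversed without the key. The field names and key below are illustrative assumptions, not part of the paper's framework.

```python
import hmac
import hashlib

# Sketch of keyed pseudonymization: the same (value, key) pair always yields
# the same pseudonym, enabling joins across data sets, while a different key
# yields unlinkable pseudonyms.

def pseudonymize(value, key):
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

# Hypothetical record run through the method.
record = {"name": "Alice", "zip": "94107"}
out = {k: pseudonymize(v, key=b"secret-key") for k, v in record.items()}
```

A test framework like the one proposed would compare such methods against alternatives (e.g. anonymization) for utility and performance on the same policy-generated workloads.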
Alom, Md. Zulfikar, Carminati, Barbara, Ferrari, Elena.  2019.  Adapting Users' Privacy Preferences in Smart Environments. 2019 IEEE International Congress on Internet of Things (ICIOT). :165—172.
A smart environment is a physical space where devices are connected to provide continuous support to individuals and make their life more comfortable. For this purpose, a smart environment collects, stores, and processes a massive amount of personal data. In general, service providers collect these data according to their privacy policies. To enhance privacy control, individuals can explicitly express their privacy preferences, stating conditions on how their data have to be used and managed. Typically, privacy checking is handled through the hard matching of users' privacy preferences against service providers' privacy policies, denying all service requests whose privacy policies do not fully match with the individual's privacy preferences. However, this hard matching might be too restrictive in a smart environment because it denies services that partially satisfy the individual's privacy preferences. To cope with this challenge, in this paper, we propose a soft privacy matching mechanism, able to relax, in a controlled way, some conditions of users' privacy preferences so that they match service providers' privacy policies. To this aim, we exploit machine learning algorithms to build a classifier, which is able to make decisions on future service requests by learning which privacy preference components a user is prone to relax, as well as the relaxation tolerance. We test our approach on two realistic datasets, obtaining promising results.
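The soft-matching idea can be sketched with a simplified rule-based stand-in for the learned classifier: each preference component carries a user-assigned relaxation tolerance, and a request matches when every component is either satisfied or relaxable within that tolerance. Component names, levels, and tolerances are illustrative assumptions.

```python
# Simplified stand-in for the paper's learned soft matcher. In the paper,
# which components to relax and by how much is learned per user; here the
# tolerances are given explicitly.

def soft_match(preferences, policy):
    """preferences: {component: (required_level, tolerance)};
    policy: {component: offered_level}. Higher level = stricter privacy."""
    for comp, (required, tol) in preferences.items():
        offered = policy.get(comp, 0)       # missing component counts as level 0
        if offered < required - tol:        # gap exceeds what the user will relax
            return False
    return True

prefs = {"retention": (3, 1), "sharing": (2, 0)}
assert soft_match(prefs, {"retention": 2, "sharing": 2})      # relaxed within tolerance
assert not soft_match(prefs, {"retention": 1, "sharing": 2})  # gap too large
```

Hard matching is the special case where every tolerance is zero, which makes the relaxation the mechanism's whole contribution.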
Werner, Jorge, Westphall, Carla Merkle, Vargas, André Azevedo, Westphall, Carlos Becker.  2019.  Privacy Policies Model in Access Control. 2019 IEEE International Systems Conference (SysCon). :1—8.
With the increasing advancement of services on the Internet, driven by the growth of cloud computing, the exchange of data between providers and users is intense. Access control management and applications need data to identify users and/or perform services in an automated and more practical way. Applications have to protect access to the data collected. However, users often provide data in cloud environments without knowing what was collected, or how and by whom the data will be used. Privacy of personal data has been a challenge for information security. This paper presents the development and use of a privacy policy strategy, i.e., a privacy policy model and format that are integrated with the authorization task. An access control language and the preferences defined by the owner of the information were used to implement the proposals. The results showed that the strategy is feasible, guaranteeing users rights over their data.
Liau, David, Zaeem, Razieh Nokhbeh, Barber, K. Suzanne.  2019.  Evaluation Framework for Future Privacy Protection Systems: A Dynamic Identity Ecosystem Approach. 2019 17th International Conference on Privacy, Security and Trust (PST). :1—3.
In this paper, we leverage previous work on the Identity Ecosystem, a Bayesian network mathematical representation of a person's identity, to create a framework for evaluating identity protection systems. Information dynamics are considered, and a protection game is formed in which the owner and the attacker each gain some level of control over the status of other PII within the dynamic Identity Ecosystem. We present a policy iteration algorithm to solve for the optimal policy of the game and discuss its convergence. Finally, an evaluation and comparison of identity protection strategies is provided, pitting the optimal policy against different protection policies. This study aims to further the understanding of the evolutionary process of identity theft and to provide a framework for evaluating different identity protection strategies and future privacy protection systems.
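Policy iteration, the solution method this abstract names, alternates policy evaluation with greedy improvement until the policy stops changing. The 2-state, 2-action MDP below (an "exposed" and a "protected" state, with "ignore" and "protect" actions) is an illustrative stand-in for the Identity Ecosystem model, not the paper's formulation.

```python
# Hedged sketch of policy iteration on a tiny protection MDP.
# States: 0 = exposed, 1 = protected. Actions: 0 = ignore, 1 = protect.

def policy_iteration(P, R, gamma=0.9, eval_iters=200):
    """P[a][s][t]: transition probabilities; R[s]: reward per state."""
    n = len(R)
    policy = [0] * n
    while True:
        # Policy evaluation (iterative): value of following `policy`.
        V = [0.0] * n
        for _ in range(eval_iters):
            V = [R[s] + gamma * sum(P[policy[s]][s][t] * V[t] for t in range(n))
                 for s in range(n)]
        # Greedy policy improvement against the evaluated values.
        new_policy = [max(range(len(P)),
                          key=lambda a: sum(P[a][s][t] * V[t] for t in range(n)))
                      for s in range(n)]
        if new_policy == policy:
            return policy, V
        policy = new_policy

P = [
    [[1.0, 0.0], [0.0, 1.0]],  # ignore: stay where you are
    [[0.0, 1.0], [0.0, 1.0]],  # protect: move to (or stay in) the protected state
]
best_policy, values = policy_iteration(P, R=[0.0, 1.0])
```

With these assumed rewards, the algorithm converges to protecting in the exposed state, since the protected state's discounted value (1/(1-0.9) = 10) dominates.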
Bello-Ogunu, Emmanuel, Shehab, Mohamed, Miazi, Nazmus Sakib.  2019.  Privacy Is The Best Policy: A Framework for BLE Beacon Privacy Management. 2019 IEEE 43rd Annual Computer Software and Applications Conference (COMPSAC). 1:823—832.
Bluetooth Low Energy (BLE) beacons are an emerging type of technology in the Internet-of-Things (IoT) realm, which use BLE signals to broadcast a unique identifier that is detected by a compatible device to determine the location of nearby users. Beacons can be used to provide a tailored user experience with each encounter, yet can also constitute an invasion of privacy, due to their covertness and ability to track user behavior. Therefore, we hypothesize that user-driven privacy policy configuration is key to enabling effective and trustworthy privacy management during beacon encounters. We developed a framework for beacon privacy management that provides a policy configuration platform. Through an empirical analysis with 90 users, we evaluated this framework through a proof-of-concept app called Beacon Privacy Manager (BPM), which focused on the user experience of such a tool. Using BPM, we provided users with the ability to create privacy policies for beacons, testing different configuration schemes to refine the framework and then offer recommendations for future research.
2020-01-21
Vo, Tri Hoang, Fuhrmann, Woldemar, Fischer-Hellmann, Klaus-Peter, Furnell, Steven.  2019.  Efficient Privacy-Preserving User Identity with Purpose-Based Encryption. 2019 International Symposium on Networks, Computers and Communications (ISNCC). :1–8.
In recent years, users may store their Personal Identifiable Information (PII) in the Cloud environment so that Cloud services may access and use it on demand. When users do not store personal data in their local machines but in the Cloud, they may be interested in questions such as where their data are and who accesses it besides themselves. Even if Cloud services specify privacy policies, we cannot guarantee that they will follow their policies and will not transfer user data to another party. In the past 10 years, many efforts have been made to protect PII. They target certain issues but still have limitations. For instance, they require users to interact with the services over the frontend, they do not protect identity propagation between intermediaries or against an untrusted host, or they require Cloud services to accept a new protocol. In this paper, we propose a broader approach that covers all the above issues. We prove that our solution is efficient: the implementation can be easily adapted to existing Identity Management systems and the performance is fast. Most importantly, our approach is compliant with the General Data Protection Regulation from the European Union.
2019-11-11
Barrett, Ayodele A., Matthee, Machdel.  2018.  A Critical Analysis of Informed Use of Context-aware Technologies. Proceedings of the Annual Conference of the South African Institute of Computer Scientists and Information Technologists. :126–134.
There is a move towards a future in which consumers of technology are untethered from their devices and technology recedes to the subconscious. One way of achieving this vision is with context-aware technologies, which smartphones exemplify. Key figures in the creation of modern technologies suggest that consumers are fully informed of the implications of the use of these technologies. Typically, privacy policy documents are used both to inform users of these technologies and to gain their consent on how their personal data will be used. This paper examines opinions of African-based users of smartphones, alongside an examination of the privacy policy statement of a popular app using Critical Discourse Analysis. The analysis reveals that consumers' concerns regarding the absence of choice, a lack of knowledge, and the erosion of information privacy are not unfounded.
Martiny, Karsten, Denker, Grit.  2018.  Expiring Decisions for Stream-based Data Access in a Declarative Privacy Policy Framework. Proceedings of the 2Nd International Workshop on Multimedia Privacy and Security. :71–80.
This paper describes how a privacy policy framework can be extended with timing information to not only decide if requests for data are allowed at a given point in time, but also to decide for how long such permission is granted. Augmenting policy decisions with expiration information eliminates the need to reason about access permissions prior to every individual data access operation. This facilitates the application of privacy policy frameworks to protect multimedia streaming data where repeated re-computations of policy decisions are not a viable option. We show how timing information can be integrated into an existing declarative privacy policy framework. In particular, we discuss how to obtain valid expiration information in the presence of complex sets of policies with potentially interacting policies and varying timing information.
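The mechanism this abstract describes, attaching an expiration to each policy decision so that stream accesses within the granted window skip re-evaluation, can be sketched as a small cache in front of the policy engine. The engine itself is stubbed out; the interface and names are assumptions for illustration.

```python
import time

# Minimal sketch of expiring access decisions for stream data: each decision
# is cached with an expiry time, so repeated reads within the window avoid
# recomputing the policy decision.

class ExpiringDecisionCache:
    def __init__(self, evaluate, clock=time.monotonic):
        self.evaluate = evaluate   # hypothetical engine: request -> (allowed, ttl_seconds)
        self.clock = clock
        self.cache = {}            # request -> (allowed, expires_at)

    def check(self, request):
        entry = self.cache.get(request)
        now = self.clock()
        if entry and now < entry[1]:
            return entry[0]                    # decision still valid: no re-evaluation
        allowed, ttl = self.evaluate(request)  # (re)compute and remember the expiry
        self.cache[request] = (allowed, now + ttl)
        return allowed
```

The hard part the paper addresses, computing a *valid* ttl in the presence of interacting policies with different timing information, is exactly what this sketch assumes away.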
Wang, Xiaoyin, Qin, Xue, Bokaei Hosseini, Mitra, Slavin, Rocky, Breaux, Travis D., Niu, Jianwei.  2018.  GUILeak: Tracing Privacy Policy Claims on User Input Data for Android Applications. 2018 IEEE/ACM 40th International Conference on Software Engineering (ICSE). :37–47.
The Android mobile platform supports billions of devices across more than 190 countries around the world. This popularity coupled with user data collection by Android apps has made privacy protection a well-known challenge in the Android ecosystem. In practice, app producers provide privacy policies disclosing what information is collected and processed by the app. However, it is difficult to trace such claims to the corresponding app code to verify whether the implementation is consistent with the policy. Existing approaches for privacy policy alignment focus on information directly accessed through the Android platform (e.g., location and device ID), but are unable to handle user input, a major source of private information. In this paper, we propose a novel approach that automatically detects privacy leaks of user-entered data for a given Android app and determines whether such leakage may violate the app's privacy policy claims. For evaluation, we applied our approach to 120 popular apps from three privacy-relevant app categories: finance, health, and dating. The results show that our approach was able to detect 21 strong violations and 18 weak violations from the studied apps.
Subahi, Alanoud, Theodorakopoulos, George.  2018.  Ensuring Compliance of IoT Devices with Their Privacy Policy Agreement. 2018 IEEE 6th International Conference on Future Internet of Things and Cloud (FiCloud). :100–107.
In the past few years, Internet of Things (IoT) devices have emerged and spread everywhere. Many researchers have been motivated to study the security issues of IoT devices due to the sensitive information they carry about their owners. Privacy is not simply about encryption and access authorization, but also about what kind of information is transmitted, how it is used, and with whom it will be shared. Thus, IoT manufacturers should be compelled to issue Privacy Policy Agreements for their respective devices as well as ensure that the actual behavior of the IoT device complies with the issued privacy policy. In this paper, we implement a test bed for ensuring compliance of Internet of Things data disclosure with the corresponding privacy policy. The fundamental approach used in the test bed is to capture the data traffic between the IoT device and the cloud, between the IoT device and its application on the smartphone, and between the IoT application and the cloud, and analyze those packets for various features. We test 11 IoT manufacturers, and the results reveal that half of those IoT manufacturers do not have an adequate privacy policy specifically for their IoT devices. In addition, we show that the behavior of two IoT devices does not comply with what they stated in their privacy policy agreements.
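The test bed's final comparison step, checking what a device actually transmits against what its policy declares, reduces to a set difference once traffic has been parsed into data types. The field names below are illustrative; real traffic capture and parsing are out of scope for this sketch.

```python
# Toy sketch of the compliance check: any data type observed in captured
# traffic but not declared in the device's privacy policy is a candidate
# non-compliance finding.

def compliance_findings(declared, observed):
    """declared, observed: sets of data-type labels. Returns undeclared disclosures."""
    return sorted(observed - declared)

# Hypothetical example: a device's policy vs. what packet analysis revealed.
declared = {"device_id", "temperature", "firmware_version"}
observed = {"device_id", "temperature", "location", "contact_list"}
violations = compliance_findings(declared, observed)
```

A device with an empty findings list transmits only declared data types; the two non-compliant devices the study reports would each surface at least one such undeclared disclosure.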