Biblio

Filters: Keyword is General Data Protection Regulation
2021-07-27
MacDermott, Áine, Carr, John, Shi, Qi, Baharon, Mohd Rizuan, Lee, Gyu Myoung.  2020.  Privacy Preserving Issues in the Dynamic Internet of Things (IoT). 2020 International Symposium on Networks, Computers and Communications (ISNCC). :1–6.
Convergence of critical infrastructure and data, including government and enterprise, to the dynamic Internet of Things (IoT) environment and future digital ecosystems presents significant challenges for privacy and identity in these interconnected domains. An increasing variety of devices and technologies is being introduced, rendering existing security tools inadequate for the dynamic scale and varying actors. The IoT is increasingly data driven, with user sovereignty essential, and involves actors in varying roles, including user/customer, device, manufacturer, and third-party processor. Flexible frameworks and diverse security requirements for such sensitive environments are therefore needed to secure identities, authenticate IoT devices and their data, and protect privacy and integrity. In this paper we present a review of the principles, techniques and algorithms that can be adapted from other distributed computing paradigms. This review informs the development of a collaborative decision-making framework for heterogeneous entities in a distributed domain, whilst simultaneously highlighting privacy-preserving issues in the IoT. In addition, we present our trust-based privacy-preserving schema using the Dempster-Shafer theory of evidence. While still in its infancy, this application could help maintain a level of privacy and non-repudiation in collaborative environments such as the IoT.
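The Dempster-Shafer fusion step underlying a trust schema like the one above can be sketched as follows. This is a generic implementation of Dempster's rule of combination, not the authors' code; the frame of discernment (Trustworthy vs. Malicious peers) and the mass values are illustrative assumptions.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Mass functions are dicts mapping frozenset focal elements to belief
    mass. Mass assigned to conflicting (empty-intersection) pairs is
    discarded and the remainder renormalised.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    norm = 1.0 - conflict
    return {focal: mass / norm for focal, mass in combined.items()}

# Two observers report on whether a peer is Trustworthy (T) or Malicious (M).
T, M = frozenset("T"), frozenset("M")
either = T | M  # uncertainty mass assigned to the whole frame
m1 = {T: 0.6, either: 0.4}
m2 = {T: 0.7, M: 0.1, either: 0.2}
fused = combine(m1, m2)
```

Fusing the two reports concentrates belief on the Trustworthy hypothesis while keeping a small residual for uncertainty, which is the behaviour a collaborative trust scheme would exploit.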
2021-01-28
Fan, M., Yu, L., Chen, S., Zhou, H., Luo, X., Li, S., Liu, Y., Liu, J., Liu, T.  2020.  An Empirical Evaluation of GDPR Compliance Violations in Android mHealth Apps. 2020 IEEE 31st International Symposium on Software Reliability Engineering (ISSRE). :253–264.

The purpose of the General Data Protection Regulation (GDPR) is to provide improved privacy protection. If an app controls personal data from users, it needs to be compliant with GDPR. However, GDPR lists general rules rather than exact step-by-step guidelines about how to develop an app that fulfills the requirements. Therefore, there may exist GDPR compliance violations in existing apps, which would pose severe privacy threats to app users. In this paper, we take mobile health applications (mHealth apps) as a peephole to examine the status quo of GDPR compliance in Android apps. We first propose an automated system, named HPDROID, to bridge the semantic gap between the general rules of GDPR and the app implementations by identifying the data practices declared in the app privacy policy and the data relevant behaviors in the app code. Then, based on HPDROID, we detect three kinds of GDPR compliance violations, including the incompleteness of privacy policy, the inconsistency of data collections, and the insecurity of data transmission. We perform an empirical evaluation of 796 mHealth apps. The results reveal that 189 (23.7%) of them do not provide complete privacy policies. Moreover, 59 apps collect sensitive data through different measures, but 46 (77.9%) of them contain at least one inconsistent collection behavior. Even worse, among the 59 apps, only 8 apps try to ensure the transmission security of collected data. However, all of them contain at least one encryption or SSL misuse. Our work exposes severe privacy issues to raise awareness of privacy protection for app users and developers.
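The "inconsistency of data collections" check described above amounts to comparing data types declared in the policy text with data-relevant behaviours found in the code. The toy below is our illustration of that idea, not HPDROID itself: the keyword map and labels are made-up stand-ins for the NLP and static-analysis components a real system would use.

```python
# Illustrative keyword-to-label map standing in for real policy NLP.
POLICY_KEYWORDS = {
    "e-mail": "email", "email": "email",
    "location": "location", "gps": "location",
    "contact": "contacts", "device identifier": "device_id",
}

def declared_types(policy_text):
    """Data-type labels mentioned anywhere in the privacy policy."""
    text = policy_text.lower()
    return {label for kw, label in POLICY_KEYWORDS.items() if kw in text}

def undeclared_collections(policy_text, observed_behaviors):
    """Data types observed in the app's behaviour but never declared
    in its policy - the inconsistency class of GDPR violation."""
    return sorted(observed_behaviors - declared_types(policy_text))

policy = "We collect your email address and device identifier to provide the service."
observed = {"email", "device_id", "location"}  # e.g. from static analysis
flagged = undeclared_collections(policy, observed)
```

Here the app's code touches location data that the policy never mentions, so the check flags it.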

2020-04-06
Huang, Wei-Chiao, Yeh, Lo-Yao, Huang, Jiun-Long.  2019.  A Monitorable Peer-to-Peer File Sharing Mechanism. 2019 20th Asia-Pacific Network Operations and Management Symposium (APNOMS). :1–4.
With the rise of blockchain technology, peer-to-peer network systems have once again caught people's attention. Peer-to-peer (P2P) networking is currently being implemented in various kinds of decentralized systems such as the InterPlanetary File System (IPFS). However, P2P file sharing systems are not without their flaws. Data stored on other nodes cannot be deleted by the owner; it can only be deleted by those nodes themselves. Ensuring that personal data can be completely removed is an important issue for complying with the European Union's General Data Protection Regulation (GDPR). To improve P2P privacy and security, we propose a monitorable peer-to-peer file sharing mechanism that synchronizes with other nodes to perform file deletion and generates a File Authentication Code (FAC) for each IPFS node in order to verify that the system synchronizes correctly. The proposed mechanism can integrate with a consortium blockchain to comply with GDPR.
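A File Authentication Code of the kind the abstract mentions could be realised as a digest over a node's stored chunks plus its identity, so peers can detect when a deletion did or did not propagate. The paper does not specify the exact construction; the hash layout below is our own sketch.

```python
import hashlib

def file_authentication_code(chunks, node_id):
    """Sketch of a File Authentication Code: a SHA-256 digest over a
    node's identity and the hashes of the file chunks it holds. If all
    nodes delete a chunk in sync, their FACs change in sync; a stale
    FAC reveals a node that kept data it should have removed."""
    h = hashlib.sha256()
    h.update(node_id.encode())
    for chunk in chunks:
        h.update(hashlib.sha256(chunk).digest())
    return h.hexdigest()

before = file_authentication_code([b"block-1", b"block-2"], node_id="node-A")
after_delete = file_authentication_code([b"block-1"], node_id="node-A")
```

Because the digest is deterministic, any two honest nodes holding the same chunks compute the same FAC, and a deletion changes it.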
2020-04-03
Fawaz, Kassem, Linden, Thomas, Harkous, Hamza.  2019.  Invited Paper: The Applications of Machine Learning in Privacy Notice and Choice. 2019 11th International Conference on Communication Systems & Networks (COMSNETS). :118–124.
For more than two decades since the rise of the World Wide Web, the “Notice and Choice” framework has been the governing practice for the disclosure of online privacy practices. The emergence of new forms of user interaction, such as voice, and the enforcement of new regulations, such as the EU's recent General Data Protection Regulation (GDPR), promise to change this privacy landscape drastically. This paper discusses the challenges of providing privacy stakeholders with privacy awareness and control in this changing landscape. We also present our recent research on utilizing machine learning to analyze privacy policies and settings.
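Machine-learning analysis of privacy policies typically means classifying policy segments into practice categories. The tiny word-overlap classifier below only gestures at that pipeline; the categories, training snippets, and scoring are invented for illustration, whereas real systems in this line of work use neural models trained on large annotated policy corpora.

```python
from collections import Counter

# Toy training data: two practice categories with example policy segments.
TRAINING = {
    "data_collection": ["we collect your email and location data",
                        "information we collect includes device identifiers"],
    "data_sharing": ["we share information with third parties",
                     "data may be shared with our advertising partners"],
}

def _vector(text):
    """Bag-of-words count vector for a text."""
    return Counter(text.lower().split())

# One summed word-count "centroid" per category.
CENTROIDS = {label: sum((_vector(t) for t in texts), Counter())
             for label, texts in TRAINING.items()}

def classify(segment):
    """Label a policy segment by word overlap with each category centroid."""
    v = _vector(segment)
    def score(label):
        return sum(min(v[w], CENTROIDS[label][w]) for w in v)
    return max(CENTROIDS, key=score)
```

Even this crude overlap score routes a sharing clause and a collection clause to different categories, which is the shape of the automation the paper surveys.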
Gerl, Armin, Becher, Stefan.  2019.  Policy-Based De-Identification Test Framework. 2019 IEEE World Congress on Services (SERVICES). 2642-939X:356–357.
Protecting the privacy of individuals is a basic right, which has to be considered in our data-centered society, in which new technologies emerge rapidly. To preserve the privacy of individuals, de-identification technologies have been developed, including pseudonymization, personal privacy anonymization, and privacy models. Each has several variations with different properties and contexts, which poses the challenge of properly selecting and applying de-identification methods. We tackle this challenge by proposing a policy-based de-identification test framework for a systematic approach to experimenting with and evaluating various combinations of methods and their interplay. Evaluation of the experimental results regarding performance and utility is considered within the framework. We propose a domain-specific language expressing the required complex configuration options, including data set, policy generator, and various de-identification methods.
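Pseudonymization, the first de-identification method the abstract lists, can be sketched as keyed hashing of direct identifiers. This is a generic illustration of the technique, not the framework's implementation; the field names and key are made up.

```python
import hashlib
import hmac

def pseudonymize(record, fields, key):
    """Keyed pseudonymisation: replace direct identifiers with truncated
    HMAC-SHA256 tags. The same input maps to the same pseudonym, so
    records stay linkable for analysis, but the mapping cannot be
    reversed without the secret key. This is one of several methods
    such a test framework would configure and compare."""
    out = dict(record)
    for f in fields:
        tag = hmac.new(key, record[f].encode(), hashlib.sha256).hexdigest()
        out[f] = tag[:16]  # truncated pseudonym
    return out

key = b"demo-secret-key"
rec = {"name": "Alice Example", "city": "Passau", "diagnosis": "flu"}
pseudo = pseudonymize(rec, ["name"], key)
```

Linkability across records (same name, same pseudonym) is exactly the property that distinguishes pseudonymization from full anonymization, and it is also why GDPR still treats pseudonymized data as personal data.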
2020-03-09
Sion, Laurens, Van Landuyt, Dimitri, Wuyts, Kim, Joosen, Wouter.  2019.  Privacy Risk Assessment for Data Subject-Aware Threat Modeling. 2019 IEEE Security and Privacy Workshops (SPW). :64–71.
Regulatory efforts such as the General Data Protection Regulation (GDPR) embody a notion of privacy risk that is centered around the fundamental rights of data subjects. This is, however, a fundamentally different notion of privacy risk than the one commonly used in threat modeling, which is largely agnostic of the involved data subjects. This mismatch hampers the applicability of privacy threat modeling approaches such as LINDDUN in a Data Protection by Design (DPbD) context. In this paper, we present a data subject-aware privacy risk assessment model in specific support of privacy threat modeling activities. This model allows the threat modeler to draw upon a more holistic understanding of privacy risk while assessing the relevance of specific privacy threats to the system under design. Additionally, we propose a number of improvements to privacy threat modeling, such as enriching Data Flow Diagram (DFD) system models with appropriate risk inputs (e.g., information on data types and involved data subjects). Incorporation of these risk inputs in DFDs, in combination with a risk estimation approach using Monte Carlo simulations, leads to a more comprehensive assessment of privacy risk. The proposed risk model has been integrated in a threat modeling tool prototype and validated in the context of a realistic eHealth application.
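A Monte Carlo risk estimation of the kind the abstract mentions draws threat frequency and per-event impact from distributions attached to DFD elements and aggregates the sampled losses. The ranges, uniform distributions, and percentile choice below are illustrative stand-ins, not the paper's actual risk inputs.

```python
import random
import statistics

def simulate_risk(n, freq_range, impact_range, seed=0):
    """Monte Carlo sketch of annualised privacy risk: sample threat
    frequency (events/year) and per-event impact from uniform ranges,
    multiply, and summarise the loss distribution by its mean and an
    approximate 95th percentile."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        freq = rng.uniform(*freq_range)      # events per year
        impact = rng.uniform(*impact_range)  # harm per event
        losses.append(freq * impact)
    p95 = statistics.quantiles(losses, n=20)[18]  # 19 cut points; index 18 ~ 95th pct
    return statistics.mean(losses), p95

mean_loss, p95_loss = simulate_risk(
    10_000, freq_range=(0.1, 2.0), impact_range=(1_000, 50_000))
```

Reporting a tail percentile alongside the mean is what makes this "more comprehensive" than a single worst-case or average score: it shows how heavy the loss tail is for a given threat.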
2020-01-21
Vo, Tri Hoang, Fuhrmann, Woldemar, Fischer-Hellmann, Klaus-Peter, Furnell, Steven.  2019.  Efficient Privacy-Preserving User Identity with Purpose-Based Encryption. 2019 International Symposium on Networks, Computers and Communications (ISNCC). :1–8.
In recent years, users may store their Personally Identifiable Information (PII) in the Cloud environment so that Cloud services may access and use it on demand. When users store personal data not on their local machines but in the Cloud, they may be interested in questions such as where their data are and who accesses them besides themselves. Even if Cloud services specify privacy policies, we cannot guarantee that they will follow those policies and will not transfer user data to another party. In the past 10 years, many efforts have been made to protect PII. They target certain issues but still have limitations: for instance, they require users to interact with the services over the frontend, they do not protect identity propagation between intermediaries or against an untrusted host, or they require Cloud services to accept a new protocol. In this paper, we propose a broader approach that covers all of the above issues. We show that our solution is efficient: the implementation can be easily adapted to existing Identity Management systems, and its performance is fast. Most importantly, our approach is compliant with the General Data Protection Regulation of the European Union.
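The core idea behind purpose-based encryption is that each processing purpose gets its own key, so PII encrypted for one purpose is opaque to a service authorised only for another. The sketch below illustrates that key-separation idea with an HMAC-derived per-purpose key; the toy SHA-256 counter-mode keystream is for demonstration only and is not production cryptography, and none of it is the paper's actual scheme.

```python
import hashlib
import hmac

MASTER_KEY = b"demo-master-key"  # in practice held by the identity provider

def purpose_key(purpose):
    """Derive an independent key per processing purpose, so PII encrypted
    for 'billing' is opaque to a service authorised only for 'analytics'."""
    return hmac.new(MASTER_KEY, purpose.encode(), hashlib.sha256).digest()

def xor_stream(data, key):
    """Illustrative keystream cipher (NOT production crypto): SHA-256 in
    counter mode. Applying it twice with the same key round-trips the data."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

pii = b"alice@example.org"
ciphertext = xor_stream(pii, purpose_key("billing"))
```

Decrypting with the key of a different purpose yields garbage, which is the enforcement point: a service never authorised for "billing" simply never receives that key.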
2019-01-21
Kittmann, T., Lambrecht, J., Horn, C..  2018.  A privacy-aware distributed software architecture for automation services in compliance with GDPR. 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA). 1:1067–1070.

The recently applied General Data Protection Regulation (GDPR) aims to protect all EU citizens from privacy and data breaches in an increasingly data-driven world. Consequently, this deeply affects the factory domain and its human-centric automation paradigm. In particular, collaboration between humans and machines, as well as individual support, is enabled and enhanced by processing audio and video data, e.g. by algorithms which re-identify humans or analyse human behaviour. We give an overview of the most significant impacts of this recent legal change on the automation domain. Furthermore, we introduce a representative scenario from production, deduce its legal implications under GDPR, and derive a privacy-aware software architecture. This architecture combines modern virtualization techniques with authorization and end-to-end encryption to ensure secure communication between distributed services and databases for distinct purposes.
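The authorization layer of such a distributed architecture can be sketched as signed capability tokens that state which data-processing scopes a service may use; receiving services verify the token before releasing personal data. The token format, scope names, and shared-secret setup below are our assumptions for illustration, not the architecture described in the paper.

```python
import base64
import hashlib
import hmac
import json

SERVICE_KEY = b"shared-authz-key"  # illustrative shared secret between services

def issue_token(service, scopes):
    """Mint a signed token recording which data-processing scopes a
    service is authorised for (e.g. reading video for a distinct purpose)."""
    payload = json.dumps({"svc": service, "scopes": sorted(scopes)},
                         sort_keys=True).encode()
    sig = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def verify(token, required_scope):
    """Check the signature and that the token grants the required scope."""
    body, sig = token.rsplit(".", 1)
    payload = base64.b64decode(body)
    expected = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    return required_scope in json.loads(payload)["scopes"]

token = issue_token("behaviour-analysis", {"video:read"})
```

Scoping tokens per purpose is what lets one architecture serve "distinct purposes" from shared databases without every service seeing every data stream.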