Biblio

Filters: Keyword is GDPR
2023-07-20
Human, Soheil, Pandit, Harshvardhan J., Morel, Victor, Santos, Cristiana, Degeling, Martin, Rossi, Arianna, Botes, Wilhelmina, Jesus, Vitor, Kamara, Irene.  2022.  Data Protection and Consenting Communication Mechanisms: Current Open Proposals and Challenges. 2022 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW). :231–239.
Data Protection and Consenting Communication Mechanisms (DPCCMs) enable users to express their privacy decisions and manage their online consent. Thus, they can become a crucial means of protecting individuals' online privacy and agency, thereby replacing the current problematic practices such as “consent dialogues”. Based on an in-depth analysis of different DPCCMs, we propose an interdisciplinary set of factors that can be used for a comparison of such mechanisms. Moreover, we use the results from a qualitative expert study to identify some of the main multidisciplinary challenges that DPCCMs should address to become widely adopted data privacy mechanisms. We leverage both the factors and the challenges to compare two current open specifications, i.e. the Advanced Data Protection Control (ADPC) and the Global Privacy Control (GPC), and discuss future work.
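The GPC proposal compared in the paper is, at its core, a single HTTP header. A minimal server-side sketch of honoring it might look like this (the `Sec-GPC` header name comes from the GPC draft specification; the handler and response shape are illustrative):

```python
# Hedged sketch: honoring the Global Privacy Control (GPC) signal server-side.
# The "Sec-GPC" header is defined in the GPC draft spec; everything else here
# (function names, response dict) is illustrative.

def gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a GPC opt-out signal."""
    # Per the GPC proposal, a value of "1" expresses the do-not-sell/share
    # preference; any other value, or absence, means no signal.
    return headers.get("Sec-GPC", "").strip() == "1"

def handle_request(headers: dict) -> dict:
    if gpc_opt_out(headers):
        # Respect the signal: disable third-party data sharing for this request.
        return {"tracking": False, "reason": "GPC opt-out"}
    return {"tracking": True, "reason": "no signal"}
```

A request carrying `Sec-GPC: 1` would thus be handled with tracking disabled, without any consent dialogue being shown.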
2023-01-06
Banciu, Doina, Cîrnu, Carmen Elena.  2022.  AI Ethics and Data Privacy compliance. 2022 14th International Conference on Electronics, Computers and Artificial Intelligence (ECAI). :1–5.
Throughout history, technological evolution has generated less desirable side effects with an impact on society. In the field of IT&C, there are ongoing discussions about the role of robots within the economy, but also about their impact on the labour market. In the case of digital media systems, we talk about misinformation, manipulation, fake news, etc. Issues related to the protection of citizens' lives in the face of technology emerged more than 25 years ago. In addition to the many messages such as “the citizen is at the center of concern” or “privacy must be respected”, transmitted through various channels by entities and companies in the ICT field, the EU has promoted a number of legislative and normative documents to protect citizens' rights and freedoms.
2022-04-13
Godin, Jonathan, Lamontagne, Philippe.  2021.  Deletion-Compliance in the Absence of Privacy. 2021 18th International Conference on Privacy, Security and Trust (PST). :1–10.
Garg, Goldwasser and Vasudevan (Eurocrypt 2020) invented the notion of deletion-compliance to formally model the “right to be forgotten”, a concept that gives individuals more control over their digital data. A requirement of deletion-compliance is strong privacy for the deletion requesters, since no outside observer must be able to tell whether deleted data was ever present in the first place. Naturally, many real-world systems where information can flow across users are automatically ruled out. The main thesis of this paper is that deletion-compliance is a standalone notion, distinct from privacy. We present an alternative definition that meaningfully captures deletion-compliance without any privacy implications. This allows a broader class of data collectors to demonstrate compliance with deletion requests and to be paired with various notions of privacy. Our new definition has several appealing properties:
• It is implied by the stronger definition of Garg et al. under natural conditions, and is equivalent when we add a strong privacy requirement.
• It is naturally composable with minimal assumptions.
• Its requirements are met by data structure implementations that do not reveal the order of operations, a concept known as history-independence.
Along the way, we discuss the many challenges that remain in providing a universal definition of compliance with the “right to be forgotten.”
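The history-independence property the authors point to can be illustrated with a toy sketch: a container whose serialized state depends only on its current contents, so an observer of the stored state cannot tell whether deleted items were ever present. This is an illustration of the concept, not the paper's construction:

```python
# Toy illustration of history-independence: the canonical (sorted) state
# depends only on current contents, never on the order of inserts/deletes.

class HistoryIndependentSet:
    def __init__(self):
        self._items = set()

    def insert(self, x):
        self._items.add(x)

    def delete(self, x):
        self._items.discard(x)

    def canonical_state(self):
        # Identical contents always serialize identically, regardless of
        # operation history - deleted items leave no trace.
        return tuple(sorted(self._items))

a = HistoryIndependentSet()
for x in (3, 1, 2):
    a.insert(x)
a.delete(3)          # 3 was present and deleted

b = HistoryIndependentSet()
b.insert(2)
b.insert(1)          # 3 was never present

assert a.canonical_state() == b.canonical_state()  # histories differ, state doesn't
```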
2022-02-24
Pedroza, Gabriel, Muntés-Mulero, Victor, Martín, Yod Samuel, Mockly, Guillaume.  2021.  A Model-Based Approach to Realize Privacy and Data Protection by Design. 2021 IEEE European Symposium on Security and Privacy Workshops (EuroS PW). :332–339.
Telecommunications and data are pervasive in almost every aspect of our everyday life, and new concerns progressively arise as a result of stakes related to privacy and data protection [1]. Indeed, systems development is becoming data-centric, leading to an ecosystem where a variety of players intervene (citizens, industry, regulators) and where policies regarding data usage are far from consensual. The General Data Protection Regulation (GDPR), applicable across the EU since 2018, has introduced new provisions including principles of lawfulness, fairness, and transparency, thus granting data subjects new rights with regard to their personal data. In this context, a growing need for approaches that conceptualize GDPR and privacy provisions, and help engineers integrate them at design time, becomes paramount. This paper presents a comprehensive approach to support different phases of the design process, with special attention to the integration of privacy and data protection principles. Among other features, it is a generic model-based approach that can be specialized according to the specifics of different application domains.
2022-02-03
García, Kimberly, Zihlmann, Zaira, Mayer, Simon, Tamò-Larrieux, Aurelia, Hooss, Johannes.  2021.  Towards Privacy-Friendly Smart Products. 2021 18th International Conference on Privacy, Security and Trust (PST). :1–7.
Smart products, such as toy robots, must comply with multiple legal requirements of the countries they are sold and used in. Currently, compliance with the legal environment requires manually customizing products for different markets. In this paper, we explore a design approach for smart products that enforces compliance with aspects of the European Union’s data protection principles within a product’s firmware through a toy robot case study. To this end, we present an exchange between computer scientists and legal scholars that identified the relevant data flows, their processing needs, and the implementation decisions that could allow a device to operate while complying with the EU data protection law. By designing a data-minimizing toy robot, we show that the variety, amount, and quality of data that is exposed, processed, and stored outside a user’s premises can be considerably reduced while preserving the device’s functionality. In comparison with a robot designed using a traditional approach, in which 90% of the collected types of information are stored by the data controller or a remote service, our proposed design leads to the mandatory exposure of only 7 out of 15 collected types of information, all of which are legally required by the data controller to demonstrate consent. Moreover, our design is aligned with the Data Privacy Vocabulary, which enables the toy robot to cross geographic borders and seamlessly adjust its data processing activities to the local regulations.
2021-10-12
Faurie, Pascal, Moldovan, Arghir-Nicolae, Tal, Irina.  2020.  Privacy Policy – “I Agree”⁈ – Do Alternatives to Text-Based Policies Increase the Awareness of the Users? 2020 International Conference on Cyber Security and Protection of Digital Services (Cyber Security). :1–6.
Since the GDPR was introduced, there has been a reinforcement of the requirement that users must give their consent before their personal data can be managed by any website. However, many studies have demonstrated that users often skip these policies and click the "I agree" button to continue browsing, unaware of what the consent they gave was about, hence defeating the purpose of the GDPR. This paper investigates whether different ways of presenting the privacy policy to users can change this behaviour and lead to increased user awareness of what they agree with. Three different types of policies were used in the study: a full-text policy, a so-called usable policy, and a video-based policy. Results demonstrated that the type of policy has a direct influence on user awareness and user satisfaction. The two alternatives to the text-based policy led to a significant increase in user awareness of the policy's content and in user satisfaction with the policy's usability.
2021-08-11
Joseph, Asha, John Singh, K.  2020.  A GDPR Compliant Proposal to Provide Security in Android and iOS Devices. 2020 International Conference on Emerging Trends in Information Technology and Engineering (ic-ETITE). :1–8.
The security measures available on personal computers and laptops are not directly achievable in mobile communication, since there is no equivalent controlling software layer comparable to a managed desktop operating system. The European Union General Data Protection Regulation (GDPR) requires many organisations throughout the European Union to comply with new requirements intended to protect their users' personal data. The responsibilities placed on organisations, and the penalties related to protecting users' personal data, have proved to be both organisationally and technically challenging. Under the GDPR's 'privacy by design' and 'privacy by default' requirements, organisations need to prove that they are in control of user data and have taken steps to protect it. A large number of organisations use mobile devices to process the personal data of their customers. GDPR mandates that an organisation be able to manage all devices that handle sensitive data, so that it can implement group updates, restrict apps and networks, and enforce security measures. In this work, we propose a Mobile Device Management solution using the built-in frameworks of the Android and iOS mobile platforms that incorporates the GDPR articles relevant to a small to medium-sized organisation.
2021-03-29
Moreno, R. T., Rodríguez, J. G., López, C. T., Bernabe, J. B., Skarmeta, A..  2020.  OLYMPUS: A distributed privacy-preserving identity management system. 2020 Global Internet of Things Summit (GIoTS). :1–6.

Despite the latest initiatives and research efforts to increase user privacy in digital scenarios, identity-related cybercrimes such as identity theft, identity fraud, and surveillance of user transactions are growing. In particular, the blanket surveillance that might potentially be carried out by Identity Providers (IdPs) contradicts the data minimization principle laid out in the GDPR. Hence, user movements across Service Providers (SPs) might be tracked by malicious IdPs, which become a central dominant entity as well as a single point of failure in terms of privacy and security, putting users at risk when compromised. To cope with this issue, the OLYMPUS H2020 EU project is devising a truly privacy-preserving, yet user-friendly, distributed identity management system that addresses the data minimization challenge in both online and offline scenarios. OLYMPUS divides the role of the IdP among various authorities by relying on threshold cryptography, thereby preventing user impersonation and surveillance by malicious or nosy IdPs. This paper gives an overview of the OLYMPUS framework, including the requirements considered, the proposed architecture, a series of use cases, and a privacy analysis from the legal point of view.
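The threshold-cryptography idea underpinning the split IdP role can be sketched with toy Shamir secret sharing over a prime field: no single authority holds the secret, but any t of n together can reconstruct it. This is a textbook illustration under simplified assumptions, not the OLYMPUS protocol itself:

```python
# Toy Shamir secret sharing: split a secret among n authorities so that
# any t of them can reconstruct it, but fewer learn nothing useful.
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus

def split(secret: int, t: int, n: int):
    """Split `secret` into n shares with reconstruction threshold t."""
    # Random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # Modular inverse of den via Fermat's little theorem (P is prime).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, t=3, n=5)  # e.g. five partial-IdP authorities
```

Any three of the five shares recover the value; a compromised single authority holds only one point on the polynomial.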

2021-03-16
Fiebig, T..  2020.  How to stop crashing more than twice: A Clean-Slate Governance Approach to IT Security. 2020 IEEE European Symposium on Security and Privacy Workshops (EuroS PW). :67–74.

"Moving fast, and breaking things", instead of "being safe and secure", is the credo of the IT industry. However, if we look at the wide societal impact of IT security incidents in the past years, it seems like it is no longer sustainable. Just like in the case of Equifax, people simply forget updates, just like in the case of Maersk, companies do not use sufficient network segmentation. Security certification does not seem to help with this issue. After all, Equifax was IS027001 compliant.In this paper, we take a look at how we handle and (do not) learn from security incidents in IT security. We do this by comparing IT security incidents to early and later aviation safety. We find interesting parallels to early aviation safety, and outline the governance levers that could make the world of IT more secure, which were already successful in making flying the most secure way of transportation.

2021-01-28
Santos, W., Sousa, G., Prata, P., Ferrão, M. E..  2020.  Data Anonymization: K-anonymity Sensitivity Analysis. 2020 15th Iberian Conference on Information Systems and Technologies (CISTI). :1–6.

These days the digitization process is everywhere, spreading also across central governments and local authorities. It is hoped that, by using open government data for scientific research purposes, the public good and social justice might be enhanced. Taking into account the recently adopted European General Data Protection Regulation, the big challenge in Portugal, as in other European countries, is how to strike the right balance between personal data privacy and data value for research. This work presents a sensitivity study of a data anonymization procedure applied to real open government data from the Brazilian higher education evaluation system. The ARX k-anonymization algorithm was applied, with and without generalization of some variables of research value. The analysis of the amount of information lost and the risk of re-identification suggests that the anonymization process may lead to the under-representation of minorities and sociodemographically disadvantaged groups. Such analysis will enable scientists to improve the balance among risk, data usability, and contributions to public-good policies and practices.
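The k-anonymity measure underlying the paper's sensitivity analysis can be sketched directly: k is the size of the smallest group of records that share the same quasi-identifier values. The column names below are illustrative, not taken from the Brazilian dataset:

```python
# Minimal k-anonymity check: a table is k-anonymous on a set of
# quasi-identifiers if every combination of their values occurs in
# at least k records.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k of the table: the size of the smallest QI group."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical generalized records (ages bucketed, zip codes masked):
data = [
    {"age": "20-30", "zip": "123**", "grade": "A"},
    {"age": "20-30", "zip": "123**", "grade": "B"},
    {"age": "30-40", "zip": "456**", "grade": "A"},
    {"age": "30-40", "zip": "456**", "grade": "C"},
]
assert k_anonymity(data, ["age", "zip"]) == 2  # each QI group has 2 records
```

Adding a further attribute to the quasi-identifier set shrinks the groups, which is exactly the risk/utility trade-off the sensitivity analysis explores.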

Fan, M., Yu, L., Chen, S., Zhou, H., Luo, X., Li, S., Liu, Y., Liu, J., Liu, T..  2020.  An Empirical Evaluation of GDPR Compliance Violations in Android mHealth Apps. 2020 IEEE 31st International Symposium on Software Reliability Engineering (ISSRE). :253–264.

The purpose of the General Data Protection Regulation (GDPR) is to provide improved privacy protection. If an app controls personal data from users, it needs to be compliant with GDPR. However, GDPR lists general rules rather than exact step-by-step guidelines about how to develop an app that fulfills the requirements. Therefore, there may exist GDPR compliance violations in existing apps, which would pose severe privacy threats to app users. In this paper, we take mobile health applications (mHealth apps) as a peephole to examine the status quo of GDPR compliance in Android apps. We first propose an automated system, named HPDROID, to bridge the semantic gap between the general rules of GDPR and the app implementations by identifying the data practices declared in the app privacy policy and the data relevant behaviors in the app code. Then, based on HPDROID, we detect three kinds of GDPR compliance violations, including the incompleteness of privacy policy, the inconsistency of data collections, and the insecurity of data transmission. We perform an empirical evaluation of 796 mHealth apps. The results reveal that 189 (23.7%) of them do not provide complete privacy policies. Moreover, 59 apps collect sensitive data through different measures, but 46 (77.9%) of them contain at least one inconsistent collection behavior. Even worse, among the 59 apps, only 8 apps try to ensure the transmission security of collected data. However, all of them contain at least one encryption or SSL misuse. Our work exposes severe privacy issues to raise awareness of privacy protection for app users and developers.
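The "inconsistency of data collections" violation boils down to comparing what the policy declares against what the app code does. A minimal sketch of that comparison, with made-up data-type names (HPDROID's actual analysis of policy text and app bytecode is far more involved):

```python
# Hedged sketch: flag data types the app is observed collecting that its
# privacy policy never declares. Category names are illustrative.

def find_undeclared(declared: set, observed: set) -> set:
    """Data types collected by the app but absent from the policy."""
    return observed - declared

# Hypothetical outputs of policy analysis and behavior analysis:
policy_declares = {"email", "device_id"}
app_collects = {"email", "device_id", "location"}

violations = find_undeclared(policy_declares, app_collects)
assert violations == {"location"}  # an inconsistent collection behavior
```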

2021-01-11
Kuperberg, M..  2020.  Towards Enabling Deletion in Append-Only Blockchains to Support Data Growth Management and GDPR Compliance. 2020 IEEE International Conference on Blockchain (Blockchain). :393–400.
Conventional blockchain implementations with append-only semantics do not support deleting or overwriting data in confirmed blocks. However, many industry-relevant use cases require the ability to delete data, especially when personally identifiable information is stored or when data growth has to be constrained. Existing attempts to reconcile these contradictions compromise on core qualities of the blockchain paradigm, as they include backdoor-like approaches such as central authorities with elevated rights or usage of specialized chameleon hash algorithms in chaining of the blocks. The contribution of this paper is a novel architecture for the blockchain ledger and consensus, which uses a tree of context chains with simultaneous validity. A context chain captures the transactions of a closed group of entities and persons, thus structuring blocks in a precisely defined way. The resulting context isolation enables consensus-steered deletion of an entire context without side effects to other contexts. We show how this architecture supports truncation, data rollover and separation of concerns, how the GDPR regulations can be fulfilled by this architecture and how it differs from sidechains and state channels.
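The context-isolation idea can be sketched as a ledger of independent hash chains, one per context, so that deleting an entire context leaves the remaining chains verifiable. This is a toy illustration of the structural point, not the paper's full consensus design:

```python
# Toy sketch: per-context hash chains. Deleting one context chain does not
# invalidate the hashes of any other context. Context IDs are illustrative.
import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class ContextChain:
    def __init__(self, context_id: str):
        self.context_id = context_id
        self.blocks = []  # list of (payload, prev_hash, block_hash)

    def append(self, payload: str):
        prev = self.blocks[-1][2] if self.blocks else h(self.context_id.encode())
        self.blocks.append((payload, prev, h((prev + payload).encode())))

    def verify(self) -> bool:
        prev = h(self.context_id.encode())
        for payload, stored_prev, block_hash in self.blocks:
            if stored_prev != prev or h((prev + payload).encode()) != block_hash:
                return False
            prev = block_hash
        return True

ledger = {cid: ContextChain(cid) for cid in ("alice-bank", "bob-insurer")}
ledger["alice-bank"].append("tx1")
ledger["bob-insurer"].append("tx2")

del ledger["alice-bank"]               # consensus-steered deletion of a context
assert ledger["bob-insurer"].verify()  # other contexts remain intact and valid
```

A single global chain could not do this: removing a confirmed block would break every subsequent hash link.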
2020-11-16
Hagan, M., Siddiqui, F., Sezer, S..  2019.  Enhancing Security and Privacy of Next-Generation Edge Computing Technologies. 2019 17th International Conference on Privacy, Security and Trust (PST). :1–5.
The advent of high-performance fog and edge computing and high-bandwidth connectivity has brought about changes to Internet-of-Things (IoT) service architectures, allowing greater quantities of high-quality information to be extracted from their environments for processing. However, recently introduced international regulations, along with heightened awareness among consumers, have strengthened requirements to ensure data security, with significant financial and reputational penalties for organisations who fail to protect customers' data. This paper proposes leveraging fog and edge computing to facilitate the processing of confidential user data, reducing the quantity and availability of raw confidential data at various levels of the IoT architecture. This ultimately reduces the attack surface; it also increases the efficiency of the architecture by distributing processing amongst nodes and transmitting only processed data. However, such an approach is vulnerable to device-level attacks. To address this issue, a proposed System Security Manager continuously monitors system resources and ensures confidential data is confined only to the parts of the device that require it. In the event of an attack, critical data can be isolated and the system informed, to prevent a breach of data confidentiality.
2020-03-09
Sion, Laurens, Van Landuyt, Dimitri, Wuyts, Kim, Joosen, Wouter.  2019.  Privacy Risk Assessment for Data Subject-Aware Threat Modeling. 2019 IEEE Security and Privacy Workshops (SPW). :64–71.
Regulatory efforts such as the General Data Protection Regulation (GDPR) embody a notion of privacy risk that is centered around the fundamental rights of data subjects. This is, however, a fundamentally different notion of privacy risk than the one commonly used in threat modeling, which is largely agnostic of the involved data subjects. This mismatch hampers the applicability of privacy threat modeling approaches such as LINDDUN in a Data Protection by Design (DPbD) context. In this paper, we present a data subject-aware privacy risk assessment model in specific support of privacy threat modeling activities. This model allows the threat modeler to draw upon a more holistic understanding of privacy risk while assessing the relevance of specific privacy threats to the system under design. Additionally, we propose a number of improvements to privacy threat modeling, such as enriching Data Flow Diagram (DFD) system models with appropriate risk inputs (e.g., information on data types and involved data subjects). Incorporation of these risk inputs in DFDs, in combination with a risk estimation approach using Monte Carlo simulations, leads to a more comprehensive assessment of privacy risk. The proposed risk model has been integrated in a threat modeling tool prototype and validated in the context of a realistic eHealth application.
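The Monte Carlo risk-estimation step can be sketched as repeated sampling of uncertain likelihood and impact ranges, yielding a loss distribution rather than a single point estimate. The distributions and parameters below are illustrative, not taken from the paper:

```python
# Hedged sketch of Monte Carlo risk estimation: sample uncertain breach
# likelihood and impact many times and average the simulated losses.
# Uniform ranges here are placeholders for real elicited distributions.
import random

def simulate_risk(n: int = 100_000, seed: int = 42) -> float:
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        likelihood = rng.uniform(0.01, 0.10)   # annual chance of a privacy breach
        impact = rng.uniform(10_000, 50_000)   # harm to data subjects if it occurs
        total += likelihood * impact
    return total / n  # expected annual loss

expected = simulate_risk()
# Analytically: 0.055 (mean likelihood) x 30,000 (mean impact) = 1,650
```

Feeding per-threat inputs like these (data types, affected data subjects) from the enriched DFD is what makes the assessment subject-aware.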
2020-02-24
van Aubel, Pol, Poll, Erik, Rijneveld, Joost.  2019.  Non-Repudiation and End-to-End Security for Electric-Vehicle Charging. 2019 IEEE PES Innovative Smart Grid Technologies Europe (ISGT-Europe). :1–5.
In this paper we propose a cryptographic solution that provides non-repudiation and end-to-end security for the electric-vehicle-charging ecosystem as it exists in the Netherlands. It is designed to provide long-term non-repudiation, while allowing for data deletion in order to comply with the GDPR. To achieve this, we use signatures on hashes of individual data fields instead of on the combination of fields directly, and we use Merkle authentication trees to reduce the overhead involved.
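The construction can be sketched as follows: sign the Merkle root computed over hashes of the individual fields; a field's content can later be erased while only its hash is retained, and the signed root still verifies for the remaining fields. Field names are illustrative and the actual signing step is omitted:

```python
# Sketch of signing a Merkle root over per-field hashes, so individual
# fields can be deleted (GDPR) while non-repudiation is preserved for
# the rest. Field contents are made up; signing is stubbed out.
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    """Compute a Merkle root over a list of leaf hashes."""
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical charging-session fields:
fields = [b"session=123", b"user=alice", b"kwh=13.7", b"ts=2019-01-01"]
leaf_hashes = [h(f) for f in fields]
signed_root = merkle_root(leaf_hashes)  # in the real scheme, this root is signed

# GDPR deletion: erase the personal field's content, keep only its hash.
erased = 1  # index of b"user=alice"
kept = [leaf_hashes[i] if i == erased else h(fields[i]) for i in range(len(fields))]
assert merkle_root(kept) == signed_root  # signature over the root still verifies
```

Because only a one-way hash of the erased field remains, the data is gone, yet the remaining fields can still be proven authentic against the original signature.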
2019-11-19
Dijkhuis, Sander, van Wijk, Remco, Dorhout, Hidde, Bharosa, Nitesh.  2018.  When Willeke Can Get Rid of Paperwork: A Lean Infrastructure for Qualified Information Exchange Based on Trusted Identities. Proceedings of the 19th Annual International Conference on Digital Government Research: Governance in the Data Age. :89:1-89:10.

As a frequent participant in eSociety, Willeke is often preoccupied with paperwork because there is no easy-to-use, affordable way to act as a qualified person in the digital world. Confidential interactions take place over insecure channels like e-mail and post. This situation poses risks and costs for service providers, citizens and governments, while goals regarding confidentiality and privacy are not always met. The objective of this paper is to demonstrate an alternative architecture in which identifying persons, exchanging information, authorizing external parties and signing documents will become more user-friendly and secure. As a starting point, each person has their own personal data space, provided by a qualified trust service provider that also issues a high-level-of-assurance electronic ID. Three main building blocks are required: (1) secure exchange between the personal data spaces of persons, (2) coordination functionalities provided by a token-based infrastructure, and (3) governance over this infrastructure. Following the design science research approach, we developed prototypes of the building blocks that we will pilot in practice. Policy makers and practitioners who want to enable Willeke to get rid of her paperwork can find guidance throughout this paper and are welcome to join the pilots in the Netherlands.

2019-11-11
Subahi, Alanoud, Theodorakopoulos, George.  2018.  Ensuring Compliance of IoT Devices with Their Privacy Policy Agreement. 2018 IEEE 6th International Conference on Future Internet of Things and Cloud (FiCloud). :100–107.
In the past few years, Internet of Things (IoT) devices have emerged and spread everywhere. Many researchers have been motivated to study the security issues of IoT devices due to the sensitive information they carry about their owners. Privacy is not simply about encryption and access authorization, but also about what kind of information is transmitted, how it is used, and with whom it is shared. Thus, IoT manufacturers should be compelled to issue Privacy Policy Agreements for their respective devices, as well as ensure that the actual behavior of the IoT device complies with the issued privacy policy. In this paper, we implement a test bed for ensuring the compliance of Internet of Things data disclosure with the corresponding privacy policy. The fundamental approach used in the test bed is to capture the data traffic between the IoT device and the cloud, between the IoT device and its application on the smartphone, and between the IoT application and the cloud, and to analyze those packets for various features. We test 11 IoT manufacturers, and the results reveal that half of them do not have an adequate privacy policy specifically for their IoT devices. In addition, we show that the behavior of two IoT devices does not comply with what is stated in their privacy policy agreements.
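The test bed's core check can be sketched as scanning captured payloads for data types that the privacy policy never declares. The patterns and the declared set below are made up for illustration; real traffic analysis would work over parsed packet captures:

```python
# Hedged sketch: flag personal-data types found in captured IoT traffic
# that the device's privacy policy does not disclose. Patterns and the
# declared set are illustrative placeholders.
import re

POLICY_DECLARED = {"email"}  # hypothetical: policy only mentions email

PATTERNS = {
    "email": re.compile(r"[\w.]+@[\w.]+"),
    "mac_address": re.compile(r"(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}"),
}

def undisclosed_types(payloads):
    """Data types observed in traffic but absent from the policy."""
    found = {name for p in payloads
             for name, rx in PATTERNS.items() if rx.search(p)}
    return found - POLICY_DECLARED

traffic = ["user=alice@example.com", "dev=AA:BB:CC:DD:EE:FF"]
assert undisclosed_types(traffic) == {"mac_address"}  # a compliance gap
```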
2018-02-15
Brkan, Maja.  2017.  AI-supported Decision-making Under the General Data Protection Regulation. Proceedings of the 16th Edition of the International Conference on Artificial Intelligence and Law. :3–8.
The purpose of this paper is to analyse the rules of the General Data Protection Regulation on automated decision making in the age of Big Data, and to explore how to ensure the transparency of such decisions, in particular those taken with the help of algorithms. The GDPR, in its Article 22, prohibits automated individual decision-making, including profiling. At first impression, it seems that this provision strongly protects individuals and potentially even hampers the future development of AI in decision making. However, it can be argued that this prohibition, containing numerous limitations and exceptions, looks like Swiss cheese, with giant holes in it. Moreover, in the case of automated decisions involving personal data of the data subject, the GDPR obliges the controller to provide the data subject with 'meaningful information about the logic involved' (Articles 13 and 14). If we link this information to the rights of the data subject, we can see that the information about the logic involved needs to enable him/her to express his/her point of view and to contest the automated decision. While this requirement fits well within the broader framework of the GDPR's quest for a high level of transparency, it also raises several queries, particularly in cases where the decision is taken with the help of algorithms: What exactly needs to be revealed to the data subject? How can an algorithm-based decision be explained? Apart from technical obstacles, we also face intellectual property and state secrecy obstacles to this 'algorithmic transparency'.