Biblio

Filters: Keyword is privacy-sensitive data
2020-12-07
Labib, N. S., Brust, M. R., Danoy, G., Bouvry, P.  2019.  Trustworthiness in IoT – A Standards Gap Analysis on Security, Data Protection and Privacy. 2019 IEEE Conference on Standards for Communications and Networking (CSCN). :1–7.
With the emergence of new digital trends such as the Internet of Things (IoT), more industry actors and technical committees pursue research in utilising such technologies, as they promise better and optimised management, improved energy efficiency and a better quality of living through a wide array of value-added services. However, as sensing, actuation, communication and control become increasingly sophisticated, such data-driven systems generate, process, and exchange growing amounts of security-critical and privacy-sensitive data, which makes them attractive targets for attacks. In turn, this affirms the importance of trustworthiness in IoT and emphasises the need for a solid technical and regulatory foundation. The goal of this paper is to first introduce the concept of trustworthiness in IoT and its main pillars, namely security, privacy and data protection, and then to analyse the state of the art in research and standardisation for each of these subareas. Throughout the paper, we develop and refer to Unmanned Aerial Vehicles (UAVs) as a promising example of value-added services on mobile IoT devices. The paper then presents a thorough gap analysis and concludes with recommendations for future work.
2019-02-13
Won, J., Bertino, E.  2018.  Securing Mobile Data Collectors by Integrating Software Attestation and Encrypted Data Repositories. 2018 IEEE 4th International Conference on Collaboration and Internet Computing (CIC). :26–35.
Drones are increasingly being used as mobile data collectors for various monitoring services. However, since they may move through unattended hostile areas carrying valuable data, they can be targets of malicious physical and cyber attacks. These attacks may aim at stealing privacy-sensitive data, including secret keys, and at eavesdropping on communications between the drones and the ground station. To detect tampered drones, a code attestation technique is required. However, since attestation by itself does not guarantee that the data in a drone's memory are not leaked, the data collected by the drones must be protected and the secret keys used for secure communications must not be leaked. In this paper, we present a solution integrating techniques for software-based attestation, data encryption and secret-key protection. We propose an attestation technique that fills up free memory space with data repositories consisting of pseudo-random numbers, which are also used to encrypt collected data. We also propose a group attestation scheme to efficiently verify the software integrity of multiple drones. Finally, to prevent secret keys from being leaked, we utilize a technique that converts short secret keys into large look-up tables; filling the free space in data memory with these tables prevents attackers from abusing that space. To evaluate the integrated solution, we implemented it on an AR.Drone and a Raspberry Pi.
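To make the memory-filling idea concrete, here is a minimal sketch, assuming a SHA-256-based pseudo-random generator: free memory is filled with repository blocks derived from a seed shared with the ground station, the blocks double as one-time pads for collected samples, and an attestation checksum over the repository lets the verifier detect overwritten blocks. All function names, the PRNG, and the block sizes are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of memory-filling attestation (illustrative; the seed
# handling, PRNG, and function names are assumptions, not the paper's).
import hashlib
import os

BLOCK = 32  # bytes per repository block (SHA-256 output size)

def prng_block(seed: bytes, index: int) -> bytes:
    """Derive the index-th pseudo-random repository block from a seed."""
    return hashlib.sha256(seed + index.to_bytes(8, "big")).digest()

def fill_free_memory(seed: bytes, n_blocks: int) -> list:
    """Fill otherwise-free data memory with a verifiable repository, so an
    attacker has no spare space in which to hide malicious code or data."""
    return [prng_block(seed, i) for i in range(n_blocks)]

def encrypt_sample(repo: list, index: int, sample: bytes) -> bytes:
    """One-time-pad a collected sample (<= BLOCK bytes) with a repository
    block, reusing the repository as encryption material."""
    return bytes(a ^ b for a, b in zip(sample, repo[index]))

def attest(repo: list, nonce: bytes) -> bytes:
    """Checksum the repository under a verifier-chosen nonce; a drone that
    overwrote repository blocks cannot reproduce the correct value."""
    h = hashlib.sha256(nonce)
    for block in repo:
        h.update(block)
    return h.digest()

# Verifier side: regenerate the repository from the shared seed and
# compare attestation responses.
seed = os.urandom(16)                      # shared with the ground station
drone_repo = fill_free_memory(seed, 1024)
nonce = os.urandom(16)                     # fresh challenge per attestation
assert attest(drone_repo, nonce) == attest(fill_free_memory(seed, 1024), nonce)
```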
2018-09-28
Tsou, Y., Chen, H., Chen, J., Huang, Y., Wang, P.  2017.  Differential privacy-based data de-identification protection and risk evaluation system. 2017 International Conference on Information and Communication Technology Convergence (ICTC). :416–421.

As more and more technologies to store and analyze massive amounts of data become available, it is extremely important to de-identify privacy-sensitive data so that further analysis can be conducted by different parties. For example, data needs to go through a de-identification process before being transferred to institutes for further value-added analysis. As such, privacy protection issues associated with the release of data and with data mining have become a popular field of study in the domain of big data. As a strict and verifiable definition of privacy, differential privacy has attracted noteworthy attention and widespread research in recent years. Nevertheless, differential privacy is not practical for most applications due to the performance cost of generating synthetic datasets for data queries. Moreover, the definition of data protection through randomized noise in native differential privacy is abstract to users. Therefore, we design a pragmatic DP-based data de-identification protection and data-disclosure risk estimation system, in which a DP-based noise addition mechanism is applied to generate synthetic datasets. Furthermore, the risk of data disclosure for these synthetic datasets can be evaluated before release to buyers/consumers.
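As a concrete illustration of the noise-addition step such a system could build on, here is a minimal sketch of the standard Laplace mechanism applied to histogram counts, with a synthetic dataset resampled from the noisy histogram. The binning, the epsilon value, and the uniform resampling strategy are illustrative assumptions, not the authors' mechanism.

```python
# Minimal sketch of a Laplace-mechanism noise-addition step (illustrative;
# the binning, epsilon, and resampling strategy are assumptions).
import numpy as np

def laplace_histogram(data, bins, epsilon):
    """Release a histogram under epsilon-differential privacy. Adding or
    removing one record changes each bin count by at most 1 (L1
    sensitivity 1), so Laplace noise with scale 1/epsilon suffices."""
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + np.random.laplace(scale=1.0 / epsilon, size=counts.shape)
    return np.clip(np.round(noisy), 0, None).astype(int), edges

def synthetic_dataset(data, bins, epsilon, rng=None):
    """Resample a synthetic dataset from the noisy histogram; downstream
    queries touch only the synthetic data, never the raw records."""
    rng = rng or np.random.default_rng()
    noisy_counts, edges = laplace_histogram(data, bins, epsilon)
    parts = [rng.uniform(lo, hi, size=n)
             for n, lo, hi in zip(noisy_counts, edges[:-1], edges[1:])]
    return np.concatenate(parts)

ages = np.random.normal(40, 12, size=10_000)  # toy privacy-sensitive column
synth = synthetic_dataset(ages, bins=20, epsilon=1.0)
```

Resampling uniformly within each bin discards within-bin structure; coarser bins reduce the relative noise per count but lose more detail, which is one face of the utility/privacy trade-off the abstract alludes to.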

2018-01-23
Backes, M., Berrang, P., Bieg, M., Eils, R., Herrmann, C., Humbert, M., Lehmann, I.  2017.  Identifying Personal DNA Methylation Profiles by Genotype Inference. 2017 IEEE Symposium on Security and Privacy (SP). :957–976.

Since the first whole-genome sequencing, the biomedical research community has made significant steps towards a more precise, predictive and personalized medicine. Genomic data is nowadays widely considered privacy-sensitive and is consequently protected by strict regulations and released only after careful consideration. Various additional types of biomedical data, however, are not shielded by any dedicated legal means and are consequently disseminated much less thoughtfully. This holds true in particular for DNA methylation data, one of the most important and well-understood epigenetic elements influencing human health. In this paper, we show that, contrary to the aforementioned belief, releasing one's DNA methylation data causes privacy issues akin to releasing one's actual genome. We show that a small subset of methylation regions influenced by genomic variants is already sufficient to infer parts of someone's genome, and to further map this DNA methylation profile to the corresponding genome. Notably, we show that such re-identification is possible with 97.5% accuracy, relying on a dataset of more than 2500 genomes, and that we can reject all wrongly matched genomes using an appropriate statistical test. We provide means for countering this threat by proposing a novel cryptographic scheme for privately classifying tumors that enables a privacy-respecting medical diagnosis in a common clinical setting. The scheme relies on a combination of random forests and homomorphic encryption, and it is proven secure in the honest-but-curious model. We evaluate this scheme on real DNA methylation data and show that we can keep the computational overhead at acceptable values for our application scenario.
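Under strong simplifying assumptions, the inference-and-matching step can be sketched as follows: methylation levels at genotype-influenced (meQTL) regions are thresholded into genotypes, a candidate genome is scored by its agreement with the inferred genotypes, and implausibly high agreement is flagged by a normal-approximation test, mirroring the idea of rejecting wrongly matched genomes statistically. The cutoffs, the toy signal model, and the test statistic are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of genotype inference and matching (illustrative; the
# thresholds, meQTL signal model, and test statistic are assumptions).
import numpy as np

def infer_genotypes(methylation):
    """Threshold methylation levels at genotype-influenced (meQTL) regions
    into genotypes 0/1/2 (hypothetical cutoffs)."""
    return np.digitize(methylation, bins=[0.33, 0.66])

def is_match(inferred, genome, p0=1/3, z_crit=4.75):
    """One-sided test: reject 'agreement by chance' (rate p0) when observed
    agreement is implausibly high; z_crit ~ 4.75 corresponds to a one-sided
    significance level near 1e-6 under the normal approximation."""
    n = len(inferred)
    k = int(np.sum(np.asarray(inferred) == np.asarray(genome)))
    z = (k - n * p0) / np.sqrt(n * p0 * (1 - p0))
    return z > z_crit

rng = np.random.default_rng(0)
genome = rng.integers(0, 3, size=500)                  # candidate genome
meth = np.clip(genome / 2 + rng.normal(0, 0.1, size=500), 0, 1)  # toy signal
assert is_match(infer_genotypes(meth), genome)         # true genome matches
assert not is_match(infer_genotypes(meth), rng.integers(0, 3, size=500))
```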