Biblio
Filters: Keyword is usable privacy
Usable Differential Privacy for the Internet-of-Things. 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops). :426–427.
2021. Current implementations of Differential Privacy (DP) focus primarily on the privacy of the data release. The planned thesis will investigate steps towards a user-centric approach to DP for the Internet-of-Things (IoT), one that focuses on data subjects, IoT developers, and data analysts. We will conduct user studies to learn more about the often conflicting interests of the involved parties and the challenges they encounter. Furthermore, a technical solution will be developed to assist data subjects and analysts in making better-informed decisions. As a result, we expect our contributions to be a step towards the development of usable DP for IoT sensor data.
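As background for the abstract above (and not as the thesis's own design), below is a minimal sketch of the standard Laplace mechanism that DP releases of numeric IoT sensor aggregates typically build on; the readings, sensitivity bound, and epsilon value are illustrative assumptions.

import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    # Standard Laplace mechanism: add noise with scale sensitivity/epsilon
    # to obtain an epsilon-differentially-private numeric answer.
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Illustrative only: average temperature over a handful of hypothetical IoT sensors.
readings = [21.3, 22.1, 20.8, 21.9]        # assumed readings in degrees Celsius
sensitivity = 5.0 / len(readings)          # assumes each reading is clamped to a 5-degree range
epsilon = 0.5                              # smaller epsilon: stronger privacy, noisier result
print(laplace_mechanism(sum(readings) / len(readings), sensitivity, epsilon))

Choosing epsilon and interpreting the resulting noise is exactly the kind of decision the planned tooling aims to make easier for data subjects and analysts.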
PrivacyCheck's Machine Learning to Digest Privacy Policies: Competitor Analysis and Usage Patterns. 2020 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT). :291–298.
2020. Online privacy policies are lengthy and hard to comprehend. To address this problem, researchers have used machine learning (ML) to devise tools that automatically summarize online privacy policies for web users. One such tool is our free and publicly available browser extension, PrivacyCheck. In this paper, we enhance PrivacyCheck by adding a competitor analysis component: a part of PrivacyCheck that recommends other organizations in the same market sector with better privacy policies. We also monitored the usage patterns of about a thousand actual PrivacyCheck users, making this the first work to track the usage and traffic of an ML-based privacy analysis tool. Results show that: (1) a considerable number of privacy policy URLs are checked repeatedly by the user base; (2) users are particularly interested in the privacy policies of software services; and (3) PrivacyCheck increased the number of times a user consults privacy policies by 80%. Our work demonstrates the potential of ML-based privacy analysis tools and sheds light on how these tools are used in practice to give users actionable knowledge they can use to proactively protect their privacy.
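Purely as an illustration of what a competitor analysis component might look like (this is not PrivacyCheck's actual model, data, or API), the sketch below recommends a same-sector organization with a better privacy score; all organization names, sectors, and scores are invented for the example.

from dataclasses import dataclass

# Hypothetical per-sector registry of organizations and privacy scores
# (0-100, higher = more privacy-friendly). A tool like PrivacyCheck derives
# such scores with trained ML models; here they are hard-coded for illustration.
SECTOR_SCORES = {
    "software services": {"ExampleCloud": 72, "AcmeApps": 55, "ZetaSoft": 88},
}

@dataclass
class PolicyResult:
    organization: str
    sector: str
    score: int

def recommend_competitor(result: PolicyResult):
    # Return a same-sector organization with a strictly better score, if any.
    peers = SECTOR_SCORES.get(result.sector, {})
    better = {org: s for org, s in peers.items()
              if s > result.score and org != result.organization}
    return max(better, key=better.get) if better else None

# Example: a user checks AcmeApps' policy and is pointed to a better-scoring peer.
checked = PolicyResult("AcmeApps", "software services", 55)
print(recommend_competitor(checked))   # -> "ZetaSoft"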