Scalable Privacy Analysis - October 2022
PI(s), Co-PI(s), Researchers:
- Serge Egelman (ICSI)
- Narseo Vallina-Rodriguez (IMDEA)
- Primal Wijesekera (ICSI)
HARD PROBLEM(S) ADDRESSED
Scalability and Composability, Policy-Governed Secure Collaboration, Metrics
PUBLICATIONS
- Presented:
Noura Alomar and Serge Egelman. Developers Say the Darnedest Things: Privacy Compliance Processes Followed by Developers of Child-Directed Apps. In Proceedings on Privacy Enhancing Technologies (PoPETS), 2022(4).
- Accepted:
C. Gilsenan, F. Shakir, N. Alomar, and S. Egelman. Security and Privacy Failures in Popular 2FA Apps. Proceedings of the 2023 USENIX Security Symposium.
- Submitted:
Allan Lyons, Julien Gamba, Austin Shawaga, Joel Reardon, Juan Tapiador, Serge Egelman, and Narseo Vallina-Rodriguez. Oh the Places Your Logs May Go! Measuring the Logging of Sensitive Data in the Android Ecosystem. Submitted to USENIX Security '23.
- Submitted:
Noura Alomar, Yulie Park, Frank Li, Primal Wijesekera, and Serge Egelman. Understanding Organizational Vulnerability Remediation Processes. Submitted to USENIX Security '23.
- Submitted:
Nikita Samarin, Shayna Kothari, Zaina Siyed, Oscar Bjorkman, Reena Yuan, Primal Wijesekera, Noura Alomar, Jordan Fischer, Chris Hoofnagle, and Serge Egelman. Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA). Submitted to PETS 2023.
KEY HIGHLIGHTS
- Privacy compliance processes. At the 22nd Privacy Enhancing Technologies Symposium (PETS), we presented a paper examining privacy compliance processes followed by developers of child-directed mobile apps. PETS brings together privacy experts from around the world to discuss recent advances and new perspectives on research in privacy technologies.
- Developers Say the Darnedest Things: Privacy Compliance Processes Followed by Developers of Child-Directed Apps. Abstract: We investigate the privacy compliance processes followed by developers of child-directed mobile apps. While children's online privacy laws have existed for decades in the US, prior research found relatively low rates of compliance. Yet, little is known about how compliance issues come to exist and how compliance processes can be improved to address them. Our results, based on surveys (n=127) and interviews (n=27), suggest that most developers rely on app markets to identify privacy issues, they lack complete understandings of the third-party SDKs they integrate, and they find it challenging to ensure that these SDKs are kept up-to-date and privacy-related options are configured correctly. As a result, we find that well-resourced app developers outsource most compliance decisions to auditing services, and that smaller developers follow "best-effort" models, by assuming that their apps are compliant so long as they have not been rejected by app markets. We highlight the need for usable tools that help developers identify and fix mobile app privacy issues.
- Logging of sensitive data. In this study, we examined how sensitive data is logged in the Android ecosystem. Analysis of the data we collected revealed that various system components, including device drivers, write several types of sensitive data to the system log in violation of Google's policies. We also discovered that some user-installed apps log data inappropriately, including SDKs that log incoming and outgoing network traffic. This work was submitted to the 32nd USENIX Security Symposium (USENIX Security '23). This symposium brings together researchers, practitioners, system administrators, system programmers, and others interested in the latest advances in the security and privacy of computer systems and networks.
- Oh the Places Your Logs May Go! Measuring the Logging of Sensitive Data in the Android Ecosystem. Abstract: Android mobile phones offer a shared system that multiplexes all logged data from all system components, including both the operating system and the console output of all the apps that run on it. A security mechanism ensures that user-space apps can only read the log entries that they themselves made, though many "privileged" apps are exempt from this restriction. This includes preloaded system apps provided by Google, the phone manufacturer, the cellular carrier, as well as those sharing the same signature. Consequently, and explicitly for privacy reasons, Google advises developers to not log sensitive information to the system log. In this work, we examine the logging of sensitive data in the Android ecosystem. With a field study we show that most devices have some user identifying information in the logs. We show that the logging of "activity" names can inadvertently reveal information about users through their use of the apps. In a case study we find that Adobe's tracking SDK is, more than nine times in ten, configured in a debug setting that logs analytics network traffic to the shared log. We also test the default logging of personal identifiers for different smartphones, examine preinstalled apps with static analysis that access the system logs, and analyze the privacy policies of manufacturers that report collecting system logs.
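To make the measurement concrete, the sketch below shows one way a captured logcat dump could be scanned for values that look like personal identifiers. It is a minimal Python illustration; the regular expressions, identifier types, and the input file name ("logcat.txt") are assumptions for this example, not the instrumentation used in the study.

    # Minimal sketch: scan a captured logcat dump for values that look like
    # personal identifiers. The patterns and file name are illustrative only.
    import re

    PATTERNS = {
        "email":       re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "mac_address": re.compile(r"(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}"),
        "ipv4":        re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    }

    def scan_log(path="logcat.txt"):
        """Return {identifier type: set of distinct matches} found in the log."""
        hits = {name: set() for name in PATTERNS}
        with open(path, errors="replace") as fh:
            for line in fh:
                for name, pattern in PATTERNS.items():
                    hits[name].update(pattern.findall(line))
        return hits

    if __name__ == "__main__":
        for name, values in scan_log().items():
            print(f"{name}: {len(values)} distinct value(s)")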
- Third-party TOTP 2FA authenticator apps. Using our dynamic analysis tools, we examined the security of these apps' network backup mechanisms and reverse-engineered their security protocols. We uncovered numerous vulnerabilities, all of which involve TOTP secrets being stored insecurely or apps leaking information about their users, such as which online accounts they have. We have responsibly disclosed our results. This work has been accepted for presentation at USENIX Security 2023.
- Security and Privacy Failures in Popular 2FA Apps. Abstract: The Time-based One-Time Password (TOTP) algorithm is a 2FA method that is widely deployed because of its relatively low implementation costs and purported security benefits over SMS 2FA. However, users of TOTP 2FA apps face a critical usability challenge: maintain access to the secrets stored within the TOTP app, or risk getting locked out of their accounts. To help users avoid this fate, popular TOTP apps implement a wide range of backup mechanisms, each with varying security and privacy implications. In this paper, we define an assessment methodology for conducting systematic security and privacy analyses of the backup and recovery functionality of TOTP apps. We identified all general-purpose Android TOTP apps in the Google Play Store with at least 100k installs that implemented a backup mechanism (n = 22). Our findings show that most backup strategies end up placing trust in the same technologies they are meant to supersede: passwords and SMS. Many backup implementations shared personal user information with third parties, had serious flaws in the implementation and/or usage of cryptography, and allowed the app developers access to users' TOTP secrets. We present our findings and recommend ways to improve the security and privacy of TOTP 2FA app backup mechanisms.
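For context on what these apps protect, the following sketch computes a TOTP code as specified in RFC 6238: HMAC-SHA1 over a 30-second time counter, dynamically truncated to six digits. The Base32 secret shown is a made-up example value, not one drawn from any app we studied.

    # Minimal TOTP (RFC 6238) sketch: the per-account secret that 2FA apps
    # store and back up is the only input besides the current time.
    import base64, hashlib, hmac, struct, time

    def totp(base32_secret, digits=6, period=30, now=None):
        key = base64.b32decode(base32_secret, casefold=True)
        counter = int((time.time() if now is None else now) // period)
        mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F                  # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical secret; prints a 6-digit code

Because each code is derived entirely from the stored secret, any backup mechanism that exposes that secret, whether to a third party or to the app developer, undermines the second factor.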
- Organizational vulnerability remediation. For this work, we interviewed professionals involved in remediation processes to learn more about the steps organizations take to remediate the vulnerabilities they uncover. We found that remediation works best when 1) there is close coordination among everyone involved (internal and external testers, developers, managers, and security engineers), 2) decision makers have up-to-date information about their assets and code owners, and 3) those responsible for remediation receive technical instructions on how to carry it out. This work was submitted to USENIX Security '23.
- Understanding Organizational Vulnerability Remediation Processes. Abstract: Sustainable vulnerability remediation processes enable taking timely actions to handle any discovered vulnerability and prevent it from reappearing in the future. To remediate a vulnerability, organizations are assumed to make many strategic decisions, including how to evaluate its potential impact, identify all affected systems, find those who are capable of remediating, and validate the fix. These decisions require being able to contextualize the vulnerability at hand and have timely discussions with those involved in building, testing or managing the affected system(s). We evaluate the processes followed by organizations to remediate discovered vulnerabilities. We conducted interviews and a survey study with 21 and 41 security professionals who have been involved in remediation processes, respectively. We analyze the phases of the remediation process, including how vulnerabilities are triaged, reproduced, and remediated, as well as how vulnerability fixes are validated. We find that improving remediation processes hinges upon having better levels of coordination between all those involved (internal and external testers, developers, managers and security engineers), equipping decision makers with up-to-date information on their assets and code owners, and providing those responsible for doing so with technical remediation instructions. We provide recommendations for how vulnerability handling can be made easier for organizations.
- California Consumer Privacy Act (CCPA) compliance. We investigated CCPA compliance among 160 top-ranked Android app developers from the U.S. Google Play Store. The CCPA requires developers to provide accurate privacy notices and to respond to "right to know" requests by disclosing the personal information they have collected, used, or shared about consumers for a business or commercial purpose. We found that at least 39% of the apps we studied shared device-specific identifiers and at least 26% shared geolocation information with third parties without disclosing it in response to our request. This work was submitted to PETS 2023.
- Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA). Abstract: The California Consumer Privacy Act (CCPA) provides California residents with a range of enhanced privacy protections and rights. Our research investigated the extent to which Android app developers comply with the provisions of the CCPA that require them to provide consumers with accurate privacy notices and respond to "right to know" requests by disclosing personal information that they have collected, used, or shared about consumers for a business or commercial purpose. We compared the actual network traffic of 109 apps that we believe must comply with the CCPA to the data that apps state they collect in their privacy policies and the data contained in responses to "right to know" requests that we submitted to the app's developers. Of the 69 app developers who substantively replied to our requests, all but one provided specific pieces of personal data (as opposed to only categorical information). However, a significant percentage of apps collected information that was not disclosed, including identifiers (64 apps, 93%), geolocation data (25 apps, 36%), and customer records information (13 apps, 19%) among other categories. We discuss improvements to the CCPA that could help app developers comply with "right to know" and related regulations.
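As a rough illustration of the comparison underlying this study, the sketch below maps data types observed in an app's network traffic to CCPA categories and flags categories missing from the developer's disclosure. The category mapping and example values are hypothetical placeholders, not the paper's actual coding scheme.

    # Hypothetical sketch: flag CCPA categories observed in traffic but absent
    # from a developer's privacy policy or "right to know" response.
    CCPA_CATEGORY = {
        "advertising_id": "identifiers",
        "android_id":     "identifiers",
        "gps_lat_lon":    "geolocation",
        "email":          "customer_records",
    }

    def undisclosed(observed_fields, disclosed_categories):
        """Return CCPA categories seen in traffic but missing from the disclosure."""
        observed = {CCPA_CATEGORY[f] for f in observed_fields if f in CCPA_CATEGORY}
        return observed - set(disclosed_categories)

    print(undisclosed({"advertising_id", "gps_lat_lon"}, {"identifiers"}))
    # -> {'geolocation'}: collected but not disclosed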
COMMUNITY ENGAGEMENTS
- PI Egelman has been interviewed by several reporters about online privacy issues.
- The PoPETS paper, "Developers Say the Darnedest Things: Privacy Compliance Processes Followed by Developers of Child-Directed Apps," was presented at the 22nd Privacy Enhancing Technologies Symposium (PETS).
EDUCATIONAL ADVANCES:
- Several graduate and undergraduate students are participating in this research.
Groups: