Scalable Privacy Analysis - January 2022

PI(s), Co-PI(s), Researchers:

  • Serge Egelman (ICSI)
  • Narseo Vallina-Rodriguez (ICSI)
  • Primal Wijesekera (ICSI)
  • Abbas Razaghpannah (ICSI)

HARD PROBLEM(S) ADDRESSED
Scalability and Composability, Policy-Governed Secure Collaboration, Metrics

PUBLICATIONS

  • Nothing to report this quarter.

KEY HIGHLIGHTS

  • Root detection study:
    We have designed and implemented tools to taxonomize root, emulation, and debugging-environment detection methods; devised countermeasures to these methods so that we can test apps in our dynamic analysis environment; and planned a measurement study around our findings to examine how app behavior changes under different conditions. We have finished analyzing our test data from ~10k apps, are currently writing up the results, and expect to submit to a top-tier conference next quarter. (An illustrative sketch of the kind of detection check involved appears immediately below.)
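
    As context, here is a minimal, hypothetical sketch (plain Java, not the study's actual tooling) of the kind of root-detection heuristic such a taxonomy catalogues: looking for a su binary at well-known paths and probing the shell PATH. The specific paths and the "which su" probe are assumed common examples; countermeasures in a dynamic analysis environment would typically aim to make checks like these return benign results.

        import java.io.File;

        // Illustrative only: common root-detection heuristics of the kind a
        // taxonomy like the one described would catalogue. The paths and probes
        // below are assumed examples, not an exhaustive or authoritative list.
        public class RootDetectionSketch {

            // Locations where a "su" binary is commonly installed on rooted devices.
            private static final String[] SU_PATHS = {
                "/system/bin/su",
                "/system/xbin/su",
                "/sbin/su",
                "/data/local/bin/su",
            };

            // Heuristic 1: look for a su binary on disk.
            static boolean suBinaryPresent() {
                for (String path : SU_PATHS) {
                    if (new File(path).exists()) {
                        return true;
                    }
                }
                return false;
            }

            // Heuristic 2: run "which su" and see whether it produces any output.
            static boolean suOnPath() {
                try {
                    Process p = new ProcessBuilder("which", "su").start();
                    return p.getInputStream().read() != -1;  // any output means su resolved
                } catch (Exception e) {
                    return false;  // probe unavailable; treat as not rooted
                }
            }

            public static void main(String[] args) {
                System.out.println("Root indicators found: " + (suBinaryPresent() || suOnPath()));
            }
        }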

  • Studying developers:
    We recently concluded a study of developers of child-directed mobile apps, which we have submitted to PETS. In it, we conducted two surveys asking developers about their compliance practices, followed by interviews. Abstract:

    • We investigate the privacy compliance processes followed by developers of child-directed mobile apps. While children's online privacy laws have existed for decades in the US, prior research found relatively low rates of compliance. Yet, little is known about how compliance issues come to exist and how compliance processes can be improved to address them. Our results, based on surveys (n=127) and interviews (n=27), suggest that most developers rely on app markets to identify privacy issues, they lack complete understandings of the third-party SDKs they integrate, and they find it challenging to ensure that these SDKs are kept up-to-date and privacy-related options are configured correctly. As a result, we find that well-resourced app developers outsource most compliance decisions to auditing services, and that smaller developers follow "best-effort" models, by assuming that their apps are compliant so long as they have not been rejected by app markets. We highlight the need for usable tools that help developers identify and fix mobile app privacy issues.

  • Log study:
    We have been examining the personal data that ends up in system logs on mobile devices, and where that data goes from there. While user-installed apps are not supposed to be able to access this data, there exists an entire ecosystem of pre-installed apps and SDKs that creates severe security issues (e.g., there is no oversight over the apps pre-installed by manufacturers and carriers, many of which use the same privacy-invasive ad SDKs as user-installed third-party apps). We spent the past several months using WebAssembly to build a website that allows us to collect system logs from Android devices, search them for personal information, and report anonymous aggregate statistics, all from within the browser (so that no raw sensitive data ever leaves the user's browser); an illustrative sketch of the scanning approach appears at the end of this item. Our data collection site can be examined here: https://pages.cpsc.ucalgary.ca/~allan.lyons/webadb/participate.html

    We are now collecting data and expect to submit this paper to PETS or USENIX Security by the next deadline (June?).
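
    To illustrate the approach (not the site's actual code, which runs in the browser via WebAssembly), the sketch below scans log lines for a few assumed personal-data patterns and reports only aggregate counts, so raw matches are never retained or transmitted.

        import java.util.ArrayList;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;
        import java.util.regex.Pattern;

        // Minimal sketch with assumed detector patterns; the study's actual
        // pattern set and in-browser implementation differ.
        public class LogScanSketch {

            // Hypothetical detectors for personal data commonly found in logs.
            private static final Map<String, Pattern> DETECTORS = new LinkedHashMap<>();
            static {
                DETECTORS.put("email", Pattern.compile("[\\w.+-]+@[\\w-]+\\.[\\w.]+"));
                DETECTORS.put("mac_address", Pattern.compile("(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}"));
                DETECTORS.put("ipv4_address", Pattern.compile("\\b(?:\\d{1,3}\\.){3}\\d{1,3}\\b"));
            }

            // Returns counts per category; the matched values themselves are discarded.
            static Map<String, Integer> aggregate(List<String> logLines) {
                Map<String, Integer> counts = new LinkedHashMap<>();
                DETECTORS.keySet().forEach(k -> counts.put(k, 0));
                for (String line : logLines) {
                    for (Map.Entry<String, Pattern> d : DETECTORS.entrySet()) {
                        if (d.getValue().matcher(line).find()) {
                            counts.merge(d.getKey(), 1, Integer::sum);
                        }
                    }
                }
                return counts;
            }

            public static void main(String[] args) {
                List<String> sample = new ArrayList<>();
                sample.add("D/WifiService: connected, bssid=AA:BB:CC:DD:EE:FF");
                sample.add("I/AccountSync: syncing user jane.doe@example.com");
                System.out.println(aggregate(sample));  // {email=1, mac_address=1, ipv4_address=0}
            }
        }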

  • We received the Data Protection Research Award from the Spanish Data Protection Authority (AEPD).

  • We will receive the Data Protection Research Award from the French Data Protection Authority (CNIL); the award ceremony has been postponed (along with CPDP) to May.

COMMUNITY ENGAGEMENTS

  • PI Egelman has been interviewed by several reporters about online privacy issues.

EDUCATIONAL ADVANCES

  • Several additional undergraduates are now participating in this research.