Understanding Developers' Reasoning about Privacy and Security - UMD - January 2015
Public Audience
PI(s): Katie Shilton, Elaine Shi, Charalampos Papamanthou, Mohit Tiwari
Researchers: Youngsam Park, Donal Heidenblad
PROJECT GOAL
Our goal is to discover, understand, and quantify the challenges that developers face in writing secure and privacy-preserving programs. Several research thrusts will enable this goal.
Qualitative studies of developers will uncover cultural and workplace dynamics that encourage or discourage privacy and security by design, and experiments with alternative design schemas will test how to facilitate adoption.
Understanding design settings:
Interviews with, and observations of, application developers will uncover factors within design settings (such as work practices, institutional arrangements, or social norms) that encourage developers to value privacy and security design and to adopt techniques that protect privacy and security. We will interview professional developers in a diversity of development settings (small and large companies, contractors, and independent developers) in Washington, DC and Silicon Valley, and we will observe design meetings at companies as well as hackathons. Analyzing field notes and interview transcripts will reveal how developers discover and learn about new privacy and security techniques, what encourages developers to adopt new privacy and security practices, and how application developers make choices among privacy, security, and other priorities.
Facilitating adoption:
Techniques such as information flow control can offer strong privacy guarantees but have failed to gain traction among developers. Concepts such as lattices of security labels and scrubbing implicit-flow leaks from programs require developers to learn security theory in order to work correctly on an information-flow-secure platform (e.g., Jif, Flume). We have developed an alternative scheme called Bubbles that requires developers to partition their apps by functionality (analogous to a model-view-controller pattern) instead of using labels and information-flow-secure compilers. We will conduct developer studies using A/B testing to compare the ease of programming with information flow against our programming model. Similarly, we will study design patterns for security features in applications, such as privilege separation within applications, key management in distributed applications, and mandatory access control policies for app components. These design patterns will enable even non-security-expert developers to write secure and private applications by default.
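To illustrate the kind of label reasoning that information-flow platforms ask of developers, consider a minimal sketch in plain Java. Representing a label as a set of confidentiality tags and checking flows with a `canFlowTo` method are illustrative assumptions for exposition; they are not the actual APIs of Jif or Flume.

```java
import java.util.Set;

public class LabelCheck {
    // A label is modeled here as a set of confidentiality tags. Data may
    // flow from a source to a destination only if the destination's label
    // dominates the source's label (i.e., includes every source tag) --
    // the lattice ordering that IFC developers must reason about.
    static boolean canFlowTo(Set<String> source, Set<String> destination) {
        return destination.containsAll(source);
    }

    public static void main(String[] args) {
        Set<String> alicePrivate = Set.of("alice");
        Set<String> aliceAndBob  = Set.of("alice", "bob");

        // Flowing Alice's data into a context readable by Alice and Bob
        // raises its confidentiality, so the flow is permitted.
        System.out.println(canFlowTo(alicePrivate, aliceAndBob));

        // The reverse flow would leak Bob-tagged data into an Alice-only
        // context, so it is rejected.
        System.out.println(canFlowTo(aliceAndBob, alicePrivate));
    }
}
```

Even this toy version shows why the Bubbles approach trades label checks for functional partitioning: developers must get the lattice ordering right at every flow point.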
HARD PROBLEM(S) ADDRESSED
Development of models of human behavior that enable the design, modeling, and analysis of systems with specified security properties.
Human behavior
PUBLICATIONS
Papers published in this quarter as a result of this research. Include title, author(s), venue published/presented, and a short description or abstract. Identify which hard problem(s) the publication addressed. Papers that have not yet been published should be reported in region 2 below.
Krontiris, I., Langheinrich, M. & Shilton, K. (2014). Trust and Privacy in Mobile Experience Sharing - Future Challenges and Avenues for Research. IEEE Communications, August 2014. http://cps-vo.org/node/17109
Martin, K. and Shilton, K. (in press) "Why Experience Matters To Privacy: How Context-Based Experience Moderates Consumer Privacy Expectations for Mobile Applications." Journal of the Association for Information Science & Technology.
ACCOMPLISHMENT HIGHLIGHTS
Work during the current quarter included interviews with mobile developers and development of the Bubbles contextual privacy software.
We have implemented a simplified version of the Bubbles platform, including the Bubbles trusted viewer and the centralized database server. The Bubbles trusted viewer resides on a user's Android device and provides other applications with a trusted platform service. With the Bubbles platform, a user groups various application data into a single Bubble based on context, and can then share that Bubble only with the people selected at the time of the Bubble's creation. The Bubbles platform therefore prevents malicious applications from sharing user data with anyone not authorized by the data owner.
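The sharing rule described above can be sketched as a simplified in-memory model. The `Bubble` class and its method names below are hypothetical illustrations of the policy (membership fixed at creation, sharing denied to non-members), not the platform's actual API.

```java
import java.util.HashSet;
import java.util.Set;

public class Bubble {
    // The recipient list is fixed when the Bubble is created and cannot
    // be widened later -- the core of the sharing policy.
    private final Set<String> members;
    private final Set<String> data = new HashSet<>();

    public Bubble(Set<String> members) {
        this.members = Set.copyOf(members);   // immutable snapshot
    }

    // Any application may add its data into the Bubble.
    public void put(String item) {
        data.add(item);
    }

    // Sharing succeeds only for recipients chosen at creation time;
    // a malicious app cannot exfiltrate data to an unauthorized party.
    public Set<String> shareWith(String recipient) {
        if (!members.contains(recipient)) {
            throw new SecurityException("recipient not in this Bubble");
        }
        return Set.copyOf(data);
    }
}
```

In the real system this check is performed by the trusted viewer rather than by app code, so applications never hold the data and the policy decision simultaneously.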
We have prepared a user study to measure developers' reasoning about privacy and security. With non-security-expert undergraduate students as participants, we will measure how well they understand the Bubbles platform's security model and how easily they can convert a non-secure Android application into a secure, Bubbles-compatible version. To support this test, we have implemented a simple Android application in which a user can write a text memo and store it in a local database. The students will be provided with the Bubbles trusted viewer, the centralized database server, and the simple Android application, and will be asked to implement the missing parts necessary for compatibility with the Bubbles platform.
We converted the text memo app ourselves with minimal changes (fewer than 200 lines of source code). Based on this experience, we have prepared a lab document explaining the basic concepts of the Bubbles platform and the APIs/interfaces the students will use to convert the text memo app into a Bubbles-compatible version. The user study will measure differences in the lines of source code required for the implementation as well as the time taken, and a survey will ask students to evaluate the tool's usability.
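A purely illustrative sketch of the kind of change the conversion exercise involves: routing memo storage through a per-Bubble store instead of one flat local database. The class and method names below are hypothetical; the actual Bubbles APIs are described in the lab document.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class MemoApp {
    // Before conversion: all memos live in one flat local store,
    // accessible to any code that can reach the app's database.
    private final List<String> flatStore = new ArrayList<>();

    // After conversion: memos are keyed by the Bubble (context) they
    // belong to, so the trusted viewer can enforce each Bubble's
    // sharing policy separately.
    private final Map<String, List<String>> bubbleStore = new HashMap<>();

    public void saveMemoFlat(String memo) {
        flatStore.add(memo);
    }

    public void saveMemoInBubble(String bubbleId, String memo) {
        bubbleStore.computeIfAbsent(bubbleId, k -> new ArrayList<>())
                   .add(memo);
    }

    public List<String> memosIn(String bubbleId) {
        return bubbleStore.getOrDefault(bubbleId, List.of());
    }
}
```

The point of the exercise is that the delta stays small: storage calls change, but the memo-editing logic is untouched.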