Understanding Developers' Reasoning about Privacy and Security - UMD - July 2015
Public Audience
Purpose: To highlight project progress. Information is generally at a high level that is accessible to the interested public. All information contained in the report (regions 1-3) is a Government Deliverable/CDRL.
PI(s): Michelle Mazurek, Charalampos Papamanthou, Elaine Shi, Mohit Tiwari
Researchers: Youngsam Park, Casen Hunger
PROJECT GOAL
Our goal is to discover, understand, and quantify the challenges that developers face in writing secure and privacy-preserving programs. Several research thrusts support this goal.
Qualitative studies of developers will uncover cultural and workplace dynamics that encourage or discourage privacy and security by design, and experiments with alternative design schemas will test how to facilitate adoption.
Understanding design settings:
Interviews with, and observations of, application developers will uncover factors within design settings (such as work practices, institutional arrangements, or social norms) that encourage developers to value privacy and security in design and to adopt techniques to protect privacy and security. We will interview professional developers across a range of development settings (small and large companies, contractors, and independent developers) in Washington DC and Silicon Valley, and we will observe design meetings at companies as well as hackathons. Analyzing field notes and interview transcripts will reveal how developers discover and learn about new privacy and security techniques, what encourages them to adopt new privacy and security practices, and how they make choices among privacy, security, and other priorities.
Understanding users' behavior:
To help developers make the right choices when writing privacy-preserving applications, we need to understand users' privacy needs. We plan to observe and document how users make decisions about maintaining their privacy. Toward that goal, we will conduct interviews and analyze how users cluster and share personal data such as emails or photo folders.
Facilitating adoption:
Techniques such as information-flow control can offer strong privacy guarantees but have failed to gain traction among developers. Concepts such as lattices of security labels and scrubbing implicit-flow leaks from programs require developers to learn security theory before they can program correctly on an information-flow-secure platform (e.g., Jif, Flume). We have developed an alternative scheme that requires developers to partition their apps based on functionality (analogous to a model-view-controller pattern) instead of using labels and information-flow-secure compilers. We will conduct developer studies using A/B testing to compare the ease of programming with information flow versus our programming model. Similarly, we will study design patterns for security features in applications, such as privilege separation, key management in distributed applications, and mandatory access control policies for app components. These design patterns will enable even non-security-expert developers to write secure and private applications by default.
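As a concrete illustration of the contrast, consider the following minimal sketch; every class, interface, and method name here is hypothetical and purely illustrative, not our platform's actual API. Rather than annotating data with security labels, the developer partitions the app by functionality: a data-facing component that may read sensitive records and a display-facing component that only ever receives derived output, with a trusted runtime mediating between the two.

// Minimal illustrative sketch (hypothetical names, not our platform's API).
// With label-based IFC (Jif/Flume style), the developer annotates data with
// security labels and must reason about implicit flows, e.g. (Jif, not Java):
//     int{Alice->Bob} balance;
// With functionality-based partitioning, the developer instead splits the app
// into components by role, and a trusted runtime decides which component may
// touch sensitive data -- app code carries no labels at all.

import java.util.List;

/** Data-facing component: the only part allowed to read raw user records. */
interface RecordStore {
    List<String> loadRecords(String userId);
}

/** Display-facing component: renders results but never sees raw records. */
interface RecordView {
    void show(String summary);
}

/** Controller glues the two; the (hypothetical) platform would run each
 *  component in its own sandbox and mediate the calls between them. */
final class HealthApp {
    private final RecordStore store; // runs inside the data sandbox
    private final RecordView view;   // runs outside, receives only derived output

    HealthApp(RecordStore store, RecordView view) {
        this.store = store;
        this.view = view;
    }

    void showRecordCount(String userId) {
        int count = store.loadRecords(userId).size(); // computed in-sandbox
        view.show("You have " + count + " records");  // only the summary leaves
    }

    public static void main(String[] args) {
        RecordStore store = user -> List.of("visit-2014-03", "visit-2015-01");
        RecordView view = System.out::println;
        new HealthApp(store, view).showRecordCount("alice");
    }
}

The intended benefit, which the A/B study will measure, is that getting the partition right requires only ordinary software-architecture skills, whereas label-based information-flow control requires security expertise to avoid leaks through implicit flows.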
HARD PROBLEM(S) ADDRESSED
Human behavior
PUBLICATIONS
Papers published in this quarter as a result of this research. Include title, author(s), venue published/presented, and a short description or abstract. Identify which hard problem(s) the publication addressed. Papers that have not yet been published should be reported in region 2 below.
Krontiris, I., Langheinrich, M., & Shilton, K. (2014). "Trust and Privacy in Mobile Experience Sharing: Future Challenges and Avenues for Research." IEEE Communications Magazine, August 2014. http://cps-vo.org/node/17109
Martin, K., & Shilton, K. (in press). "Why Experience Matters to Privacy: How Context-Based Experience Moderates Consumer Privacy Expectations for Mobile Applications." Journal of the Association for Information Science & Technology.
ACCOMPLISHMENT HIGHLIGHTS
We have completed interviews with mobile application developers focused on cultural and workplace dynamics; analysis of the rich qualitative data from these interviews will take place in the coming months. We have implemented a version of the Bubbles contextual privacy software and have signed up a third-party commercial application to pilot the evaluation of the Bubbles platform. The third-party application -- Bluehub Health -- enables patients to obtain copies of their medical records from hospitals and then share them with other doctors on a per-encounter basis. Bluehub Health provides a compelling use case and, at the same time, a good stress test for Bubbles' distributed mandatory access control system. In addition, we have trained another team of third-party developers to use Bubbles' API; this team is porting Bluehub Health to Bubbles now and can be hired by future third-party app developers to port their apps to Bubbles quickly.
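To illustrate the per-encounter sharing model that Bluehub Health exercises, the following minimal sketch shows one way such a policy could be expressed; the types and methods are hypothetical placeholders, not the actual Bubbles API. Each encounter acts as its own access-control bubble: the patient shares records into a specific encounter, and a doctor can read a record only through an encounter that names them.

// Minimal sketch of per-encounter sharing (hypothetical types, not the Bubbles API).
// Records shared into one encounter are never visible through another encounter.

import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

final class Encounter {
    final String patient;
    final Set<String> participants = new HashSet<>(); // doctors named in this encounter
    final Set<String> records = new HashSet<>();      // records the patient shared here

    Encounter(String patient, String doctor) {
        this.patient = patient;
        participants.add(doctor);
    }
}

final class EncounterStore {
    private final Map<String, Encounter> encounters = new HashMap<>();

    /** Patient opens a new encounter with a single doctor. */
    void open(String encounterId, String patient, String doctor) {
        encounters.put(encounterId, new Encounter(patient, doctor));
    }

    /** Patient shares a record into one encounter only. */
    void share(String encounterId, String recordId) {
        encounters.get(encounterId).records.add(recordId);
    }

    /** The access check: a doctor may read a record only via an encounter
     *  that lists that doctor as a participant. */
    boolean canRead(String doctor, String recordId) {
        return encounters.values().stream()
                .anyMatch(e -> e.participants.contains(doctor)
                            && e.records.contains(recordId));
    }

    public static void main(String[] args) {
        EncounterStore store = new EncounterStore();
        store.open("enc-1", "alice", "dr-bob");
        store.share("enc-1", "record-42");
        System.out.println(store.canRead("dr-bob", "record-42"));   // true
        System.out.println(store.canRead("dr-carol", "record-42")); // false
    }
}

In Bubbles itself, checks of this kind are meant to be enforced by the platform's distributed mandatory access control layer rather than by each application.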