USE: User Security Behavior (CMU/Berkeley/University of Pittsburgh Collaborative Proposal) - October 2016
Public Audience
Purpose: To highlight progress. Information is presented at a high level that is accessible to the interested public.
PI(s): A. Acquisti, L.F. Cranor, N. Christin, R. Telang
Researchers: Alain Forget (CMU), Serge Egelman (Berkeley), and Scott Beach (Univ of Pittsburgh)
1) HARD PROBLEM(S) ADDRESSED (with short descriptions)
This refers to the Hard Problems released in November 2012.
The Security Behavior Observatory addresses the hard problem of "Understanding and Accounting for Human Behavior" by collecting data directly from people's own home computers, thereby capturing people's computing behavior "in the wild". These data are the closest to ground truth about users' everyday security and privacy challenges that the research community has ever collected. We expect the insights discovered by analyzing this data to profoundly impact multiple research domains, including but not limited to behavioral sciences, computer security & privacy, economics, and human-computer interaction.
2) PUBLICATIONS
- C. Canfield, B. Fischhoff, A. Davis, A. Forget, S. Pearman, and J. Thomas. 2016. Comparing Phishing Vulnerability in the Lab to the Real World. Submitted to CHI 2017; pending review.
- S. Pearman, A. Kumar, N. Munson, C. Sharma, L. Slyper, J. Thomas, L. Bauer, N. Christin, and S. Egelman. 2016. Risk Compensation in Home-User Computer Security Behavior: A Mixed-Methods Exploratory Study. Poster and extended abstract presented at the 12th Symposium on Usable Privacy and Security (SOUPS 2016), Denver, CO, June 22-24, 2016. Presented by Sarah Pearman. Received a SOUPS '16 Distinguished Poster Award.
- A. Forget, S. Pearman, J. Thomas, A. Acquisti, N. Christin, L.F. Cranor, S. Egelman, M. Harbach, and R. Telang. 2016. Do or Do Not, There Is No Try: User Engagement May Not Improve Security Outcomes. In Proceedings of the 12th Symposium on Usable Privacy and Security (SOUPS '16), Denver, CO, June 22-24, 2016. Presented by Alain Forget.
- S. Egelman, M. Harbach, and E. Peer. 2016. Behavior Ever Follows Intention? A Validation of the Security Behavior Intentions Scale (SeBIS). In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '16). Presented by Serge Egelman. Received a CHI '16 Best Paper Honorable Mention award.
- A. Forget, S. Komanduri, A. Acquisti, N. Christin, L.F. Cranor, and R. Telang. 2014. Security Behavior Observatory: Infrastructure for Long-term Monitoring of Client Machines. Carnegie Mellon University CyLab Technical Report CMU-CyLab-14-009. https://www.cylab.cmu.edu/research/techreports/2014/tr_cylab14009.html (accessed 2014-09-05)
- A. Forget, S. Komanduri, A. Acquisti, N. Christin, L.F. Cranor, and R. Telang. 2014. Building the Security Behavior Observatory: An Infrastructure for Long-term Monitoring of Client Machines. Invited talk and poster at the IEEE Symposium and Bootcamp on the Science of Security (HotSoS) 2014.
In September 2016, we submitted a paper to the CHI 2017 conference. This paper (Canfield et al., pending review) replicated a past study that had used signal detection theory to assess participants' vulnerability to phishing attacks, and it used SBO data to test the construct validity and predictive validity of the measures from that study. The previous study had relied upon a survey administered to online participants. The paper pending review reports the results of administering the same survey to SBO participants and comparing their survey responses to other measures of the security of their computers, including their overall malware infection rates. The paper reports some evidence of construct validity but no evidence of predictive validity: SBO participants' signal detection measures did not appear to be related to measures of security outcomes (e.g., malware infections or visits to blacklisted URLs).
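For context, signal detection theory characterizes each participant by two measures: sensitivity (d'), the ability to distinguish phishing from legitimate emails, and a response criterion (c), the overall tendency to label emails as phishing. The sketch below is our own minimal illustration of how these measures are conventionally computed from a phishing identification task; it is not the paper's analysis code, and the example counts are hypothetical.

```python
# Minimal signal-detection sketch (illustrative only; counts are hypothetical).
from statistics import NormalDist

def signal_detection_measures(hits, misses, false_alarms, correct_rejections):
    """Return (d_prime, criterion_c) from raw response counts.

    A log-linear correction (add 0.5 to each cell) guards against infinite
    z-scores when a hit or false-alarm rate is exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # sensitivity: phishing vs. legitimate discrimination
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias: tendency to call emails "phishing"
    return d_prime, criterion

# Example: a participant who flagged 18 of 20 phishing emails but also
# flagged 6 of 20 legitimate ones.
print(signal_detection_measures(hits=18, misses=2, false_alarms=6, correct_rejections=14))
```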
In June 2016, we presented a paper and a poster at the SOUPS 2016 conference. The paper (Forget et al., 2016) combined the quantitative data collected via this infrastructure with qualitative findings from interviews with our research participants. As we continue to build more secure, reliable, and robust infrastructure, we will acquire more and better data, resulting in more publications.
Our SOUPS 2016 poster (Pearman et al., 2016) examined the application of risk homeostasis theory (a theory commonly applied to safety and traffic science) to end-user computer security. For this analysis, we used a combination of SBO data and survey self-reports to examine whether users who had antivirus software installed were more likely to engage in other risky behaviors as a result of believing that they were protected by their security software.
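To make the analysis concrete, the sketch below shows, with entirely hypothetical counts, the kind of simple group comparison that risk homeostasis motivates: testing whether participants with antivirus software exhibit risky behaviors at a higher rate than those without. It illustrates the general approach only and is not the poster's actual analysis.

```python
# Illustrative two-proportion z-test (hypothetical counts, not SBO data).
from statistics import NormalDist

def two_proportion_z_test(risky_av, n_av, risky_no_av, n_no_av):
    """Return (z, two-sided p) comparing risky-behavior rates across groups."""
    p1, p2 = risky_av / n_av, risky_no_av / n_no_av
    pooled = (risky_av + risky_no_av) / (n_av + n_no_av)
    se = (pooled * (1 - pooled) * (1 / n_av + 1 / n_no_av)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: 30 of 120 participants with antivirus vs. 18 of 100
# without it exhibited at least one risky behavior during the observation window.
print(two_proportion_z_test(30, 120, 18, 100))
```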
As mentioned in the previous report, our collaborator at UC Berkeley presented a paper at the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI 2016) in May, in which behavioral data was analyzed to validate the Security Behavior Intentions Scale (SeBIS). This publication emerged from preliminary analysis of SBO data and additional online studies of security-related behaviors run at UC Berkeley. We plan to build on this preliminary work by conducting additional research on the relationships between SeBIS scores and security-related behaviors observed in SBO client data.
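As a purely illustrative sketch (with hypothetical numbers, not data from the CHI 2016 paper or the SBO), such a validation amounts to correlating a self-reported SeBIS subscale score with a matching behavioral measure extracted from client data, for example how consistently a participant keeps software up to date:

```python
# Illustrative only: hypothetical per-participant values.
from statistics import correlation  # Pearson's r (Python 3.10+)

sebis_updating_score = [3.2, 4.5, 2.1, 4.8, 3.9, 2.7]        # self-reported "updating" subscale scores
observed_update_rate = [0.40, 0.85, 0.30, 0.90, 0.70, 0.35]  # fraction of installed software kept up to date

print(correlation(sebis_updating_score, observed_update_rate))
```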
3) KEY HIGHLIGHTS
- In September 2016, we submitted a paper to the CHI 2017 conference. This paper (Canfield et al., pending review) sought to replicate and elaborate upon past work that had used signal detection theory to assess participants' vulnerability to phishing attacks. The previous study had relied upon a survey administered to online participants; in this project, we administered the same survey to SBO participants and used SBO field data to test the construct validity and predictive validity of the signal detection measures from that study. The paper reports some evidence of construct validity but no evidence of predictive validity: SBO participants' signal detection measures did not appear to be related to measures of security outcomes (e.g., malware infections or visits to blacklisted URLs).
- We completed development of the next version of our client software and are deploying it to our study participants in phases over the next few months. This version primarily focuses on improving the stability and performance of the client sensors and of data transmission to our servers. We also made significant changes to the server applications so they can more easily handle a larger number of participants, both in terms of the computing resources consumed and the administrative effort required to enroll and manage participants. A generic sketch of the kind of transmission loop these improvements target appears after this list.
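Since the SBO client and server code are not public, the following is only a generic sketch of the kind of batched, retried upload loop that the transmission improvements described above would target; the endpoint URL, batch size, and record format are all hypothetical.

```python
# Generic sketch of batched sensor-data upload with retry and backoff
# (hypothetical endpoint and parameters; not the SBO implementation).
import json
import time
import urllib.request

UPLOAD_URL = "https://sbo-collection.example.org/upload"  # hypothetical endpoint
BATCH_SIZE = 500
MAX_RETRIES = 5

def upload_batches(records):
    """Send locally buffered sensor records in batches, retrying with exponential backoff."""
    for start in range(0, len(records), BATCH_SIZE):
        batch = records[start:start + BATCH_SIZE]
        body = json.dumps(batch).encode("utf-8")
        for attempt in range(MAX_RETRIES):
            try:
                req = urllib.request.Request(
                    UPLOAD_URL, data=body,
                    headers={"Content-Type": "application/json"})
                with urllib.request.urlopen(req, timeout=30) as resp:
                    if resp.status == 200:
                        break  # batch accepted; move on to the next batch
            except OSError:
                pass  # network or HTTP error; fall through to backoff and retry
            time.sleep(2 ** attempt)  # exponential backoff between attempts
        else:
            raise RuntimeError("batch upload failed after retries")
```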