Do Voters Notice Review Screen Anomalies?
Michael D. Byrne
Departments of Psychology and Computer Science
Rice University, Houston, TX
byrne@acm.org
http://chil.rice.edu/
Usability and Security
! Consider the amount of time and energy spent on voting system security, for example:
  • California's Top-to-Bottom review
  • Ohio's EVEREST review
  • Many other papers, past and present, at EVT/WOTE
! This despite a lack of conclusive evidence that any major U.S. election has been stolen due to security flaws in DREs
  • Though of course this could have happened
! But we know major U.S. elections have turned on voting system usability
Usability and Security
! There are numerous other examples of this
  • See the 2008 Brennan Center report
! This is not to suggest that usability is more important than security
  • Though we'd argue that it does deserve equal time, which has not been the case
! Furthermore, usability and security are intertwined
  • The voter is the first line of defense against malfunctioning and/or malicious systems
  • Voters may be able to detect when things are not as they should be
    ✦ The oft-given "check the review screen" advice
Usability and Review Screens
! Other usability findings from our previous work regarding DREs vs. older technologies:
  • Voters are not more accurate voting with a DRE
  • Voters are not faster voting with a DRE
  • However, DREs are vastly preferred to older voting technologies
! But do voters actually check the review screen?
  • Or rather, how closely do they check?
  • The assumption has certainly been that voters do
! Everett (2007) research
  • Two experiments on review screen anomaly detection using the VoteBox DRE
Everett (2007)
! Results of two studies on anomaly detection
  • First study: 32% noticed the anomalies
  • Second study: 37% noticed the anomalies
! Also examined what other variables did and did not influence detection performance
! Affected detection performance:
  • Time spent on the review screen
  • Whether or not voters were given a list of candidates
! Did not affect detection performance:
  • Number of anomalies
  • Location of the anomalies on the ballot
Follow-up Study
! Explicit instructions
  • Voting instructions, both prior to and on the review screen, explicitly warned voters to check the accuracy of the review screen
! Review screen interface alterations
  • Undervotes were highlighted in a bright red-orange color
  • Party affiliation markers were added to candidate names on the review screen
Results: Anomaly Detection
! 50% of voters detected the review screen anomalies
  • 95% confidence interval: 40.1% to 59.9% (see the sketch after this slide)
  • A clear improvement over Everett (2007), but still less than ideal
! So, what drove anomaly detection?
  • Time spent on the review screen (p = .003)
  • Anomaly type (p = .02)
  • Self-reported care in checking the review screen (p = .04)
! Non-significant factors
  • Age, education, computer experience, news following, personality variables
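As a rough illustration of where an interval of 40.1% to 59.9% around a 50% detection rate could come from, here is a minimal Python sketch that reproduces it as a Wald (normal-approximation) confidence interval for a binomial proportion. The sample size of 98 is an assumption back-calculated from the interval width; the slide does not state the number of voters or the method actually used.

# A minimal sketch (assumption): treating the 50% detection rate as a binomial
# proportion and computing a Wald (normal-approximation) 95% confidence interval.
# n = 98 is inferred from the reported interval width, not stated on the slide.
from math import sqrt

p_hat = 0.50   # observed proportion of voters who detected the anomalies
n = 98         # assumed sample size (back-calculated so the interval matches)
z = 1.96       # standard normal critical value for a 95% interval

half_width = z * sqrt(p_hat * (1 - p_hat) / n)
lower, upper = p_hat - half_width, p_hat + half_width
print(f"95% CI: {lower:.1%} to {upper:.1%}")  # prints: 95% CI: 40.1% to 59.9%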
How It All Got Started
! I have been doing interdisciplinary work since I was an undergraduate (dual major in Psychology & Engineering)
  • Master's in Computer Science along with a Psychology Ph.D.
  • Applied Cognitive Science: most of my work is in computational modeling of human cognition for HF/HCI
! But I never thought about voting until:
  • I gave a talk at Rice in CS about the importance of usability
  • Dan Wallach (CS security type) called me: even if we can engineer a voting system that's secure, if nobody can actually use it, that won't solve the problem
! Joined ACCURATE and have been doing voting ever since