Summer Research Institute

Usable Security –
Are we nearly there yet?
M. Angela Sasse
Head of Information Security Research
Director, Research Institute in Science of Cyber Security
University College London, UK
History (Ancient)
1. The system must be substantially, if not mathematically, undecipherable;
2. The system must not require secrecy and can be stolen by the enemy
without causing trouble;
3. It must be easy to communicate and remember the keys without
requiring written notes, it must also be easy to change or modify
the keys with different participants;
4. The system ought to be compatible with telegraph communication;
5. The system must be portable, and its use must not require more
than one person;
6. Finally, regarding the circumstances in which such system is
applied, it must be easy to use and must neither require stress of
mind nor the knowledge of a long series of rules.
Auguste Kerckhoffs, ‘La cryptographie militaire’,
Journal des sciences militaires, vol. IX, pp. 5–38, Jan. 1883, pp. 161–191, Feb. 1883.
History (Middle Ages)
“It is essential that the human interface be designed
for ease of use, so that users routinely and
automatically apply the protection mechanisms
correctly. Also, to the extent that the user’s mental
image of his protection goals matches the
mechanisms he must use, mistakes will be minimized.”
J. H. Saltzer & M. D. Schroeder, ‘The protection of information in computer systems’
Proceedings of the IEEE, vol. 63, no. 9, pp. 1278-1308, Sept. 1975
History (Recent)
• Study on escalating cost
of password resets at BT
– too high workload
– leads users to shortcut security mechanisms
– Users don’t understand
threats and risks
• Also 1999: Whitten &
Tygar, “Why Johnny can’t encrypt”
Adams & Sasse CACM 1999
What Has Happened Over The Past Decade?
– Lots, arguably:
• ACM SOUPS (Symposium on Usable Security and
Privacy) since 2004
• SHB (Security & Human Behaviour) since 2008
• Papers in CHI, CCS, Usenix, NSPW …
• Books: Cranor & Garfinkel, Shostack, Lacey
• University modules on usable security
• US National Academy of Sciences Workshop on
Usable Security and Privacy 2009
And – is security more usable?
Exhibit 1: Authentication
Exhibit 2: Access Control
Exhibit 3: Encryption
Exhibit 4: CAPTCHAs
• Lots of alternative
authentication proposals
• Mostly graphical;
example: Passfaces
• Very memorable
• … until you have more
than one Passfaces
password (Everitt et al.,
CHI 2009)
• Selection biases result in
low guessing difficulty
Wiedenbeck et al. IJHCS 2005
Draw-a-Secret & BDAS
Yan et al.
More ‘usable’ authentication ...
• Authentication via Rorschach inkblot tests
• Singing your password (Reynaud et al., NSPW)
• Thinking your password - free EEG thrown in
(Thorpe et al., NSPW 2005) – now possible with
Emotiv helmet?
• More biometrics (some dubious, some useful)
• Ringing up your friends in the middle of the night
to provide you with previously entrusted re-set
codes (Microsoft)
Passwords are plaguing people more than
ever before
• 6-8 passwords per employee within organisation, despite
single sign-on (SSO) (Inglesant & Sasse, CHI 2010)
• Getting worse:
– Longer passwords
– Increasing number of self-service re-sets
– New layers of credentials added (e.g. challenge questions)
– Interaction on new devices
Old security + new device =
not usable, not secure
• 50%+ of password entries on touchscreens
• entry time & errors 35X higher (Schaub et al., MUM 2012)
• severely reduced password space
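The “reduced password space” point can be made concrete with a back-of-the-envelope calculation. The character-set sizes below are illustrative assumptions, not figures from the Schaub et al. study:

```python
# Back-of-the-envelope comparison of theoretical password search
# spaces. Character-set sizes are illustrative assumptions:
# 95 printable ASCII characters on a full keyboard vs. the
# lowercase-plus-digits set users tend to stick to on touchscreens,
# where symbols and capitals hide behind extra layout switches.
FULL_KEYBOARD = 95   # printable ASCII
TOUCHSCREEN = 36     # a-z plus 0-9, no layout switching

def search_space(alphabet_size: int, length: int) -> int:
    """Number of possible passwords of a given length."""
    return alphabet_size ** length

full = search_space(FULL_KEYBOARD, 8)
touch = search_space(TOUCHSCREEN, 8)
print(f"8-char full keyboard: {full:.2e}")
print(f"8-char touchscreen:   {touch:.2e}")
print(f"shrink factor:        {full / touch:.0f}x")
```

Even for a fixed 8-character length, restricting entry to the easy-to-reach touchscreen characters shrinks the attacker’s search space by more than three orders of magnitude.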
The Great Authentication Fatigue
• More authentication than before, culminating in
The Great Authentication Fatigue (Sasse et al.,
Procs HCII 2014)
• Illustrated by NIST study:
– 20+ authentications/day
– 10% failure rate (with ensuing recovery activity)
– Significant impact on individual and organisational productivity
– Not just time spent on security task: cost of disruption
Authentication ‘Wall of Disruption’
Employees’ coping strategies
1. Batching and planning of activities to limit the
number of logins
2. Storing passwords or writing them down
Impact on productivity – long-term
1. Users opt out of services, return devices
Improves their productivity, but often reduces
organizational productivity (example: email)
Organization has less control over alternatives
2. Stifling innovation: new opportunities that would
require changes in security
3. Staff leaving organization to be more
productive/creative elsewhere
Impact on security
1. User errors – even when they are trying to comply
2. Coping strategies create vulnerabilities
3. Non-compliance/workarounds to get tasks done
4. ‘Noise' created by habitual non-compliance
makes malicious behavior harder to detect
5. Lack of appreciation of/respect for security
creates a dysfunctional security culture
• When breaking rules becomes the norm:
– people forget why the rules exist
– a poor signal-to-noise ratio makes hostile activity harder to detect
High cost – and patchy security
Password leak 1…
… and Password Leak 2!
Comp8 zombie – and why it could get worse
• Comp8 password standard – usable for 1-2
passwords with frequent use, but dictionary +
history checks & expiry create impossible demands
• What is the security risk? Can be managed better
without burdening users
• Why it could get worse: on basis of mTurk study,
Shay et al. (CHI 2014) argue that 12-15 char
passwords are “usable”
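As a rough illustration, a comp8-style composition policy (“comprehensive 8”: at least 8 characters covering all four character classes, plus a dictionary check) can be sketched as below. The tiny dictionary is a placeholder; real deployments check large word lists and layer history checks and expiry on top:

```python
import string

# Hypothetical sketch of a "comp8"-style composition check:
# >= 8 characters, all four character classes represented, and
# not a (lower-cased) dictionary word. TINY_DICTIONARY stands in
# for the much larger word lists real checkers use.
TINY_DICTIONARY = {"password", "letmein", "qwerty123"}

def comp8_ok(pw: str) -> bool:
    classes = (string.ascii_lowercase, string.ascii_uppercase,
               string.digits, string.punctuation)
    return (len(pw) >= 8
            and all(any(c in cls for c in pw) for cls in classes)
            and pw.lower() not in TINY_DICTIONARY)

print(comp8_ok("Password1"))    # lacks a symbol
print(comp8_ok("Tr0ub4d&r!"))
```

The check itself is trivial to implement; the usability cost lies in users having to satisfy it repeatedly, under expiry, for many accounts at once.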
Security: “users should make the effort”
“An hour from each of 180 million online users (in
the US) is worth approximately $2.5 billion. A major
error in security thinking has been to treat as free a
resource that is actually extremely valuable. ”
C Herley, More Is Not The Answer
IEEE S&P Jan/Feb 2014
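Herley’s figure is easy to sanity-check: $2.5 billion spread over 180 million user-hours implies an hourly value close to an average US wage. The rate below is simply the implied quotient, not a number taken from the paper:

```python
# Implied hourly value behind Herley's estimate: $2.5B for one hour
# from each of 180M US online users.
total_value = 2.5e9   # dollars
users = 180e6         # one hour contributed by each user
print(f"implied value per user-hour: ${total_value / users:.2f}")
```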
“Technology should be smarter than this!”
• Move from explicit to implicit authentication:
1. Proximity sensors to detect user presence
2. Behavioural biometrics: zero-effort, one-step,
two-factor authentication
3. Exploit modality of interaction: use video-based
authentication in video, audio in audio, etc.
4. Web fingerprinting can identify users – why not
use it for good?
Digital Natives are
getting restless
… and elephants are getting
Research green shoots: PICO
• Cambridge University research project
headed up by Frank Stajano
• Aim: “To liberate computer users from the
inconvenience and insecurity of passwords.”
• Design directive: “You won't have to
remember any secrets to authenticate.”
• Method: moving from something you know
(passwords) to something you have
(wearable cryptographic technology)
• See https://www.cl.cam.ac.uk/~fms27/pico/
Exhibit 2: Access Control
• Access control settings – RBAC, Sharepoint etc.
• Widespread circumvention via emailing, password
sharing, Dropbox (Bartsch & Sasse, ECIS 2013)
• Over-entitlements: access reviews by managers –
a battleground in many organisations
• Green shoots: user self-reviews
Exhibit 3: Encryption
• Special Agent Johnny still can’t encrypt (Clark et
al., USENIX 2011)
• PKI-based solutions could fix many problems (e.g.
phishing) but are too difficult for users and
developers, and too expensive
• Green shoots: Simply Secure foundation aiming to
create usable encryption tools and blueprint for
development process
Exhibit 4: CAPTCHAs – making humans
prove they are not bots
Not a particularly effective security mechanism
Not usable: failure rate
around 40% - so
customers go elsewhere
“CAPTCHAs waste 17
years of human effort
every day”
(Pogue, Scientific
American March 2012)
Our non-compliance studies
Financial institution (8 interviews)
Technology company (9 interviews)
US government agency (24 interviews)
Utility company (118 interviews + 1,200 survey responses)
Telco (98 interviews + 600 survey responses)
UK defence (6 interviews with auditors)
• Mechanisms: authentication, access control, USB,
encryption, tokens/badges
Stop obsessing about the UI – focus on design
• Design failures are deeper than the UI – misalignment of goals and risk perceptions
• Must account for the cost of security – accept
there is a limited budget, and work with it
• Need to focus on, and fit with, user goals and tasks
• Fit least-disruptive mechanism – automate if possible
Organisational cost of compliance
The Compliance Budget (Beautement et al. NSPW 2008)
[Figure: compliance falls once the perceived individual cost of compliance exceeds the perceived benefit]
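One way to read the Compliance Budget idea is as a running account: each security task draws down a finite pool of discretionary effort, and once that pool is exhausted, users stop complying. A minimal sketch of that intuition, with all numbers invented for illustration:

```python
# Toy model of the Compliance Budget (Beautement et al.): a user
# complies with security tasks until their cumulative perceived
# cost exhausts a finite effort budget; after that, they work
# around the mechanism. All numbers are invented for illustration.
def compliance(task_costs, budget):
    spent, decisions = 0.0, []
    for cost in task_costs:
        spent += cost
        decisions.append("comply" if spent <= budget else "work around")
    return decisions

day = [1.0, 2.0, 1.5, 3.0, 2.0]   # perceived cost of each security task
print(compliance(day, budget=5.0))
```

The model makes the design implication plain: every extra login or rule spends down the same shared budget, so adding mechanisms eventually converts compliant behaviour into workarounds.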
Security: wake up and smell the coffee
“… security must make its way in an extremely
competitive environment. Not only are there no unclaimed pools of user effort to be had, it is difficult
to preserve existing pools from incursions. It is
hard to reserve time, effort, screen real-estate or
techniques for security when each of them is a
valuable and monetizable resource.“
C. Herley: More Is Not the Answer
IEEE Security & Privacy Jan/Feb 2014
• Are we nearly there yet? No – if anything, things
have become worse
• Need to minimise workload and friction of security
in use, and model/predict it during design
• Radical thought: give security a budget (say, 3%),
and ask them to use it wisely
Work in progress
• Transforming existing deployments
– Workload and friction audit, database
– Use ‘Shadow Security’ practices as starting point for redesign (Kirlappos et al., 2014)
– Transform security habits (Pfleeger et al. 2014)
• During development
– Use cases with personas and workload gauges
(Sentire, Porter et al. RE 2014)
– Assess enrolment tasks with NASA TLX with small user samples
– Develop personas with specific demand thresholds
Final appeal: End obstacle security
and engage and work with users, instead of
patronising them
