
Security as an Issue for Medical-Device Software
Rance Cleaveland, PhD
Professor of Computer Science, University of Maryland and Executive & Scientific Director, Fraunhofer USA Center for Experimental Software Engineering (CESE)

© 2012 University of Maryland and Fraunhofer USA

Fraunhofer CESE
• Applied-research institute in software engineering, founded 1998
• Located in U. Maryland research park
• Staff: 30 (~22 FTEs): 16 technical (10 PhDs), 12 students / visitors
• Annual budget: US $4.5m
• Affiliated with U. Maryland, Fraunhofer USA, Fraunhofer Gesellschaft (€2bn nonprofit research organization in all areas of applied science and engineering)


CESE Overview
• Mission
Better software-development technologies, practices and processes

• Technical expertise
Software design, verification and validation, project management

• Target sectors
Aerospace / defense, automotive, medical

• Biggest customer


This Meeting vs. This Talk
• Is this talk about …
– Standards? No
– Tools? No

• Why not? Because they don’t exist for medical-device software security!
• Nevertheless
– Security is a growing industrial concern for medical devices
– Something has to be done
– So: this talk presents a method for security analysis of medical devices

Some Software Companies


Framing the Problem
Primary drivers for medical-device certification:
– Safety (“Is it possible for the device to harm the patient?”)
– Efficacy (“Will the device provide clinical benefit to the patient?”)


A Fundamental Assumption
Hazards occur accidentally
– A power source blows out due to a random surge
– A wireless receiver accidentally picks up communication meant for someone else


What about “Malign Intent”?
• This assumption no longer holds!
– An attacker who wants to deprive a patient of insulin could both run down the battery and deactivate the alarm on an insulin pump
– Triggering conditions for hazards considered “independent” now have a common cause
– Hazard-mitigation measures put into place under this assumption of “independence” are no longer sufficient

Malign Intent


Why Attack Medical Devices?
• Bloody-mindedness: “because you can”
• Desire to inflict harm on specific individuals
• Data harvesting

• Attacks difficult to detect and trace back to the perpetrator
– A successful attack may be considered an accidental device malfunction

More…
• Researchers have shown how an Implantable Cardioverter Defibrillator (ICD) can be wirelessly commanded to induce fatal heart rhythms (ventricular fibrillation)
[Halperin et al., “Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses.” IEEE Symposium on Security and Privacy, 2008]


Generic, non-device-specific attacks are also possible


Wait There Is More…
• So far we have looked at attacks that seek to impose fraudulent control of a device

• And then, there is medical identity theft


Electronic Data On Medical Devices
• Treatment regimes
• Medication doses

• Values of vital parameters
• Personally identifiable information
– Even when personally identifiable information is not stored on the device, an attacker may obtain it by other means



A single health record fetches $50 on the black market [Digital Health Conference, New York City]


How It’s Going To Get Worse
• Interoperability!
– Different devices plugged into centralized communication infrastructure exchanging data and control
[Diagram: devices connected to a central interoperability infrastructure]


Why Things Are Going to Get Worse
• Connecting to an interoperability infrastructure provides new attack surfaces
– These things were designed to be standalone devices, not nodes on a network

• The interoperability infrastructure will itself become a target for attack

This All Came to a Head
• A large medical-device company approached Fraunhofer
– They wanted security audits for different portable infusion pumps
– They had $$$

• What to do?
– We “knew” security
– We “knew” embedded software
– We didn’t know security for embedded software … but we are not alone!
• No standards
• No tools
• No methods


So What To Do?
• We devised a “security by design” approach
– Security vulnerability: use of an interface to trigger a hazard

– Security analysis can build on hazard analysis, system design (for the interfaces)
– No such thing as a “secure device”

• We applied it to three separate devices in late 2011 / early 2012


A Security By Design Methodology
Identify hazards → Identify attack surfaces → Enumerate attack scenarios → Assess ease / risk of attack scenarios → Devise countermeasures → Identify design decisions affected
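
Read as a pipeline, the flow above amounts to a small data model plus one activity per step. The sketch below is an illustrative rendering in Python, not the tooling used in the audits; all names (Hazard, Interface, AttackScenario, enumerate_scenarios) are invented for illustration.

```python
# Illustrative data model for the security-by-design flow. All names here
# are invented, not taken from the talk or from any standard.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Hazard:
    name: str              # e.g. "drug overdelivery"

@dataclass(frozen=True)
class Interface:
    name: str              # e.g. "wireless", "infrared", "keypad"

@dataclass
class AttackScenario:
    interface: Interface   # attack surface used
    hazard: Hazard         # hazard the attacker tries to trigger
    ease: int = 0          # coarse guesses, filled in during ranking
    reward: int = 0

def enumerate_scenarios(interfaces, hazards):
    """Enumerate-attack-scenarios step: every (interface, hazard) pair is a candidate."""
    return [AttackScenario(i, h) for i, h in product(interfaces, hazards)]

if __name__ == "__main__":
    hazards = [Hazard("drug overdelivery"), Hazard("patient data compromise")]
    surfaces = [Interface("wireless"), Interface("keypad")]
    for s in enumerate_scenarios(surfaces, hazards):
        print(f"use {s.interface.name} to cause {s.hazard.name}")
```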


Example – Wearable Infusion Pump
• Interfaces
– Wireless (for remote control)
– Infrared (for flashing software, upgrades)
– Device keypad

• Hazards
– Drug overdelivery
– Drug underdelivery
– Patient data compromise
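
For reference in the sketches that follow, the pump’s interfaces and hazards from this slide can be written down directly as data. Plain Python; the dictionary/list structure is invented for illustration.

```python
# The wearable-infusion-pump example from this slide, recorded as plain data.
PUMP_INTERFACES = {
    "wireless": "remote control",
    "infrared": "flashing software / upgrades",
    "keypad":   "local operation on the device",
}

PUMP_HAZARDS = [
    "drug overdelivery",
    "drug underdelivery",
    "patient data compromise",
]

if __name__ == "__main__":
    for iface, purpose in PUMP_INTERFACES.items():
        print(f"interface: {iface} ({purpose})")
    print("hazards:", ", ".join(PUMP_HAZARDS))
```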

Attack Scenarios
• Principle: use interfaces to trigger hazards
• How can an attacker access the interfaces?
– Specialized equipment (RF / IR transceivers)
– Internet

• “Internet”?
– The pump ships with PC software for storing data and changing pump settings

– If the PC is connected to the Internet …!
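
Following the stated principle, candidate attack scenarios can be generated mechanically as the cross product of ways of reaching an interface and hazards to trigger. A minimal illustrative sketch; the access-vector table is an assumption based on this slide, not an exhaustive list.

```python
from itertools import product

# How an attacker can reach each interface (assumption based on this slide).
ACCESS_VECTORS = {
    "wireless":    ["RF transceiver"],
    "infrared":    ["IR transceiver"],
    "pc software": ["Internet (via a compromised PC)"],
}
HAZARDS = ["drug overdelivery", "drug underdelivery", "patient data compromise"]

# Every (interface, access vector, hazard) triple is a candidate scenario.
scenarios = [
    (iface, vector, hazard)
    for iface, vectors in ACCESS_VECTORS.items()
    for vector, hazard in product(vectors, HAZARDS)
]

if __name__ == "__main__":
    for iface, vector, hazard in scenarios:
        print(f"via {vector}: use {iface} to cause {hazard}")
```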

Kinds of Attacks
• Command injection / replay (capture commands sent to the pump, replay them later to modify the treatment regime)
• Denial-of-service (bombard the pump with bogus commands to run down the battery)
• Pump configuration changes (change maximum / minimum dosages, alter clock, etc.)
• Data compromise (steal treatment data from pump / PC)
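
The injection/replay category works because a pump that checks only that a command is well-formed cannot tell a captured-and-resent command from a fresh one. A deliberately naive handler makes the point; the message format here is hypothetical.

```python
# A deliberately naive command handler: it checks only that the message is
# well-formed, so a captured command can be replayed verbatim and accepted again.
# The JSON message format is hypothetical.
import json

def handle_command(raw: bytes) -> str:
    msg = json.loads(raw)                                  # no freshness or origin check
    if msg.get("cmd") == "bolus" and 0 < msg.get("units", 0) <= 25:
        return f"delivering {msg['units']} units"
    return "rejected: malformed command"

if __name__ == "__main__":
    captured = json.dumps({"cmd": "bolus", "units": 5}).encode()
    print(handle_command(captured))                        # legitimate delivery
    print(handle_command(captured))                        # replayed copy is also accepted
```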

Attack Ranking
• Protections incur costs
• Which should be invested in?
– An economic question
– Data to base decisions on are scarce

– Our advice (see the sketch below):
• Make coarse guesses about ease of attack and level of reward
• Use these guesses to guide decision-making
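
With only coarse guesses available, a simple ordinal scheme is enough to order scenarios: score each on ease of attack and level of reward and rank by their product. The scenario names and scores below are invented for illustration, not from the actual audits.

```python
# Coarse attack ranking: ordinal guesses for ease and reward, ranked by product.
SCENARIOS = {
    "replay captured bolus command":     {"ease": 3, "reward": 3},
    "battery-drain denial of service":   {"ease": 2, "reward": 1},
    "steal treatment data from PC":      {"ease": 3, "reward": 2},
    "alter pump clock via IR interface": {"ease": 1, "reward": 2},
}

def rank(scenarios):
    """Order scenarios by ease * reward, highest first."""
    return sorted(scenarios.items(),
                  key=lambda kv: kv[1]["ease"] * kv[1]["reward"],
                  reverse=True)

if __name__ == "__main__":
    for name, s in rank(SCENARIOS):
        print(f"{s['ease'] * s['reward']:2d}  {name}")
```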

Typical Counter-Measures
• Time-stamping data
• Password-protection

• Data encryption
• Logging (“forensic dissuasion”)
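
Several of these countermeasures can be combined on the command path: reject commands whose timestamp is stale (blunts simple replay), authenticate them with a shared-key MAC (blunts injection), and log every decision for later forensics. A minimal sketch, assuming a pre-shared key and roughly synchronized clocks; a real device would also need nonces and key management.

```python
# Sketch: timestamped, HMAC-authenticated, logged command handling.
import hmac, hashlib, json, time, logging

logging.basicConfig(level=logging.INFO)
KEY = b"pre-shared-device-key"     # assumed provisioned at manufacture / pairing
MAX_AGE_S = 30                     # commands older than this are rejected

def sign(payload: dict) -> bytes:
    """Sender side: attach an HMAC tag over the timestamped payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(KEY, body, hashlib.sha256).hexdigest()
    return json.dumps({"body": payload, "tag": tag}).encode()

def verify_and_handle(raw: bytes) -> bool:
    """Pump side: check authenticity and freshness, and log the decision."""
    msg = json.loads(raw)
    body = json.dumps(msg["body"], sort_keys=True).encode()
    expected = hmac.new(KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        logging.warning("rejected: bad MAC (possible injection)")
        return False
    if time.time() - msg["body"]["ts"] > MAX_AGE_S:
        logging.warning("rejected: stale timestamp (possible replay)")
        return False
    logging.info("accepted: %s", msg["body"]["cmd"])
    return True

if __name__ == "__main__":
    verify_and_handle(sign({"cmd": "bolus 2 units", "ts": time.time()}))       # accepted
    verify_and_handle(sign({"cmd": "bolus 2 units", "ts": time.time() - 60}))  # rejected: stale
```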


What About…
• “Security by obscurity”?
– “We use a proprietary protocol”
– Issues
• Single line of defense
• Proprietary does not mean secret (cf. patent disclosures)

• Range of communication?
– Better
– But beware of “range-extenders” (cf. the Internet)!

Design Decisions
• Security, like performance and usability, is a design dimension
• Strengthening one dimension may weaken another
Requiring a user to input a password every time he/she wants a bolus may strengthen security but will introduce a human-factors safety risk

• Some security vulnerabilities may be left unmitigated because of considerations on other design axes
• If so, document WHY the decision was taken
We could have the user input a password every time he/she wants a bolus, but the safety implications are too severe
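
Documenting why a mitigation was rejected can be as lightweight as a structured record kept alongside the hazard analysis. The field names below are hypothetical, not from any standard.

```python
# A lightweight record for documenting why a security mitigation was not adopted.
from dataclasses import dataclass

@dataclass
class RiskAcceptance:
    vulnerability: str          # what remains exploitable
    rejected_mitigation: str    # the countermeasure that was considered
    rationale: str              # WHY it was not adopted
    approved_by: str            # who signed off on the decision

EXAMPLE = RiskAcceptance(
    vulnerability="unauthenticated bolus command over wireless",
    rejected_mitigation="require password entry before every bolus",
    rationale="human-factors safety risk of a delayed bolus outweighs the benefit",
    approved_by="device safety board",
)

if __name__ == "__main__":
    print(EXAMPLE)
```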


Now Over To Interoperability…
• The interoperability infrastructure may become the target of attacks
– Spurious, compromised, or spoofed devices exploit vulnerabilities in infrastructure software

• The interoperability infrastructure may become the source of attacks
– Compromised interoperability infrastructure attacks devices connected to it

What We Need
• Prevent bad things from happening
– Mutual authentication scheme for infrastructure and devices (“I am who I say I am”)
– Signed behavioral guarantees (“I shall behave in this pre-defined way”) at registration time
– Real-time compliance checking (“I do what I said I would do”)
– Mandatory encryption (“I don’t reveal data to sniffers”)
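
One way to picture these pieces together: at registration the device presents a signed manifest of the behavior it promises, the infrastructure verifies the signature, and at run time each message is checked against that manifest. The sketch below uses a shared-key HMAC as a stand-in for a real certificate scheme; all names and the manifest format are assumptions.

```python
# Sketch: registration with a signed behavioral manifest plus run-time
# compliance checking. Shared-key HMAC stands in for real certificates.
import hmac, hashlib, json

REGISTRY_KEY = b"infrastructure-provisioning-key"   # assumed shared secret

def sign_manifest(manifest: dict) -> dict:
    """Device side: sign the behavioral guarantee presented at registration."""
    body = json.dumps(manifest, sort_keys=True).encode()
    return {"manifest": manifest,
            "sig": hmac.new(REGISTRY_KEY, body, hashlib.sha256).hexdigest()}

def register(signed: dict):
    """Infrastructure side: accept only manifests whose signature verifies."""
    body = json.dumps(signed["manifest"], sort_keys=True).encode()
    expected = hmac.new(REGISTRY_KEY, body, hashlib.sha256).hexdigest()
    return signed["manifest"] if hmac.compare_digest(expected, signed["sig"]) else None

def compliant(manifest: dict, message: dict) -> bool:
    """Real-time compliance check: 'I do what I said I would do'."""
    return message["type"] in manifest["allowed_message_types"]

if __name__ == "__main__":
    signed = sign_manifest({"device": "infusion-pump-17",
                            "allowed_message_types": ["status", "dose-log"]})
    manifest = register(signed)
    print(compliant(manifest, {"type": "status"}))     # True: within the guarantee
    print(compliant(manifest, {"type": "set-dose"}))   # False: grounds for quarantine
```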

What We Need
• Be pro-active when bad things happen
– Detect “bad” behavior of device
– Quarantine it from network

– Maintain secure logs of activities so that attacks can be forensically analyzed
• Distributed logging and log analysis are a challenge
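
A tamper-evident activity log can be built by chaining each entry to the hash of the previous one, so later alteration breaks the chain; combined with a quarantine list this gives a minimal react-and-analyze loop. Illustrative sketch only; it does not address the distributed-logging challenge noted above.

```python
# Sketch: hash-chained (tamper-evident) activity log plus a quarantine list.
import hashlib, json, time

class ChainedLog:
    def __init__(self):
        self.entries = []
        self._prev = "0" * 64                 # genesis hash

    def append(self, event: dict):
        record = {"ts": time.time(), "event": event, "prev": self._prev}
        digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append((record, digest))
        self._prev = digest

    def verify(self) -> bool:
        """Recompute the chain; any altered or reordered entry breaks it."""
        prev = "0" * 64
        for record, digest in self.entries:
            if record["prev"] != prev:
                return False
            if hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest() != digest:
                return False
            prev = digest
        return True

quarantined = set()

def report_bad_behavior(log: ChainedLog, device_id: str, reason: str):
    quarantined.add(device_id)                # cut the device off the network
    log.append({"device": device_id, "action": "quarantined", "reason": reason})

if __name__ == "__main__":
    log = ChainedLog()
    report_bad_behavior(log, "pump-17", "sent disallowed 'set-dose' message")
    print("log intact:", log.verify(), "| quarantined:", sorted(quarantined))
```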


What about “Standards / Methods / Tools, and Efficacy”?
• Standards document perceived best practices
• Standards lag practice as a result
– Practices must first be identified – Consensus of sorts must be achieved

• Methods / tools precede, inform standards


Conclusions
• Security is a new concern in the medical-device (and other, cf. automotive) arenas
– No standards

– Not likely to be any soon

• Methods / tools needed in the meantime

• Our approach: use hazards and interfaces to drive security analysis


Thank You!
Rance Cleaveland
University of Maryland / Fraunhofer USA CESE

rcleaveland@fc-md.umd.edu
+1 240-487-2905

www.fc-md.umd.edu
