Bibliography
Traditional cyber security techniques have led to an asymmetric disadvantage for defenders. The defender must detect all possible threats at all times from all attackers and defend all systems against all possible exploitation. In contrast, an attacker needs only to find a single path to the defender's critical information. In this article, we discuss how this asymmetry can be rebalanced using cyber deception to change the attacker's perception of the network environment and lead attackers to false beliefs about which systems contain critical information or are critical to a defender's computing infrastructure. We introduce game theory concepts and models to represent and reason over the defender's use of cyber deception and the effect it has on attacker perception. Finally, we discuss techniques for combining artificial intelligence algorithms with game theory models to estimate hidden states of the attacker from payoff feedback and thereby learn how best to defend the system using cyber deception. It is our opinion that adaptive cyber deception is a necessary component of future information systems and networks. The techniques we present can simultaneously decrease the risks and impacts suffered by defenders and dramatically increase attackers' costs and risks of detection. Such techniques are likely to play a pivotal role in defending national and international security concerns.
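To make the game-theoretic framing above concrete, the following minimal Python sketch (an illustration, not the model from the article) shows how a defender might use nothing but per-round payoff feedback to learn where to stage a decoy against an attacker whose target preference is a hidden state. The number of hosts, the simple biased attacker model, and the gradient-bandit update rule are all assumptions made for this example.

```python
import math
import random

HOSTS = 4            # hosts on which the defender could stage a decoy (illustrative)
ROUNDS = 2000        # simulated interaction rounds
ALPHA = 0.1          # learning rate for the defender's strategy update

def attacker_probe(hidden_bias: int) -> int:
    """Attacker with a hidden host preference probes one host per round."""
    weights = [1.0] * HOSTS
    weights[hidden_bias] += 2.0      # hidden state the defender never observes directly
    return random.choices(range(HOSTS), weights)[0]

def softmax(prefs):
    """Convert strategy parameters into a mixed (probabilistic) strategy."""
    exps = [math.exp(p) for p in prefs]
    total = sum(exps)
    return [e / total for e in exps]

def run() -> None:
    random.seed(0)
    hidden_bias = 2                  # attacker's preferred target (unknown to the defender)
    prefs = [0.0] * HOSTS            # defender's strategy parameters
    baseline = 0.0                   # running average payoff
    successes = 0.0

    for t in range(1, ROUNDS + 1):
        probs = softmax(prefs)
        decoy = random.choices(range(HOSTS), probs)[0]   # where the decoy is staged this round
        probe = attacker_probe(hidden_bias)

        payoff = 1.0 if probe == decoy else 0.0          # deception succeeded: attacker hit the decoy
        successes += payoff
        baseline += (payoff - baseline) / t

        # Gradient-bandit update: reinforce placements whose payoff beats the running baseline.
        for host in range(HOSTS):
            if host == decoy:
                prefs[host] += ALPHA * (payoff - baseline) * (1 - probs[host])
            else:
                prefs[host] -= ALPHA * (payoff - baseline) * probs[host]

    print(f"deception success rate: {successes / ROUNDS:.2f}")
    print("learned decoy placement:", [round(p, 2) for p in softmax(prefs)])

if __name__ == "__main__":
    run()
```

In this toy setup, the learned placement distribution tends to concentrate on the attacker's preferred host, mirroring the idea of inferring a hidden attacker state purely through payoff feedback rather than direct observation.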
We report on whether cyber attacker behaviors contain decision-making biases. Data from a prior experiment were analyzed in an exploratory fashion, making use of think-aloud responses from a small group of red teamers. The analysis provided new observational evidence of traditional decision-making biases in red team behaviors (confirmation bias, anchoring, and use of the take-the-best heuristic). These biases may disrupt red team decisions and goals while simultaneously increasing their risk of detection. Interestingly, at least part of the bias induction may be related to the use of cyber deception. Future directions include developing behavioral measurement techniques for these and additional cognitive biases in cyber operators, examining the role of attacker traits, and identifying the conditions under which biases can be induced successfully in experimental settings.
Defensive deception shows promise in rebalancing the asymmetry of cybersecurity. It makes an attacker's job harder because it does more than simply block access; it impacts the attacker's decision making, causing him or her to waste time and effort and to expose his or her presence in the network. Pilot studies conducted by NSA research demonstrated the plausibility of, and the need for, metrics of success, including the difficulty of attacking the system, the behavioral changes caused, the cognitive and emotional reactions aroused, and the attacker strategy changes due to deception. Designing reliable and valid measures of effectiveness is a worthy (though often overlooked) goal for industry and government alike.
The Tularosa study was designed to understand how defensive deception, both cyber and psychological, affects cyber attackers. Over 130 red teamers participated in a two-day network penetration test in which we controlled both the presence and the explicit mention of deceptive defensive techniques. To our knowledge, this represents the largest study of its kind ever conducted on a professional red team population. The study design included a battery of questionnaires (e.g., experience, personality) and cognitive tasks (e.g., fluid intelligence, working memory), allowing the characterization of a "typical" red teamer, as well as physiological measures (e.g., galvanic skin response, heart rate) to be correlated with cyber events. This paper focuses on the design, implementation, population characteristics, lessons learned, and planned analyses.
As infamous hacker Kevin Mitnick describes in his book The Art of Deception, "the human factor is truly security's weakest link". Deception has been widely successful when used by hackers for social engineering and by military strategists in kinetic warfare [26]. Deception shapes a human's beliefs, decisions, and behaviors. By the same token, deception is a powerful tool that cyber defenders should employ to protect our systems against the humans who wish to penetrate, attack, and harm them.