This paper proposes a risk-management framework, Behavioral Economics of Cyberspace Operations (BECO), for hardening Cyberspace Operations (CO) with Behavioral Economics (BE) models of cognitive biases in judgment and decision-making. In applying BE to CO, BECO augments the common assumption of a rational cyber warrior with more realistic expressions of human behavior in cyberspace. While current development of the cyber workforce emphasizes education and training, BECO addresses the typical conditions under which rational decision-making fails and knowledge is neglected. The BECO framework encompasses the full set of cyber actors, including attackers, defenders, and users on both the friendly and adversary sides, across the full CO spectrum in space and time, and offers a structured approach to cognitive bias mitigation.
Bringing BE into CO
This paper proposes enhancements to Cyberspace Operations (CO) by adapting Behavioral Economics (BE) models in a novel framework, the Behavioral Economics of Cyberspace Operations (BECO). The essence of BECO is the identification of cognitive biases of CO actors, the mitigation of biases on the friendly side, and the exploitation of biases on the adversary side. BECO is a CO-focused extension of the Behavioral Economics of Cybersecurity (BEC) framework (Fineberg, 2014), which augments the National Institute of Standards and Technology’s Risk Management Framework (RMF) for information security (NIST SP 800-39, 2011) by introducing a new class of vulnerabilities corresponding to persistent human biases. BECO takes BEC further by applying its risk-management approach to cyber operations and CO-specific cyberactors. Figure 1 depicts the progression from BE to BEC and BECO and the concepts that link them.
While the current cognitive analysis of warfighting is rooted in psychology (Grossman & Christensen, 2007), awareness of BE discoveries is rising in the military community (Mackay & Tatham, 2011; Holton, 2011). However, the existing work limits BE’s relevance to general analogies between BE findings and military scenarios, without offering a practical approach for applying BE in operations. In contrast, BECO provides an overarching framework of behavioral models encompassing the full spectrum of operational scenarios and cyberactors. The goals of this work are to raise awareness of the persistent human biases of CO actors that cannot be eliminated by traditional training, to provide a framework for identifying and mitigating critical biases, and to influence policies guiding cyberspace security and operations.
Cyberspace Operations and BECO
The CO concept is evolving, and this paper uses the current tenets of the United States Cyber Command (USCYBERCOM) as the basis for analyzing the CO characteristics addressed in BECO. CO are conducted in cyberspace, which the Department of Defense (DoD) has designated as a warfighting domain (Stavridis & Parker, 2012, p. 62) and a part of the Information Environment (IE), which exists in three dimensions: Physical, Informational, and Cognitive. CO is a component of the Information Operations (IO) conducted in the IE, as shown in Figure 2.
The joint doctrine defines the Information Environment (IE) as “the aggregate of individuals, organizations, and systems that collect, process, disseminate, or act on information” (JP 3-13, 2012, p. vii); Information Operations (IO) as “the integrated employment, during military operations, of [Information Related Capabilities] IRCs in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision making of adversaries and potential adversaries while protecting our own” (p. vii); cyberspace as “a global domain within the information environment consisting of the interdependent network of information technology infrastructures and resident data, including the Internet, telecommunications networks, computer systems, and embedded processors and controllers” (p. II-9); and Cyberspace Operations (CO) as “the employment of cyberspace capabilities where the primary purpose is to achieve objectives in or through cyberspace” (p. II-9). The migration of the IE towards the Joint Information Environment (JIE) will facilitate cyberspace defense, and BECO will enhance the JIE’s cognitive dimension.
The USCYBERCOM mission is to conduct full-spectrum CO in three focus areas: defense of the DoD Information Networks (DoDIN), support of combatant commanders, and response to cyber attacks (U.S. Cyber Command, 2013). Correspondingly, USCYBERCOM operates across three Lines of Operation (LOO): DoD Network Operations (DNO), Defensive Cyber Operations (DCO), and Offensive Cyber Operations (OCO) (Pellerin, 2013a). DNO provides a static defense of the DoDIN perimeter. DCO includes maneuvers within the perimeter to stop attacks that have penetrated the static DNO defenses, actions outside the perimeter to stop impending attacks, and the employment of Red Teams. OCO is “the ability to deliver a variety of effects outside our own network to satisfy national security requirements” (Pellerin, 2013a). Figure 3 below provides a graphical representation of these operations.
BECO uses the full-spectrum nature of USCYBERCOM to define a comprehensive set of cognitive CO scenarios, as discussed below.
This section provides background on BE with an emphasis on its relevance to BECO.
Behavioral Economics (BE) is a recent science that emerged at the confluence of psychology and economics to correct Standard Economics (SE) models for the cognitive biases demonstrated in psychological experiments. SE relies on the rational-agent model of preference-maximizing human behavior. In contrast, BE is based on statistically significant evidence of systematic deviations of economic actors’ behavior from the rationality assumed in SE. Economists use the terms ‘rationality’ and ‘bias’ in a specific sense. Kahneman, the 2002 winner of the Nobel Memorial Prize in Economic Sciences, explains that rationality is logical coherence, which may or may not be reasonable (2011). The rational-agent model assumes that people use information optimally and that the cost of thinking is constant. However, empirical evidence shows that even high-stakes strategic decisions are biased (Kahneman, 2013). A bias is a systematic error: an average error that differs from zero (Kahneman, 2006). BE studies biases that represent psychological mechanisms skewing people’s decisions in specific directions, beyond considerations of rationality and prudence.
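Kahneman’s definition of bias as an average error different from zero can be illustrated numerically. The sketch below is not from the paper; the true value and the estimates are hypothetical, chosen only to show that random errors scatter around zero while a bias shifts the average error away from it:

```python
# Hypothetical illustration: bias as a nonzero average error.
# Each error = estimate - true value; a bias exists when the
# average error differs systematically from zero.

true_value = 100.0

# Simulated estimates that skew high (e.g., overconfident forecasts)
estimates = [104.0, 109.0, 98.0, 107.0, 102.0]

errors = [e - true_value for e in estimates]
bias = sum(errors) / len(errors)  # average error across estimates

print(f"average error (bias): {bias:+.1f}")  # +4.0: systematic overestimation
```

An unbiased estimator can still be wrong on any single estimate; the point of the definition is that, averaged over many judgments, its errors cancel out, whereas a biased one drifts in a fixed direction.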
Psychology: Fast and Slow Thinking
The differences between biased and rational decision making can be traced to the distinction between two types of thinking that Kahneman (2011) calls System 1 (S1) and System 2 (S2), respectively. S1 refers to fast, automatic, intuitive thinking; S2 refers to slow, deliberate, effortful thinking. S1 thinking includes the automatic activities of memory and perception, and intuitive thoughts of two types: the expert and the heuristic. Expert thought is fast due to prolonged practice; heuristic thought is exemplified by one’s ability to complete the phrase ‘bread and …’ and to answer 2 + 2 = ? In contrast with S1, S2 performs effortful mental activities that require concentration. Examples of S2 activities include parking a car in a narrow space, filling out tax forms, and performing complex computations. Figure 4 summarizes the key features of S1 and S2, with emphasis on the S1-based heuristics that are the main cause of cognitive biases in judgment and decision making.
Interactions between S1 and S2 are complex and generally favor decisions made by S1, even though S2 has some limited capacity to program the normally automatic functions of attention and memory. S1 produces biases: systematic errors it makes in specific circumstances, such as answering an easier question than the one asked and misunderstanding logic and statistics.
S2 is used to focus on a task, but intense focus blinds people to other stimuli and cannot be sustained for prolonged periods. Most thinking originates in S1, but S2 takes over when decisions are difficult, and it has the last word. While it may be desirable to switch from S1 to S2 in order to avoid making biased choices, Kahneman notes that “because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, errors can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical” (2011, p. 28). Furthermore, “effort is required to maintain simultaneously in memory several ideas that require separate action” (p. 36) and “switching from one task to another is effortful, especially under time pressure” (p. 37).
The fast and slow thinking patterns of S1 and S2 apply to all areas of decision making, including economics (BE), cybersecurity (BEC), and cyber operations (BECO). When cyberactors focus on absorbing tasks, they become oblivious to other important signals and commit biases that override their experience and training.