The Indications and Warnings subfield of intelligence has traditionally divided warnings into a dichotomy of “ambiguous” and “unambiguous” that gives policymakers a false sense of security. In this episode, Regan Copple examines why unambiguous warning has become an inadequate planning tool that can lead to dire consequences in the quest for certainty.
E-mail usarmy.carlisle.awc.mbx.parameters@army.mil to give feedback on this podcast or the genesis article.
Podcast record date: November 20, 2024
Keywords: intelligence, military planning, warning, decision making, strategic planning
Episode Transcript
Stephanie Crider (Host)
You are listening to Decisive Point. The views and opinions expressed in this podcast are those of the guests and are not necessarily those of the Department of the Army, the US Army War College, or any other agency of the US government.
I am talking with Regan Copple today, author of “The Fallacy of Unambiguous Warning,” which you can find in the Autumn 2024 issue of Parameters. Copple is a research associate at the Institute for Defense Analyses in Alexandria, Virginia, where her work focuses on strategy development and war gaming. She is also a doctoral student at George Mason University.
Welcome to Decisive Point, Regan.
Regan Copple
Thank you. Thanks for having me.
Host
What are the working definitions of ambiguous warnings and unambiguous warnings in the context of your article?
Copple
For ambiguous warning, the easy way to break it down is this: you know something’s coming, but you’re not quite sure what or when. In practice, this would look like seeing some sort of mass mobilization but not knowing where those forces were headed or where the first attack might occur. With unambiguous warning, you know something is going to happen, what is going to happen, when it is going to happen, and how it is going to happen. It sounds super simple, but in reality, pinning down the exact when and how is a really high bar. A good example of unambiguous warning is the ideal conceptualization of tactical warning: notice about two to three days, or up to a week, before an attack.
Host
How has the traditional distinction between ambiguous and unambiguous warnings contributed to a false sense of security in military planning, and how might this thinking be revised?
Copple
The distinction itself isn’t what creates the false sense of security; it’s the expectation that you will receive unambiguous warning. Both very recent events, like Ukraine and the Hamas attack on Israel, and modern history more broadly, like Pearl Harbor, the Korean War, and the Yom Kippur War, show that the false sense of security comes from the idea that we will know when this is going to happen and exactly what is coming, which hasn’t been true.
In terms of how we can revise this thinking, I’d say we need to start with how we write. When we write about unambiguous warning in plans and in our day jobs, rather than treating it as a necessary condition, we should treat it as a “nice to have” rather than a “must have” for confirming our existing assumptions. Because if we’re waiting for confirmation that we’re right, that confirmation normally comes in the form of being attacked, which, needless to say, is not a preferred outcome.
Host
Your article discusses the failure of Israeli intelligence during the Yom Kippur War due to their expectation of unambiguous warnings. What lessons can modern military strategists learn from these kinds of historical intelligence failures?
Copple
The biggest reason Egypt succeeded in masking the warning signs was that it employed an especially well-thought-out deception plan. I think the biggest lesson from this conflict is that adversaries understand the victim state might be watching, so they have an incentive to obscure and misrepresent what they’re actually doing. That means planners and strategists have to build responses to deception into their contingency plans and think about what happens if the opponent tries to execute some sort of deception. What might it look like? How might we counter it? The bottom line is, don’t expect the enemy to make it easy for you, because they have a vested interest in not doing so.
Host
How can military planners better utilize ambiguous warnings in their intelligence collection and analysis process to avoid surprises like Pearl Harbor or the Yom Kippur War?
Copple
The biggest takeaway from Pearl Harbor wasn’t that we lacked most of the information we needed to make a decision. We had it. The bigger issue was that the right people didn’t have the right information at the right time, in no small part due to security classification issues. Now, in 2024, a lot of this has been fixed by technology; we no longer need to burn letters flown halfway across the world minutes after they’re read. But the underlying message for today is that information sharing is hard. We shouldn’t assume that everyone gets every piece of information they need the second they need it, and we should plan around that. Institutional, bureaucratic stovepipes can get in the way of information sharing, and those stovepipes are difficult to break down over time.
Host
The article suggests that the Intelligence Community’s process is not designed to predict specific events but to assess probabilities. How can decision makers ensure they act on high-probability intelligence without over-relying on the elusive certainty of unambiguous warnings?
Copple
Much as a lot of the solution rests with changing the way planners and strategists think about warning, this also requires educating decision makers inside, but mostly outside, the Department of Defense on what we mean when we say “warning.” The DOD has its own very specific language, and things we say in our day-to-day jobs may mean something very different to a person with little or no previous defense experience.
We also need to talk about what a given probability means. What does a low-confidence assessment mean? What does a high-confidence assessment mean? What are some of the implications? And we need to make clear that just because we don’t have unambiguous warning doesn’t mean something isn’t going to happen. It just means we don’t have a crystal-clear picture of what we think is going to happen next.
The beauty and the curse of this problem is that the solution is driven by a mindset change. It’s a beauty in that it doesn’t cost any money or people to make the change, which is nice. But at the same time, it’s a curse in that entrenched beliefs within the DOD and the national security establishment are incredibly hard to dislodge once they’ve been established.
Host
We have a few extra minutes if you’re willing to entertain another question or two.
I’d love to know what inspired you to write this article.
Copple
A few years ago, I was sitting in some planning discussions out in INDOPACOM [Indo-Pacific Command], and some of the planners at the table looked around and said, “Hey, maybe we should define unambiguous warning in the document so that everybody in the future understands what we meant when we wrote this.” Everybody at the time thought that was a great idea. So, everyone started to share what unambiguous warning looked like to them. What started off as a very civil, casual discussion quickly turned into a very acrimonious argument, and by the end, everyone was further apart in understanding what unambiguous warning meant, not closer together. That prompted me to think: if there are so many different views on what unambiguous warning is, is it really unambiguous? That’s what really sparked my research, looking at case studies and realizing there’s a trend here. What I experienced wasn’t just a one-off conversation.
Host
Once you started researching it, did you find any surprises or unexpected information?
Copple
I think the underlying thread I found most often was that a lot of the most successful surprise attacks have a very big deception component. It’s not just that the victim state misses something altogether; that’s normally not what happens. There’s normally some sort of active deception and obfuscation on the part of the attacking state.
Host
Do you have any concluding thoughts you’d like to share?
Copple
Bottom line: warning is hard, and I think we forget that sometimes. We tend to think that as long as we check off every box on a list of things we observe, we have a certain level of warning and know what’s going to happen based on that list. We don’t. We not only have to think about what we’re seeing but also why we’re seeing it. What other explanations or motivations could be driving the adversary to make those decisions or maneuvers? Are we falling into some sort of mental trap? Are they doing what we expect them to do? And if not, why not? Once we think through all of this, we get a little bit closer to understanding the true picture of what’s going on.
Host
Thank you for making time to speak with me today. I really enjoyed it.
Copple
Thank you. Pleasure to be here.
Host
Listeners, you can read the article at press.armywarcollege.edu/parameters. Look for volume 54, issue 3.
For more Army War College podcasts, check out Conversations on Strategy, SSI Live, CLSC Dialogues, and A Better Peace.