SSI

US Army War College

Zachary Kallenborn – “InfoSwarms: Drone Swarms and Information Warfare”

Released 23 June 2022.

This podcast discusses drone swarms, which can be used at sea, on land, in the air, and even in space, and which are fundamentally information-dependent weapons. No study to date has examined drone swarms in the context of information warfare writ large. This article explores the dependence of these swarms on information and the resultant connections with areas of information warfare—electronic, cyber, space, and psychological—drawing on open-source research and qualitative reasoning. Overall, the article offers insights into how this important emerging technology fits into the broader defense ecosystem and outlines practical approaches to strengthening related information warfare capabilities.


Keywords: information warfare, drone swarms, unmanned systems, cyberwarfare, electronic warfare

Episode Transcript

Stephanie Crider (Host)

Welcome to Decisive Point, a US Army War College Press production featuring distinguished authors and contributors who get to the heart of the matter in national security affairs.

The views and opinions expressed on this podcast are those of the podcast guest, and are not necessarily those of the Department of the Army, US Army War College, or any other agency of the US government.

(Guest 1: Zachary Kallenborn)

(Host)

Decisive Point welcomes Zachary Kallenborn, author of “InfoSwarms: Drone Swarms and Information Warfare,” which was featured in the summer 2022 issue of Parameters. Kallenborn is a policy fellow at the Schar School of Policy and Government, a research affiliate of the Unconventional Weapons and Technology program at the National Consortium for the Study of Terrorism and Responses to Terrorism (START), a senior consultant at ABS Group, and a self-proclaimed US Army “mad scientist.” He is the author of publications on autonomous weapons, drone swarms, weapons of mass destruction, and terrorism involving weapons of mass destruction.

Zach, I’m glad you’re here. Thanks for making time to chat with me today.

(Kallenborn)

Thanks for having me.

(Host)

Your article explores the dependence of drone swarms on information and the resultant connections with areas of information warfare—electronic, cyber, space, and psychological warfare—drawing on open-source research and qualitative reasoning. Put this in context for us, please.

(Kallenborn)

The context of the discussion is drone swarms, a rapidly emerging technology that numerous states are developing. You've obviously got the big players: China, Russia, and the United States are all developing this technology. But even smaller powers, like South Africa, are as well. And this technology is already being used in combat. We saw just last year that Israel used a drone swarm in combat in the fight in Gaza. Now, before jumping into these larger issues of information warfare, it's important to understand briefly what we mean by “drone swarm” here.

We're not necessarily talking about simply large numbers of drones used en masse, which is often how the term is used in the media. Really, what we're talking about is drones that have some level of communication and coordination between them, so that they're operating effectively as a singular unit instead of, say, 10 or 15 individual drones.

Now, what that means is there's potentially a range of capability within that, because if we're talking about simply coordination and communication at the basic level, that's a pretty simple thing. So in the case of the Israel example, likely all they're really doing is some coordinated searches over an area to help identify a target. It's not anything all that fancy or unusual. But we can imagine in the future how artificial intelligence and autonomy may make that coordination and communication fairly significant. You could imagine, for example, a drone swarm made up of multiple different types of drones operating in different domains with different types of payloads, where they're intelligently selecting, like, “Alright, let's use this antitank weapon against this identified tank over here, and we're going to send our antipersonnel weapons to this infantry unit over here,” and collectively adapting and engaging with the reality on the ground, working across multiple domains, all autonomously.

But the important thing, and what we’re going to get at in this article, is that regardless of that complexity, one of the key issues when it comes to drone swarms is information warfare, and, particularly, their dependence on those various subsets: electronic, cyber, space, and psychological warfare.

(Host)

Let’s start with electronic warfare. Electronic jamming: What do we need to know?

(Kallenborn)

We can start with individual drones because that’s the simple example here. One of the things that we’ve seen is that electronic jamming has been a really common form of attempting to defeat that system—namely, because drones typically rely on some sort of communication signal between the operator and the actual drone platform itself. So the idea is if you sever that communication link, then the drone may not necessarily be able to operate at all.

And that problem scales extensively when we talk about drone swarms. You necessarily have that same problem: an operator who has to send links and information to the swarm. But it becomes more complicated because the key issue of swarming is the communication and coordination between the different drones, which means you have another opportunity to jam, particularly that intra-swarm communication. Because if you break down that communication, the notion of a swarm stops being meaningful.

Now, one of the open questions, though, is how exactly that works in practice. It depends on the way the algorithms work: from having, say, a dedicated leader that helps organize things, to massively decentralized approaches, to having different communication pathways through a complex network. So how exactly jamming might work in practice may vary a lot.

And of course, if we’re talking about multidomain swarms where you start getting undersea drones interacting with surface vehicles, that becomes more complicated. But, nonetheless, the notion of jamming signals is equally appropriate there.

(Host)

What about cyber? What do we need to know?

(Kallenborn)

There is a sort of silly depiction, but I think it has some seriousness to it, of drones as sort of flying computers. And in a sense, that's what they are. They have onboard systems that manage all aspects of the drone's flight, from specific flight controllers that manage the propellers, to the various movement and navigation of that system.

And when we start talking about drone swarms, that's even more the case, because you now have algorithms and systems to manage that broader communication, that coordination, the broader behavior of the swarm. At a very basic level, you may have, for example, task allocation algorithms that say, “OK, these drones are going over here, those drones are going over there, these drones are going to do this, those drones are going to do that.” And that's a very cyber-based system, which of course means it also creates vulnerabilities in cyberspace for those systems.
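To make the task-allocation idea above concrete, here is a minimal, purely illustrative sketch (not drawn from the article; drone names, payload types, and positions are hypothetical): each drone is greedily matched to the nearest unclaimed target that its payload can engage, echoing the “antitank weapon to the tank, antipersonnel weapons to the infantry” example.

```python
import math

def allocate(drones, targets):
    """Greedy task allocation for a swarm (illustrative sketch).

    drones:  list of (name, payload, (x, y)) tuples
    targets: list of (name, kind, (x, y)) tuples
    Returns a dict pairing each drone with the nearest unclaimed
    target whose kind matches the drone's payload.
    """
    assignments = {}
    claimed = set()
    for d_name, payload, d_pos in drones:
        # Candidate targets this drone's payload can engage.
        candidates = [
            (math.dist(d_pos, t_pos), t_name)
            for t_name, kind, t_pos in targets
            if kind == payload and t_name not in claimed
        ]
        if candidates:
            _, best = min(candidates)  # nearest target wins
            assignments[d_name] = best
            claimed.add(best)
    return assignments

drones = [("d1", "antitank", (0, 0)), ("d2", "antipersonnel", (5, 5))]
targets = [("tank-A", "antitank", (1, 1)), ("infantry-B", "antipersonnel", (6, 4))]
print(allocate(drones, targets))  # {'d1': 'tank-A', 'd2': 'infantry-B'}
```

Real swarm allocators are far more elaborate (auction-based, decentralized, re-planning in flight), but even this toy version shows why the allocation logic itself is an attack surface: corrupt its inputs or its code and the swarm engages the wrong targets.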

So what happens if you have cyberattacks aimed at the systems that control the flight controllers? What happens if you attempt to manipulate those? You could get all sorts of problems, from the drones not operating correctly to crashing into one another, and you can imagine other types of cyber activities.

So, for example, if you're using artificial intelligence-based machine vision in the swarm to help it navigate the world around it, that could potentially be manipulated through cyber means by, for example, manipulating the training data that goes into creating those machine vision systems, through what are called (artificial intelligence or) AI poisoning attacks. Conversely, you can also imagine using cyber to break that link between the operator and the actual system itself, except, rather than jamming the actual signal, you are manipulating the system that is responding to and taking in those orders. And then, at an extreme level, you could imagine cyber being used to actively take control of and manipulate the swarm, which is perhaps the scariest scenario: your friendly swarm is now turned against you and destroying your own military forces, which could be quite bad.

(Host)

What about space warfare and the Global Navigation Satellite System? How do they relate to drone swarms?

(Kallenborn)

Yeah, so, there are a couple of dimensions to it. Firstly, there's obviously the communication element. At the moment, most drone swarms are very tactical, very short-range type operations. But as we start scaling, you could potentially need satellite-based communication to relay some of those signals. Likewise, we know that, at least at the moment, many of these drone swarms are heavily dependent on satellites for positioning, navigation, and timing. Basically, they need to understand: where are they located in space?

Now, that's one place where technology is developing, and there has been work on using more autonomous, image-based recognition to help the swarm actually navigate. But, at the moment, there's a close dependence on (the Global Positioning System or) GPS. How exactly that works is, again, going to be very dependent on the nature of the swarm.

An interesting example of this is undersea swarms, where GPS signals are pretty difficult to get to those units. Instead, what researchers are doing is things like creating buoys on the surface that can access GPS signals, then using acoustic signals between the drones and the buoys to figure out where the drones are located in space and time, so they can orient themselves to accomplish their missions.
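The buoy approach described above boils down to trilateration: acoustic travel times give ranges to buoys whose GPS positions are known, and the drone's position falls out of simple geometry. The sketch below is purely illustrative (the function name, buoy layout, and a flat 2-D seafloor are assumptions, not details from the article), solving the linearized range equations for three buoys with Cramer's rule.

```python
import math

def trilaterate(buoys, ranges):
    """Estimate a 2-D position from ranges to three known buoy positions
    (illustrative sketch; assumes exact, noise-free ranges).

    buoys:  three (x, y) GPS positions
    ranges: acoustic distances from the drone to each buoy
    """
    (x1, y1), (x2, y2), (x3, y3) = buoys
    r1, r2, r3 = ranges
    # Subtracting the circle equations pairwise yields two linear equations
    # a*x + b*y = c in the unknown position (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # nonzero if the buoys are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

buoys = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
true_pos = (30.0, 40.0)
ranges = [math.dist(true_pos, b) for b in buoys]
print(trilaterate(buoys, ranges))  # approximately (30.0, 40.0)
```

A fielded system would add sound-speed corrections, depth, timing noise, and a least-squares fit over many pings, but the sketch shows the dependency Kallenborn highlights: spoof the buoys' GPS fixes or the acoustic timing, and the whole position solution shifts with them.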

(Host)

You say drone swarms have the least relevance for psychological warfare. How so?

(Kallenborn)

When we think about psychological warfare at the more strategic level, we're talking about shaping society's perceptions and that type of thing. And in that case, you're not really going to have much use there. It's pretty difficult to imagine drone swarms spreading propaganda pamphlets. I suppose you could do that, but I don't really know why you would want to. I think where the relevance starts is drone swarms as an object of mis- and malinformation.

There's been broad global interest in placing bans on autonomous weapons. A big example of that, of course, is the Stop Killer Robots movement that wants complete bans on all autonomous systems. Now, personally, I think some of that is a little bit ridiculous, but, nonetheless, there is certainly public concern about autonomous weapons generally. Many NATO countries have concerns about this. So the worry is, if we have all these drone swarms being used on the battlefield, what happens if some adversary says, “Oh, these are autonomous weapons, and they carry all of the concerns that those systems do”? From escalation concerns, to risk to civilians, to accidentally targeting neutral third parties to a conflict—that type of thing. And so, in that way, drone swarms become relevant as an object about which people can create those concerns. And that's exacerbated by the fact that whether a system is autonomous is a matter of programming and how the system is set up.

So, if the United States is accused of using some concerning system, how does the United States disprove that without directly showing the code? And that's going to be something both very sensitive, because we're talking about potentially classified software systems, and also difficult to understand, because your average person on the street is not going to understand some complex coding of how the system works.

(Host)

How do you see AI and robotics shaping the future of drones?

(Kallenborn)

So technology advances hit on all of these aspects of information warfare, for both good and bad. If we talk about electronic jamming, for example, a significant development is autonomy for drones in general. That is, if a drone doesn't necessarily require human input, then it doesn't really care that much about jamming, right?

Similarly, if we talk about GPS satellite information used to help navigation: if you have advanced AI and autonomy that can read and understand the area around the drones so that they can navigate by themselves without GPS, then that dependency goes away. Conversely, we can imagine the other direction as well, where artificial intelligence and robotics may improve some of these capabilities. Say, for example, combining artificial intelligence with electronic jamming to more effectively attack and defeat swarms. Alternatively, you can imagine swarms incorporating some of those same capabilities. What if a drone carries a jammer as a payload, perhaps to attack another swarm and make it more difficult for that swarm to operate? That can then support the use of other types of payloads that more directly destroy the actual platform.

(Host)

Let’s kind of pull it all together here. What are your recommendations going forward?

(Kallenborn)

The biggest-picture recommendation is more conceptual: commanders and military leaders need to think seriously about the interconnections between drone swarms and information warfare.

Now, to an extent, of course, information warfare is relevant to all forms of warfare. Even regular human beings need to communicate with one another. But unlike human beings, drones do not necessarily have the complex cognition to make decisions on their own, so they cannot always operate without that human control. And so being aware of that, I think, is really critical. That, of course, cuts across multiple aspects of conflict. It starts with the drone swarm acquisition phase, to understand, “What are the potential vulnerabilities that these systems may have?” It then extends to training with the system, to understand how it might be used and some of the risks. And then, when you actually use it on the battlefield, understanding, “What are the information warfare risks that come into play in this particular context?”

And then, likewise, at a bigger-picture level, I think there's a necessity to consider the broader information space. There has been at least some evidence, albeit a little bit mixed, that the United States has had some issues when it comes to information warfare, from not using capabilities as much as it should to potentially having, in the case of cyber, widespread vulnerabilities, as was identified in a recent Government Accountability Office report. Now, I've seen some pushback on those findings, but, nonetheless, to the extent that drone swarms, and drones generally, are part of the battlefield, I think there's a broader need to think through that complex information environment: what gaps the United States might have, where they might create vulnerabilities, and how we might plug them to be successful in the future.

(Host)

It was a treat to talk with you. Thanks for your time.

(Kallenborn)

Yeah—thanks for having me.

(Host)

Listeners, if you’d like to learn more; get some rich details; and get some relevant, current-day examples of “InfoSwarms: Drone Swarms and Information Warfare,” I urge you to read this article. You can find it at press.armywarcollege.edu/parameters. Look for volume 52, issue 2.

If you enjoyed this episode of Decisive Point and would like to hear more, look for us on Amazon Music, Spotify, Apple Podcasts, Stitcher, or any other major podcasting platform.

Author information:
Zachary Kallenborn is a policy fellow at the Schar School of Policy and Government, a research affiliate of the Unconventional Weapons and Technology program at the National Consortium for the Study of Terrorism and Responses to Terrorism, a senior consultant at ABS Group, and the officially proclaimed US Army “mad scientist.” He is the author of publications on autonomous weapons, drone swarms, weapons of mass destruction, and terrorism involving weapons of mass destruction.