In July 2019, a swarm of drones descended on a group of U.S. Navy destroyers off the coast of California, hovering near them for an extended period and then flying away to points unknown. Almost two years later, no one seems to know who controlled them or what their purpose was. Similar incidents have occurred around nuclear power facilities in the U.S. and Europe. And of course there has been the widely reported series of incidents in recent years in which American military aircraft have tracked unidentified phenomena, sparking a renewed cultural conversation about aliens.
Unexplained aerial phenomena are nothing new, of course. But where sightings of UFOs in the 1940s and 1950s – during the first wave of public interest – were accompanied only by eyewitness accounts or, in a few cases, extremely grainy film or photographs, today there should be far more data available: radar tracks, high-quality digital video and photographs, other types of signatures, and so on. In a world where more and more data is being produced and captured, it should be difficult for anything to move through it without leaving a trail.
The emphasis there is very much on “should.” Amidst an ever-increasing drumbeat of rapid technological advancement, we can be fooled into thinking that it should be possible to do things that are legitimately difficult: obtaining a positive ID on small, rapidly moving airborne objects, for example.
And that, in turn, points to a more fundamental problem with the architecture of how we approach emerging security challenges. We have been told over and over that the current revolution in military affairs is data-led. Certainly, there is an element of truth here: the pace at which sensors and processing equipment have advanced in the last four or five decades has wildly outstripped that of other technologies.
But it would be a mistake to assume that more data means more information, or that more information necessarily means more knowledge. More data mostly means more complexity and is not inherently accompanied by the ability to make sense of it. We are told that the solution to this is, effectively, more sophisticated tools to manage and parse and interpret that data. Artificial intelligence and better human-machine interfaces will sort the huge masses of incoming data and allow humans to make decisions based on quick, accurate synthesis.
But waiting for those technologies to mature to the point where they can even begin to grapple with the problem isn’t enough. The amount of exploitable data being generated in the world has been growing exponentially for years. This is a structural problem: the drivers are deeply entwined with the world’s economic and cultural trajectory.
And this does not even take into account the profusion of what might, broadly, be called “bad” data: disinformation, misinformation, deepfakes, propaganda, and so forth. Those complicate the picture, but the fundamental problem is not just information warfare: it is inherent to the complex relationship of our societies, our technology, and our world.
Here, I should add a caveat. The world has always been complex, and it is a category mistake to conflate actual complexity with our ability to perceive complexity. And the ability to perceive complexity does not inherently make the world more or less safe. The danger lies in the gap between the two: believing we understand the world better because we have more data, convincing ourselves that more data means more control. Neither is the case.
And that brings us back to the mysterious flying objects. Are the mystery drones a threat? It is entirely possible that they are not, and entirely likely that they reflect a set of random, unrelated occurrences rather than some kind of coordinated plot. But in a world where we can watch wars unfold – or the global shipping network descend into chaos thanks to one big stuck boat – in real time from our phones, we increasingly expect that mysteries will be solved quickly, cleanly, and comprehensively. We would do well to remember the gap between the amount of data we have and our ability to make sense of it, and to remain mindful of it.