Dan «Animal» Javorsek Nr. 1 April 2024

Tools to Teammates: AI is here to stay and is coming to a cockpit near you

Software-enabled AI systems will undoubtedly play a role in the future of Norwegian Air Power. Shaping and influencing their adoption successfully will require the Royal Norwegian Air Force to embrace the Augmented Age we are currently living in. AI-driven software, enabled by performant hardware, is alive and well in our domestic lives and is already making its way into military systems.

Theme: AI and autonomous systems
Reading time: 13 min

To better understand both the opportunities and challenges associated with the future adoption of Artificial Intelligence (AI) and autonomy in the Royal Norwegian Air Force, it is helpful to start with a brief history of the Automatic Ground Collision Avoidance System.

The F-16 is a single-seat fighter aircraft developed in the 1970s and was unique because of its “fly-by-wire” digital flight control system. Unlike most aircraft of the time, which relied on traditional cables and pulleys, the F-16 transmitted the pilot’s control-stick inputs to the flight controls as electrical signals. While the digital flight controls were important for a variety of reasons, they play a critical role in the aircraft’s superior maneuverability. For example, the F-16 can generate, and sustain, nine times the force of gravity, which allows the pilot to outmaneuver adversaries in a visual engagement. An unfortunate side effect of such high forces, however, is that they cause blood to pool in the pilot’s extremities, pushing the limits of the circulatory system. These forces create a blood-pressure discrepancy between the cranium and lower body large enough to prevent intracranial perfusion, which can cause the pilot to pass out. In a single-seat aircraft like the F-16 this is very bad, and since the end of Vietnam it has killed more pilots than the adversary has.

The Automatic Ground Collision Avoidance System (AGCAS) is software that takes control of the aircraft when it detects a trajectory that will impact the ground. It took several years to develop and began deployment in 2014. Since it was just software, why did such a clearly beneficial, life-saving automated system take 40 years to make it onto the F-16?
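The core idea behind such a system can be illustrated with a drastically simplified sketch. The real AGCAS uses terrain databases and full trajectory prediction; the function names, the flat-earth geometry, and the five-second recovery margin below are all hypothetical placeholders chosen for illustration only.

```python
# Toy illustration of a ground-collision check (not the real AGCAS logic):
# predict time to impact from altitude and descent rate, and intervene
# only at the last possible moment so deliberate low-level flying is
# never disturbed.

def time_to_impact(altitude_m: float, descent_rate_ms: float) -> float:
    """Seconds until ground impact at the current descent rate."""
    if descent_rate_ms <= 0:  # climbing or level flight: no impact predicted
        return float("inf")
    return altitude_m / descent_rate_ms

def should_auto_recover(altitude_m: float, descent_rate_ms: float,
                        recovery_time_s: float = 5.0) -> bool:
    """Trigger automatic recovery when the predicted time to impact
    falls inside the margin needed to pull out of the dive."""
    return time_to_impact(altitude_m, descent_rate_ms) <= recovery_time_s

# A dive at 200 m/s from 800 m leaves only 4 s to impact: take control.
print(should_auto_recover(800, 200))   # True
# Level flight: stay hands-off.
print(should_auto_recover(800, 0))     # False
```

The design point worth noting is the late trigger: a system that intervenes too early is a nuisance pilots will distrust and disable, which is exactly the adoption problem the rest of this article is about.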

In the years for which we have good accident records (which did not begin until 1990), there were 51 crashes in which we lost both the F-16 aircraft and the aircrew. Since this number excludes the early part of the aircraft’s life and is limited to the F-16 alone, the actual losses are much, much higher…and this problem exists in every high-G aircraft (for example the F-15, F/A-18, F-22, F-35, and trainers).

The business case for the Automatic Ground Collision Avoidance System software was clear: saving a single aircraft recoups more than double the cost of developing and testing it, and many of our more modern aircraft cost several times what the software did. (By the way, even today this life-saving software has still not made it onto the US Air Force’s F-15 or the US Navy’s F/A-18, so we continue to lose pilots and aircraft every year.)

So again: why was a program that was, and is, so obviously worth doing so difficult to implement and gain support for, even to this day?

Horses against machines

I believe the answer to why it took so long comes down to distrust of autonomy that is often coupled with the threat posed by a disruptive, emerging technology. To explain what I mean, I’d like to use an example from World War II.

In 1938, during the buildup to WWII, Maj Gen John Knowles Herr was promoted to Chief of the Cavalry. Shortly afterward, he re-introduced the saber and took a hard line against the mechanization associated with the introduction of the internal combustion engine. Instead, he championed the virtues of the horse.

The software and the data that powers it have become infinitely more important than the hardware that has dominated military acquisitions since antiquity

When the Army Chief of Staff, Gen George C Marshall, asked Herr about his plan to combat the mechanized German Blitzkrieg, Maj Gen Herr stated that they had been watching the intelligence reports and felt confident that his mounted cavalry would win. They would trailer their horses to the front line and, with the horses fresh, ride circles around the tanks and armored personnel carriers to be victorious on the battlefield, as they had been for the last thousand years.

After Pearl Harbor and the official declaration of war, in spite of Maj Gen Herr’s resistance, Gen Marshall finally canceled the Cavalry.

New technology – a threat?

With the privilege of hindsight, our more modern pilots, like Maj Gen Herr, appear foolish in their resistance to a disruptive, emerging technology designed to help and not hurt them. However, having lived through this as a test pilot myself, I concluded that although the reluctance naively appears irrational, it actually makes sense if we consider things from the pilot’s (or Maj Gen Herr’s) perspective.

In each case, when not messaged properly, the new technology appears to threaten the Heritage, Honor, Values, and even the Dignity, of those sacrificing daily to accomplish the mission.

The real message here is that I see this theme repeating itself today with AI and autonomy. On the precipice of disrupting far more disciplines than ever before, the technology is at risk of being viewed as a threat. In fact, the ubiquity of AI presents us with either a nice opportunity or a significant challenge, depending on your perspective.

The cyborgs are here already

Although AI has been with us almost since the invention of the computer, the successes of Large Language Models like ChatGPT and other advances in Natural Language Processing have catapulted it to the forefront of the popular psyche. As a result, it serves as a nice opportunity to review the unique moment we are living in human history.

In fact, we are witnessing a significant shift in the way we do work and in general there have been four major historical eras along these lines. The Hunter-Gatherer Age lasted millions of years. The Agricultural Age lasted several thousand years. The Industrial Age lasted a couple of centuries. And the Information Age has lasted only a few decades.

Whenever I brief on this subject I often stop and ask my audience for any augmented cyborgs to please raise their hands. While this usually garners a bit of a chuckle, I argue that is exactly what we now are. For example, if you were having drinks at a cocktail party and trying to remember the details behind the story of Maj Gen Herr I mentioned earlier, it would be almost second nature to reach down and grab your cell phone to refresh your memory.

If your phone makes you an augmented cyborg imagine what that makes us pilots. For most of my military career, I strapped myself into a flying supercomputer able to go higher and faster than I ever imagined possible. This aircraft came with sensory systems that could see for hundreds of miles in parts of the spectrum I could only dream about. It also came complete with a digital central nervous system to help make sense of it all.

From physical to software

These systems have been evolving since the dawn of the information age but are now finally seeing fantastic gains that will completely reshape our world, due in large part to the recent development of sufficiently powerful computer hardware and the data to make biologically-inspired AI approaches viable.

While hardware meets a requirement, software actually solves the problems warfighters care about. The software and the data that powers it have become infinitely more important than the hardware that has dominated military acquisitions since antiquity. From the British Longbow at the Battle of Agincourt in 1415, to the Gatling Gun in 1862, or even the F-35 at the end of the 20th Century, the fundamentals had always been the same…to increase capability we had to physically build something new. But that all changed with Moore’s Law and the digital revolution that powers the Augmented Age.

When not messaged properly, the new technology appears to threaten the Heritage, Honor, Values, and even the Dignity, of those sacrificing daily to accomplish the mission

Just look at all the big companies defining the rules we live by today. Amazon, Google, Netflix, Uber, and X are all data companies…they are data-driven organizations. I believe we need to allow this insight to influence military applications as well. Over the last decade and a half, I was fortunate to have the chance to shape the future of Air Dominance and Air Combat. When I became a Program Manager at the Defense Advanced Research Projects Agency (DARPA), I very deliberately started only software programs.

One of the virtues of software is that democratizing it across hardware systems enables a healthy ecosystem of developers that inherently resists the cost overruns and price gouging that come with monopolies, like those discussed in the 60 Minutes investigation reported last year (https://www.cbsnews.com/video/price-gouging-pentagon-military-contracts-60-minutes-video-2023-05-21/).

Roadmap for military future

Even if the US military is admittedly struggling to realize its own rhetoric, there are a couple of nice indicators that a reset to address our growing tech debt is underway, as militaries around the world come to value the democratization of software. In fact, I often like to discuss three relatively new efforts, currently led by junior officers, that will have a huge impact on the future.

First, an initiative from Air Combat Command (ACC) called Crowd Sourced Flight Data (CSFD) is enabling precisely the kind of data collection we need to inform future autonomy development. From collections using tactical data recorders and other instrumentation sources, coupled with the requisite infrastructure and processing, we are just now beginning to realize the potential of Data as a Weapon (https://www.af.mil/News/Article-Display/Article/3226148/qrip-equipped-caf-f-35s-set-the-stage-for-future-crowd-sourced-flight-data-plat/). While still in its infancy, CSFD has started collections with F-35s, and we are already extending it to other platforms as well. Although early successes have simply exposed test-community analysts to a richer and more realistic dataset composed of non-traditional test sources, the effort represents a remarkable ability to enable human-on-the-loop risk reduction for autonomy at a variety of scales.

Second, ACC has also developed the Federal Laboratory (or FedLab) as a Left-of-Requirement entity with a government-owned Open System Enclave (OSE) to get capability onto fielded weapon systems by going around the decade-long traditional process created by vendor-locked companies. For example, the Fighter Optimization eXperiment (or FoX) is a tablet that puts applications and software into the hands of the warfighter quickly, without risk to the safety of the aircraft (https://www.twz.com/40224/project-fox-brings-tablet-based-apps-to-f-35-stealth-fighter-cockpits). In the last few years, the ACC FedLab even demonstrated the first instance of third-party software on an F-22, using a 5th-gen common application based on Kubernetes (https://www.af.mil/News/Article-Display/Article/3146566/acc-federal-laboratory-flies-combat-apps-on-f-22-with-new-open-software-stack/).

Third, the Air Force Research Laboratory (AFRL) continues to make progress on the technologies necessary for Networked, Collaborative, Autonomous (NCA) weapons. In addition to the Collaborative Small Diameter Bomb (CSDB) designed to share data and execute coordinated behaviors, they have fostered a digital ecosystem known as the Colosseum that lowers the barrier to entry for teams wishing to compete (https://www.defensenews.com/air/2021/02/04/air-forces-golden-horde-swarming-munitions-program-to-get-a-second-chance-this-month/).

All three of these efforts paint a promising technology roadmap that embraces the important role that software, data, and AI might play for militaries of the future.

Challenges with trust

However, like the F-16 Automatic Ground Collision Avoidance System discussed in the introduction, these efforts will all encounter challenges with trust because it is central to everything we do in the military.

When it comes down to it, trust is the currency of combat operations. It is hard to gain and easy to lose, which makes it a major challenge to the wide-scale adoption of AI on the battlefield. This is because trust is relational, contextual, and subjective, with a willingness to be vulnerable that is difficult, if not impossible, to write into computer code. Throw in the complications of lethal outcomes, with their legal, moral, and ethical implications, and it becomes even harder.

In fact, the best way to address the trust challenges is to start working with the system early to effectively coevolve the tactics with the technology. For example, when we say we trust someone it inherently implies some sort of shared understanding and bi-lateral communication that is often only forged with experience.

Ultimately, for AI in the cockpit we want human-commanded, AI-controlled autonomy where each member of the team is doing what they do best and this means thinking about the software more like a teammate than a tool. However, for millions of years our non-biological tools have been exclusively passive. They do exactly what we tell them and nothing more. Even our most advanced systems, from motorcycles and computers to combat aircraft, do nothing without our explicit direction. This has led us to build a whole host of incumbent assumptions and processes for these passive systems. But the passive nature of our tools is entering a revolution.


Often when talking to non-pilots I make the analogy that the airplane is simply a very fancy motorcycle that flies. That analogy is helpful because we think of modern aircraft and motorcycles the same way…they are tools that are deterministic and passive. Ideally, they work the same way every time but they require a high degree of intelligence by their operator. As a comparison, a Harley Davidson motorcycle is very different from my horse, coincidentally named Harley.

My wife and I have four horses, one for each of us along with our two girls. As a result, I have a lot of experience working with a tool that is very different from a motorcycle. My Harley clearly has a mind of his own and he is far from passive. In fact, Harley is actually smarter than our most advanced AI algorithms. This is because when I arrive at his turnout with a saddle, Harley knows with a high degree of certainty that we are going for a ride, because every time I have shown up with a saddle before, we went for a ride. To come to this conclusion, Harley had to pay attention, remember what had happened before, and retain and create a pattern of the activity in his mind.

This is something computer scientists have been trying to do in the AI field of research for the last 70 years or so and we are just now making progress on this kind of intuitive behavior.

It is also worth noting that while sometimes unpredictable and unexplainable, Harley and other biological systems like him bring capabilities that traditional deterministic systems do not have. In a lot of cattle work, a rider relies quite heavily on the unique and narrow capabilities of their animal teammate. Separating a single cow from a herd in the open is nearly impossible to do with a passive motorcycle but is rather straightforward with a good horse (https://youtu.be/GSgCJe8ph9Y?si=f9Jqow3GCYm7ovFf).

From the video, you see that the rider gives control to the horse, who has a far better ability to perceive and anticipate the cow’s movements, while the rider remains in command and “on-the-loop.” In effect, riders establish a close relationship with the horse that we call “harmony” in equestrian circles. This harmony capitalizes on the unique and symbiotic relationship of each teammate, a relationship I believe is critical to understanding how we will negotiate our future with AI. After spending several years flying with AI agents in simulated combat scenarios, I am firmly convinced that our future with AI looks a lot more like my relationship with Harley the horse than Harley the motorcycle.

Embrace the Augmented Age

To sum it all up, AI is here to stay and is coming to a cockpit near you. Although what I have discussed provides just a brief peek into what is happening in the United States, AI does not recognize sovereignty and will not be contained by borders. Software-enabled AI systems will undoubtedly play a role in the future of Norwegian Air Power as well. The question is how to shape and influence their adoption in your country.

To do so successfully will require the Royal Norwegian Air Force to embrace the Augmented Age we are currently living in. AI-driven software, enabled by performant hardware, is alive and well in our domestic lives and is already making its way into military systems. However, as the F-16 example we started with shows, humans will likely be reluctant adopters. Only through early involvement can you build trust and coevolve the tactics with the technology. Finally, this very capable AI software will force you to shift your mindset: instead of considering AI a simple tool, you will need to think of it as a teammate.

About the author
Former director, Air Force Operational Test and Evaluation Center, Nellis Air Force Base, Nev., and Director, F-35 U.S. Operational Test Team. AFOTEC’s Detachment 6 plans, conducts, and reports on realistic, objective, and impartial operational test and evaluation of fighter aircraft. Operated F-35, F-22, F-15C/E, F-16, and A-10.
