AI vendors are poised to tap a lucrative military simulation and training market as the Pentagon prepares to spend in excess of $26 billion in these areas through 2028, forecasts a new report by GlobalData, while the line between front-line and rear-line uses of AI grows blurrier as the technology becomes an enabler of military operations.
“Whether it be the use of non-lethal technology, the development of military strategy, the auto-generation of virtual training environments or simply the use of new budgeting tools, there are many areas where AI can assist military leaders without causing harm to others or creating new weapons,” says Fox Walker, a defense analyst with GlobalData, noting that OpenAI recently lifted its ban on the military use of its AI tools. “Military training and simulation is the largest sector of the U.S. defense market.”
For its part, the Pentagon expects 2024 to be a major year for training and simulation. The U.S. Army, for example, is developing a “synthetic training environment” (STE) envisioned as a collection of software programs supporting everything from individual to collective training across a variety of platforms. STE would allow trainers to call up the topography of any place in the world and apply an assortment of vehicles and aircraft to a training scenario. STE ultimately will be able to simulate joint operations involving as many as 75,000 troops, according to a National Defense report, and will be a significant upgrade to simulating the cyber, space, and maritime domains.
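Conceptually, a system like STE treats a training scenario as data: a real-world terrain reference plus a set of platforms layered onto it on demand. The minimal sketch below illustrates that idea only; it assumes nothing about the Army's actual STE software, and every class and field name is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Platform:
    """A vehicle or aircraft placed into a scenario (hypothetical model)."""
    kind: str     # e.g., "JLTV" or "UH-60"
    count: int
    role: str     # "friendly" or "opposing"

@dataclass
class Scenario:
    """A composable training scenario: real-world terrain plus platforms."""
    name: str
    terrain_lat: float   # center of the real-world terrain tile
    terrain_lon: float
    radius_km: float
    platforms: list[Platform] = field(default_factory=list)

    def add(self, platform: Platform) -> None:
        self.platforms.append(platform)

# A trainer could call up any location and layer platforms onto it.
urban_raid = Scenario("urban-raid", terrain_lat=38.84,
                      terrain_lon=-104.79, radius_km=10.0)
urban_raid.add(Platform(kind="JLTV", count=4, role="friendly"))
urban_raid.add(Platform(kind="UH-60", count=2, role="friendly"))
```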
AI, in fact, already is being used in military training. Operators from the 10th Special Forces Group (Airborne), an unconventional warfare and counterterrorism unit, in February completed virtual shooting training using an AI weapons simulator at a facility in Fort Carson, CO. The VirTra simulator analyzed and adjusted the performance of the soldiers as they went through a series of high-risk entry, active threat, hostage situation, and threat identification scenarios. The AI element honed individual skills by raising the level of difficulty over each previous exercise, with a “practice makes permanent” goal of increasing lethality.
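The adaptive loop described here, analyzing a trainee's last run and then adjusting the difficulty of the next one, can be sketched in a few lines. This is a generic illustration of the pattern, not VirTra's actual algorithm; the performance metrics, thresholds, and step size are all invented for the example.

```python
def next_difficulty(current: float, hit_rate: float, avg_reaction_s: float) -> float:
    """Raise scenario difficulty (0.1-1.0 scale) when the trainee performs
    well, ease it when performance drops. Thresholds are illustrative only."""
    if hit_rate >= 0.85 and avg_reaction_s <= 1.5:
        current = min(current + 0.1, 1.0)   # strong run: harder next scenario
    elif hit_rate < 0.60:
        current = max(current - 0.1, 0.1)   # struggled: back off slightly
    return current

# Example: a strong run at difficulty 0.5 pushes the next scenario to 0.6.
print(next_difficulty(0.5, hit_rate=0.9, avg_reaction_s=1.2))
```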
Meanwhile, the Space Development Agency within the U.S. Space Force has contracted with California-based EpiSci to develop a hypersonic missile tracking system using its AI tools, a challenging task given the extreme speeds of hypersonic weapons, generally Mach 5 (five times the speed of sound) or greater. EpiSci will test its AI software against data collected by sensors in low Earth orbit to identify and track hypersonic threats. Ultimately, a network of more than 100 missile-tracking satellites is envisioned. One challenge is being able to track a hypersonic missile as it moves from one camera to another and communicate that information across the satellite network. EpiSci specializes in AI for autonomous drones and AI enhancement packages for combat pilots.
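The hand-off challenge can be made concrete: at Mach 5, roughly 1.7 km per second at sea level, a target crosses a single sensor's field of view in seconds, so the network must project each track forward and match it against what the next satellite sees. Below is a minimal, purely illustrative nearest-neighbor sketch of that hand-off step; it assumes nothing about EpiSci's actual software, and every function name and threshold is invented for the example.

```python
import math

SPEED_OF_SOUND_MS = 343.0          # sea-level approximation
MACH5_MS = 5 * SPEED_OF_SOUND_MS   # ~1.7 km/s

def predict(pos, vel, dt):
    """Dead-reckon a track forward in time (straight-line assumption)."""
    return tuple(p + v * dt for p, v in zip(pos, vel))

def hand_off(track_pos, track_vel, dt, detections, gate_m=5000.0):
    """Associate an existing track with the nearest new detection from the
    next sensor, if one falls inside the gate. A toy nearest-neighbor
    scheme; real trackers use far more sophisticated filters."""
    expected = predict(track_pos, track_vel, dt)
    best, best_dist = None, gate_m
    for det in detections:
        d = math.dist(expected, det)
        if d < best_dist:
            best, best_dist = det, d
    return best  # None means the hand-off failed and the track is lost

# A Mach 5 target moves ~1.7 km in one second; the first detection matches.
pos, vel = (0.0, 0.0), (MACH5_MS, 0.0)
print(hand_off(pos, vel, dt=1.0, detections=[(1700.0, 50.0), (9000.0, 0.0)]))
```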
How AI may be of service in the other branches of the military is a question being actively explored. For example, the U.S. Air Force is experimenting with the use of AI for an autonomous drone in a virtual wingman mode. And the U.S. Navy, in the wake of Houthi attacks in the Red Sea, hopes to use AI to help thwart attempts to disrupt international shipping, a strategy that may include AI-enabled drones that do not rely on communications links to execute their missions.
As has been seen in the war in Ukraine, electronic jammers can sever links between an operator and a drone. That is not a far cry from an autonomous weapons system governed by AI, a development the military no longer expressly bans, saying such a system would be subject to a “senior review process” if developed.
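The connection between jamming resilience and autonomy comes down to a simple control-loop fallback: when the command link drops, the vehicle continues its last tasking onboard rather than aborting. The sketch below illustrates that generic design pattern only; it describes no fielded system, and all names are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    REMOTE = auto()       # operator in the loop via datalink
    AUTONOMOUS = auto()   # link lost: execute last tasking onboard

def control_step(link_ok: bool, mode: Mode, last_tasking: str) -> tuple[Mode, str]:
    """One tick of a jam-tolerant control loop: fall back to onboard
    autonomy when the datalink is severed, resume remote control when
    it returns. Purely illustrative."""
    if not link_ok and mode is Mode.REMOTE:
        mode = Mode.AUTONOMOUS        # jammer severed the link
    elif link_ok and mode is Mode.AUTONOMOUS:
        mode = Mode.REMOTE            # link restored
    action = last_tasking if mode is Mode.AUTONOMOUS else "await operator"
    return mode, action

# Simulate four ticks: the link drops on ticks 1 and 2, then recovers.
mode = Mode.REMOTE
for tick, link in enumerate([True, False, False, True]):
    mode, action = control_step(link, mode, last_tasking="continue patrol route")
    print(tick, mode.name, action)
```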