Mobility and AI Are on a Collision Course
The U.S. Defense Department is accelerating its investments in live, virtual, constructive and mixed-reality training, which will result in the rapid development of new immersive military applications. As the mobile revolution intersects with new data science technologies such as artificial intelligence and machine learning, these expenditures will enable warfighters to be better prepared regardless of the scenario.
The U.S. military faces constantly evolving threats in all corners of the world. To stay one step ahead of increasingly sophisticated adversaries, warfighters must have the right tools and training at their fingertips to operate and maintain highly complex systems—from Air Force jets to Navy littoral combat ships to Army cyber defense systems.
To address this need, the Navy alone is expected to spend between $6 billion and $6.12 billion annually from 2016 to 2021, according to a report from Frost & Sullivan.
Incredibly fast-paced, dynamic threats and complex missions require more intelligent training methods. Traditional education, operations training and combat simulations pull troops from commands or the frontlines, do not keep up with the latest operational procedures and soldiers’ needs, and are only accessible while on-site.
From a purely economic standpoint, current training methods are costly and complex to operate. They take place in enormous physical environments and entail travel expenses. The bottom line is it’s not cost-effective or practical to send personnel to these facilities more than once or twice a year.
This government and private-sector expenditure will require Defense Department chief information officers, simulation managers and trainers to adopt a new deployment mindset to move beyond the buzzwords, save millions and redefine training.
Widespread technical advances should help transition their way of thinking. According to Pew Research, 77 percent of Americans now own a smartphone. At the same time, the capabilities of these devices are more advanced than ever. Today’s smartphones are millions of times more powerful than NASA’s combined computing power when the first astronauts set foot on the moon in 1969.
The use of mobile devices has undoubtedly become ubiquitous and ingrained in how people work, live, play and now how military commands prepare warfighters. By moving training to mobile platforms, agencies can untether experiences to bring battlefield and operational training to troops rather than transporting troops to the training. Not only is virtual reality safer than live training, it also can greatly improve deployment flexibility and reduce training complexity and cost by delivering training at the garrison or a mission’s edge.
A mobile-first approach to training also helps agencies adapt to a changing workforce and keep pace with the way the next generation receives, processes and shares information. Millennials learn, interact and communicate in a different way than previous generations. Offering immersive experiences that can be accessed at their fingertips on a mobile device enables these digital natives to absorb training in a way they understand. This overall revolution in training style can lead to higher retention, greater productivity and more effective real-world outcomes.
Life or death can depend on responses honed in high-stress survival simulations. As an industry, modeling and simulation is at a crossroads of adoption. The combination of mobile devices and artificial intelligence (AI) is ushering in a new era of ultra-realistic and high-fidelity training experiences that hold up in the real world and enable troops to train like they fight.
New advancements in data science—from machine learning and AI to decision analytics, deep learning and computer vision—will power more interactive, customized and predictive models and simulations where machines will learn from and adapt to users instead of vice versa. The industry already is beginning to leverage machine learning capabilities that allow a simulation to self-learn users’ behavioral patterns in real time to mimic more closely the unpredictable scenarios they will encounter on the battlefield.
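The self-learning behavior described above can be illustrated with a toy sketch. The class name, smoothing constant and timing values below are illustrative assumptions, not part of any fielded training system; the point is simply that the simulation updates its model of the trainee online and adjusts difficulty accordingly.

```python
class AdaptiveOpponent:
    """Toy simulated opponent that learns a trainee's reaction pattern.

    Tracks an exponential moving average of observed reaction times and
    scales its own challenge window so the scenario stays demanding as
    the trainee improves. Purely illustrative values throughout.
    """

    def __init__(self, alpha: float = 0.3, baseline_ms: float = 500.0):
        self.alpha = alpha                  # smoothing factor for the average
        self.avg_reaction_ms = baseline_ms  # starting assumption about the trainee

    def observe(self, reaction_ms: float) -> None:
        # Online update: the simulation adapts to the user, not vice versa.
        self.avg_reaction_ms = (
            self.alpha * reaction_ms + (1 - self.alpha) * self.avg_reaction_ms
        )

    def next_challenge_window_ms(self) -> float:
        # Allow slightly less time than the trainee's demonstrated average,
        # so difficulty tracks demonstrated skill.
        return 0.9 * self.avg_reaction_ms


opponent = AdaptiveOpponent()
for reaction in [480, 450, 430, 410]:  # trainee reacting faster each round
    opponent.observe(reaction)
window = opponent.next_challenge_window_ms()
```

As the trainee's reactions speed up, the challenge window shrinks below the 450-millisecond mark that the initial baseline would imply, mimicking an opponent that grows harder to beat.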
One example of how machine learning can be applied to virtual reality is the deep analysis of 360-degree drone footage. Machine learning tools can automatically detect and label points of interest in the imagery—friendly forces, adversaries, vehicles—and flag them for analysts to investigate.
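The triage step in that workflow can be sketched as follows. A real pipeline would run a trained object detector over each video frame; here the detector output is stubbed with hard-coded records so the flagging logic is runnable. The class names, confidence threshold and record format are all illustrative assumptions.

```python
# Classes an analyst would want surfaced; illustrative, not doctrinal.
POINT_OF_INTEREST_CLASSES = {"friendly", "hostile", "vehicle"}

def triage_detections(detections, min_confidence=0.6):
    """Keep only confident detections of classes analysts care about."""
    flagged = []
    for det in detections:
        if (det["label"] in POINT_OF_INTEREST_CLASSES
                and det["confidence"] >= min_confidence):
            flagged.append({
                "frame": det["frame"],
                "label": det["label"],
                "bbox": det["bbox"],  # (x, y, width, height) in pixels
            })
    return flagged

# Stubbed detector output for two frames of footage.
raw_detections = [
    {"frame": 0, "label": "vehicle", "confidence": 0.91, "bbox": (40, 60, 120, 80)},
    {"frame": 0, "label": "tree",    "confidence": 0.97, "bbox": (300, 10, 50, 200)},
    {"frame": 1, "label": "hostile", "confidence": 0.45, "bbox": (10, 10, 30, 60)},
]

points_of_interest = triage_detections(raw_detections)
```

The tree is discarded as irrelevant and the low-confidence hostile detection is held back, leaving only the vehicle for the analyst's queue—the kind of filtering that lets a human focus on what matters in hours of footage.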
Another example is employing natural language processing and machine intelligence to allow multiple users to view and interact with enemy units simultaneously in a simulated theater environment. By visualizing how orders might be executed and maneuvers could occur in a scenario of a potential conflict, commanders can sharpen their critical-thinking skills before a boot ever touches the ground. In addition, emerging AI-powered bots can serve as more intelligent simulated opponents in fighter pilot training.
The application of big data analytics also will be critical to improving the simulations themselves. Surgical training for field medics is one example. Multisensory haptic data, such as visual and tactile feedback, could be captured, learned and incorporated by instructors into the training environment to develop improved scenarios for future surgeons.
Alongside data science advances and the continued gains of Moore’s law, network enhancements on the horizon, such as 5G’s high-speed, high-bandwidth connectivity, are supercharging computing and accelerating rendering capabilities. As a result, agencies will be able to better leverage their stockpiles of data to solve complex mission-critical problems using mobile platforms.
To drive adoption, however, agencies must deploy stronger mobile security to keep sensitive training content out of enemy hands. As simulations become more advanced, so too must the technology keeping that data secure. Agencies must embrace defense-grade endpoint security solutions that address a host of needs, including malware protection, multifactor authentication and biometrics, mission action verification, encryption and cryptographic digital credentialing, ensuring data is protected regardless of where it originates or resides.
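One small piece of that stack, verifying that training content has not been tampered with before it is loaded on a device, can be sketched with standard cryptographic primitives. The key, payload and function names below are illustrative; a production system would layer this integrity check under full encryption, credentialing and hardware-backed key storage.

```python
import hashlib
import hmac

# Illustrative shared secret; real deployments would use managed,
# hardware-protected keys, never a hard-coded value like this.
SHARED_KEY = b"demo-key-not-for-real-use"

def sign(content: bytes) -> str:
    """Compute an HMAC-SHA256 tag over the training content."""
    return hmac.new(SHARED_KEY, content, hashlib.sha256).hexdigest()

def verify(content: bytes, tag: str) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(content), tag)

scenario = b"convoy-ambush-scenario-v3"
tag = sign(scenario)

untampered_ok = verify(scenario, tag)              # genuine content passes
tampered_ok = verify(b"x" + scenario, tag)         # modified content fails
```

Using `hmac.compare_digest` rather than `==` is the idiomatic choice here: it compares the tags in constant time, so an attacker cannot learn the correct tag byte by byte from response timing.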
While the industry has the technical capabilities to deploy virtual reality training capabilities securely today, Defense Department training and information technology leaders also must address the cultural challenges that prevent their organizations from achieving full productivity and cost-savings potential. Agencies are gripped by the status quo and a stifling culture of renovation instead of innovation. Decision makers need to remain nimble in the face of accelerating digitization and information technology modernization initiatives and adopt a prototype mindset.
Defense organizations also must embrace a willingness to partner more broadly with innovative industry players and reward risk-taking. Only after doing so will troops be able to benefit from the combination of advanced mobile and artificial intelligence-driven capabilities that are set to be a game changer in bringing immersive capabilities closer to the mission edge.
Christopher Balcik is vice president for the government vertical business in support of the Mobile B2B Sales Division, Samsung Electronics America. He has more than 26 years of experience in the defense, intelligence and transportation industries.