When the United States entered World War II 79 years ago this month, it embarked on an unprecedented period of change. With the attack on Pearl Harbor, traditional notions of work, education, security and every other aspect of life in America were pushed into a new reality. Today—albeit without a declared war—deployment of technology has created similar conditions for society-level change that the country must embrace.
In the book Bracketing the Enemy, John R. Walker writes about the World War II practice of having forward observers accompany infantrymen on the front lines to send targeting information back to artillery gunners. This innovation helped the United States win crucial battles because gunners benefited from timely and accurate information instead of guessing target locations.
“We’ve always done it this way.”
When statements like this become commonplace within teams, they can corrode even the best of organizations. Innovation is stifled, work becomes routine and experts disengage and move on.
Yet in many organizations, resistance to change is an enduring part of the culture—to the delight of adversaries and competitors.
Innovation has propelled the government and society forward with lasting advances in science, technology, medicine and many other fields. Its relentless nature has created competition among technology providers, shortened product life cycles and resulted in many solutions being shelved in favor of upgraded ones.
However, some legacy systems remain useful and still fulfill customer needs. If a solution or system isn’t broken, should it receive upgraded functionality or be set aside in favor of a new solution “just because?” If so, what are the criteria for doing so?
When global positioning system (GPS) devices entered the consumer marketplace, they were big, clunky and not user friendly. To reach a location, users had to input waypoints and then be sure to stay on the line connecting each one. Despite these shortcomings, early GPS receivers represented a typical, incremental pathway for innovation: evolving from an early military application to an extremely useful commercial product once connected to digital maps.
Now, GPS connectivity is standard in cars, smartphones and fitness devices, and the innovation continues with applications for autonomous farming equipment, online cargo tracking and smart munitions.
Whether it’s propaganda carrying a grain of truth or more deliberate disinformation that adversaries distribute to alter public opinion and gain an advantage, deceptive content, paired with easy access to mass audiences via social media, poses a grave threat to institutions and democracy. Intentionally or accidentally, groups and individuals can quickly spread falsehoods, making it difficult for governments, businesses and citizens to take corrective action.
As the world becomes more complicated, everyone strives to find ways to simplify it. The retail industry’s big box chains demonstrate this by allowing customers to avoid going to multiple stores, while mail-order clothing services allow you to “try before you buy” in the comfort of your own home.
In the information technology (IT) industry, this streamlining takes the form of buying Everything-as-a-Service (XaaS). Using cloud-based tools and technologies not only gives users access from anywhere and on any device, but it also allows agencies to operate with fewer IT staff and procure pay-as-you-go, consumption-based services.
High on the list of disruptions caused by the COVID-19 pandemic is lost productivity, with multiple sectors of the economy having slowed. As industries and workers find their path forward, many are taking a renewed look at telework, and for good reason.
Some industries, such as hospitality or healthcare, require at least some face-to-face contact with customers, and they must take the necessary precautions to keep employees safe and functioning where possible. However, for other industries, this crisis presents an opportunity to rethink remote work and how well it can fulfill organizational goals and missions.
A feeling of déjà vu has emerged following various conference presentations by speakers across the Defense Department and intelligence community. Their top priorities and concerns are similar to the ones that arose during the Cold War.
The first reaction of society at large is to say “same stuff, different day.” But is it?
These headlines sound oddly familiar:
“Freedom of navigation operations denounced.”
“European Defender 2020 to be largest deployment of troops to Europe.”
“New foreign bases built in Southern Hemisphere and on islands in the Pacific.”
“Swedish and Polish defense leaders discuss concerns about Russia.”
“NATO condemns Russian annexation of Crimea.”
When the National Counterintelligence and Security Center designated April as National Supply Chain Integrity Month, it cited threats that cost the country innovation, jobs and economic advantage. It also cited a reduction of U.S. military strength as a reason for increased awareness. Now, as we approach the one-year anniversary of that designation, threats—especially cybersecurity threats—continue to grow and evolve. These give the military-industrial base new reasons to refocus on the security of contractors, subcontractors and suppliers.
More than half of organizations today are not prepared to handle cyber attacks and data breaches, according to a recent report from FireEye. Updating operating systems, patches and even cloud strategies is a start for addressing the problem, but technology offers only one, often over-emphasized, leg of support.
Long before the federal government charged two defendants in 2018 for ransomware attacks on municipal computer systems—including Atlanta’s—cities found ways to make do during these outages. Police wrote reports by hand, traffic tickets were paid in person and social media kept everyone informed in a way that showcased a city’s resiliency.
Well, 2019 has flown right by, and so my monthly column for SIGNAL Magazine comes to a close. It has truly been a privilege to present these columns to the AFCEA community. I hope they sparked some fresh thinking about the many changes and innovations we see all around us. The U.S. military community is at an inflection point, and it is critical that we continue these important discussions into the future.
My columns so far have centered on various components of modernization and innovation that I think are needed for the U.S. military to reposition itself for success on future battlefields. Emerging technologies, culture, workforce, partnerships—all play critical roles and must be recalibrated for a future that will be increasingly complex and dynamic.
As the Defense Department moves to embrace more innovation, it will change the way our future wars will be fought. Defense planners already are working to understand this in detail, and the vision they have devised is called multidomain operations (MDO).
Part one of a two-part series.
Nothing keeps Defense Department leaders up at night more than today’s cyber threat. This heightened concern was clearly reflected in the September 2018 DoD Cyber Strategy, which noted that “competitors deterred from engaging the United States and our allies in an armed conflict are using cyberspace operations to steal our technology, disrupt our government and commerce, challenge our democratic processes, and threaten our critical infrastructure.”
Just about everybody who has worked for the Defense Department has encountered this: A new technology is deployed—a software application, new hardware, a piece of gear or a tool—and after using it, people discover it falls short of expectations. Perhaps it was difficult to operate. Or maybe it didn’t do what was needed. Or it might have done what was needed but did it poorly. Or it worked well enough for some use cases and not others.
Ever since British polymath Alan Turing posed the question, “Can machines think?” in 1950, mathematicians and computer scientists have been actively exploring the potential of artificial intelligence (AI).
To be sure, much of the buzz around AI since then has been more hype than reality. Even today, no one credibly argues that machines can match the suppleness and complexity of human intelligence. But we are at a point where machines, when tasked for specific use, can do many things humans can do—such as learn, problem-solve, perceive, decide, plan, communicate and create—and some things even humans can’t do. And that’s a huge leap from where we were only a decade ago.
Within the last year and a half, an exciting development has taken place at the Defense Department: It has turned the corner on cloud.
For years, the department had followed a cautious, even wary, approach toward cloud adoption. But after reading the 2018 National Defense Strategy and the department’s new artificial intelligence (AI) and cloud strategies, one can only conclude that top defense leaders now view cloud as the cornerstone of our future military readiness.
“The Army is engaged in a protracted struggle to out-innovate our future competitors, and right now, we are not postured for success.”
This statement kicked off congressional testimony by four senior U.S. Army leaders, including now-Gen. John Murray, USA, commanding general of the new Army Futures Command (AFC). The command’s mission is to “out-innovate” our rivals.
I think this statement succinctly captures the paramount challenge: an institution hidebound by bureaucracy, fragmented efforts, conventional processes and, most importantly, an acute intolerance of perceived risk.
In today’s increasingly complex, dynamic and digital-centric world, the Defense Department’s success will hinge on how well it takes on the characteristics of an agile organization. This requires qualities such as responsiveness, efficiency, resiliency, innovation and hyperawareness of the many environments it inhabits.
Information technology, smartly managed, can deliver all these capabilities. So it is no surprise that in the most successful agencies, technology is leading the charge toward new business models and new ways of thinking and working.