What would it take to deliver high quality augmented reality to the masses? Mobile devices packed with high computing power and both optical and LIDAR sensors in every hand? Check. Robust operating systems capable of overlaying 3D graphics onto real environments? Check. Devices that enable high-definition rendering of digital images? Check. What’s missing? A compelling need to project information from offices and retail environments into homes and remote locations? The COVID-19 pandemic may have fixed that. So, what else is missing? Bandwidth! With 5G cellular communications entering mainstream markets, augmented reality may finally become a part of our daily lives—for real this time.
Disruptive by Design
The Defense Department has an information warfare (IW) problem. While the information environment continues to grow exponentially in importance and ubiquity, rapidly transforming the character of competition and war, there is no organization within the department that directs, synchronizes and coordinates IW planning and operations.
U.S. Cyber Command serves this very purpose for cyber operations, as do its service components. But this necessarily anchors the focus of American IW on a single information related capability (IRC), at the expense of the many other IRCs and their ability to generate military advantage.
Joint doctrine emphasizes the importance of information operations (IO) in campaign planning and operational design. Information operations include many information-related capabilities, such as cyber operations, electromagnetic spectrum operations, special technical operations and others. But as the battle for the narrative becomes exponentially more important in an increasingly interconnected world, joint planners must re-examine how they employ one of the most neglected information-related capabilities—public affairs.
As the capabilities of networked technologies continue to increase exponentially, so too does the speed and impact of the narrative. Recordings, images and commentary about an event can be uploaded within seconds. Based on how visceral any event might be, it could go viral within moments. By the end of the hour, a dominant narrative about that event could be echoed across the Internet, television and radio, and remain wedged in the minds of the audience for weeks, months or years.
Especially if it isn’t true.
Misinformation is the spreading of false information without the intent to deceive. It can be a simple matter of getting the facts wrong, misremembering some details or sharing a meme.
Disinformation, on the other hand, is deliberate and spread without regard for second- or third-order effects. Russia spreading conspiracy theories on social media to sow discord in other countries is one example; fraud scams are another. Intent makes all the difference. Malice or nefarious intent generally accompanies disinformation, while misinformation typically stems from an innocent source.
Disinformation, like misinformation, is everywhere. Plain and simple. It is plentiful in all communities—especially intelligence—and across the globe.
As a way to provide sea denial in support of naval operations, expeditionary advanced base operations are the Marine Corps’ bid for success in disrupting the fait accompli strategies of great power competitors. While highly promising, this concept possesses a critical vulnerability: signature management.
Detection or denial of command and control systems will hamstring expeditionary advanced base operations. Any misstep in communications discipline will reveal the locations of expeditionary advanced bases, putting them at risk. But emerging communications techniques and technologies provide viable solutions to signature management, validating the concept and ensuring the sea services will maintain a critical edge.
The first lesson of economics is scarcity. When supply is low and demand high, prices soar, and some will go without. In the U.S. Defense Department, both the demands and costs for reliable, resilient, and robust communication services continue to grow. As the services consider options to privatize aspects of communication, both the opportunities and challenges require thorough consideration.
A recent posting of a satellite image from the U.S. National Training Center by Col. Scott Woodward, USA, raised a lot of eyebrows. The image shows a cluster of electromagnetic signals emitted from a battalion-sized unit participating in a large-scale training event. These signals were captured as part of the exercise from more than 10 kilometers away. This picture showed what many of us already know: we have an electromagnetic emission discipline problem.
The 2020 election may be the most vulnerable yet. Last year, several federal agencies released a joint statement identifying election security as a “top priority for the U.S.” However, due to the COVID-19 pandemic and the risks of forgoing social distancing, some have proposed mail-in ballots. Why are we going backward instead of forward? Reverting to older methods during a disaster only adds challenges and difficulties to an already broken voting system. We need to be proactive, not reactive, when electing leaders at all levels across the country.
Many experts have imagined a future in which the Department of Defense deploys an army of gadgets to track the health of individual warfighters in real time. However, most did not envision a global pandemic being the tipping point for the large-scale adoption of devices.
As the world faces the coronavirus pandemic, leaders want to better understand the health of soldiers, Marines, airmen and sailors in real time, securely while maintaining some semblance of privacy. As leaders and program managers wrestle with decisions to employ these technologies, they must address information security, privacy and the need to know.
The Air Force Cost Analysis Agency (AFCAA) offers multiple examples of data visualization tools being actively used for cost analysis, including the Air Force Total Ownership Cost (AFTOC) program decision support system, the Flying Hour Program and an array of research projects. However, these are far from the only examples. The power of data visualization tools is popping up everywhere.
The Office of Management and Budget’s (OMB’s) Memorandum M-19-18, “Federal Data Strategy - A Framework for Consistency” acts as a foundation of guiding principles and best practices to help agencies update the way they manage and use data and improve on information delivery, service and consistency. The intent is to pull government into the modern technological times in which we live while focusing on the ethical and compliance challenges of governing, managing and protecting data.
Over the past few months, I have participated in a forum to help competitive graduates find quality internships and jobs after earning bachelor’s and master’s degrees from Carnegie Mellon University’s prestigious engineering and computer science programs. Listening to the next generation of technologists, innovators and leaders has helped me understand their concerns and desires in the hiring process. I have also gained new perspectives on what applicants think employers do right and wrong.
The following tactics, techniques and procedures (TTPs) are meant to help those in the government and defense sectors attract tech talent today.
Like me, you may have thought black is black and as dark as it gets. However, courtesy of carbon nanotubes (CNTs), individuals are creating blacker and blacker, even blackest versions of black. A quick Google or YouTube search yields all sorts of interesting results from BMWs painted in Vantablack, to the “blackest little black dress.”
CNTs are microscopic filaments of carbon that can be grown on surfaces for various uses. In practice, they can be vertically aligned to capture light in the 99.9XX percent range and produce blacker versions of the blackest black.
When it comes to acquisition reforms, many know of the talent management, leader development and other transaction authority endeavors, but in this column I want to highlight a lesser-known effort, Army Directive 2018-26 (Enabling Modernization Through the Management of Intellectual Property), which will be incorporated into a number of other Army regulations covering acquisition, technology transfer and integrated product support.
When Google announced it was acquiring Nest for a little over $3 billion in 2014, analysts thought the company wanted to enter the home appliances market.
It was all about the data.
Google gained access to a treasure trove of information about consumer demands for heating and cooling. The company learned when people turned on their furnaces and shut off their air conditioners. Google could pair this information with the type of household, neighborhood and city.
A deepfake is an artificial intelligence-based technology used to produce content that presents something that didn’t actually occur. This includes text, audio, video and images. Deepfakes are a recent disruptive development in the technology world, and both the House and Senate are investigating the phenomenon and have introduced bills to combat these potentially dangerous creations.
In the cyber realm, organizations need the means to rapidly identify emerging threats, immediately respond to mitigate risk, and systematically learn from these encounters—just as the immune system responds to a virus.
A single tool, process or team cannot deliver true cybersecurity. Collecting, analyzing and disseminating intelligence requires a converged organization that fuses expertise across domains. As adversaries possessing sophisticated expertise and considerable resources target multiple attack vectors—cyber, electromagnetic and physical, for example—cyber leaders must develop teams and systematic processes to rapidly transform analysis into action.
Want to be disruptive, I mean truly disruptive? Try delving into history while surrounded by software engineers and app developers. Watch how the presence of a book on Charles Babbage and Ada Lovelace in the 19th century raises eyebrows at your next scrum team meeting. Be passionate about the history of technology, and you will disrupt.
I recently completed a short course on the history of computer science. Accounts of generations of scientists and engineers stepping from one advancement to the next through iterative problem solving efforts provided rich details about how computers progressed and the thinking of those working to advance the broader field of study.
There has been a quiet revolution in the television industry thanks to the vision of Adde Granberg, chief technology officer and head of production at Swedish Television SVT.
When we watched Lindsey Vonn retire in February of this year after an amazing career as an alpine skier, a quiet revolution happened behind the cameras. What looked like a normal, well-produced live TV event on the surface was, in fact, the world’s first remotely produced large-scale live TV production. In the world of live TV production, this is almost considered a quantum leap.
Benefits associated with agility, scalability, ease of management and increased security justify the Defense Department’s investments in a transition to cloud services. As each military service rolls out new cloud capabilities, however, they may find that simply building these solutions will not attract organizations to use them. A misalignment of various motivations and an array of complex factors will impose costs that limit leaders’ freedom of movement in deploying any universal cloud solution. Getting the people and processes right matters just as much as the right technology.
Server Farm to Tabled Agreements
In his famous poem, “The Road Not Taken,” Robert Frost writes, “Two roads diverged in a yellow wood.” If you have read the poem or analyzed it, as many a high school English teacher has required, you know that Frost suggests taking the road less traveled is the better choice. And while this may be true for adventure seekers and wanderers out there, here in the world of IT I recognize the benefits of not wandering off on my own. The life cycle of network equipment can be five to seven years, or even longer, so on this cusp of 400G it is important to choose optics that offer interoperability for the long term.
What’s smaller than small? Nano. One nanometer is a billionth of a meter. At a scale smaller than a grain of salt, a blood cell or a virus, resides the nanoverse. Nanoparticles range from one to 100 nanometers. For perspective, a sheet of paper is about 100,000 nanometers thick.
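Those scale comparisons are easy to verify with a quick back-of-the-envelope conversion (the sizes below are approximate figures used only for perspective):

```python
NM_PER_METER = 1_000_000_000  # one nanometer is a billionth of a meter

# Approximate sizes, in meters, for perspective.
grain_of_salt_m = 0.0005  # roughly 0.5 mm across
paper_sheet_m = 0.0001    # a sheet of paper is roughly 0.1 mm thick

print(grain_of_salt_m * NM_PER_METER)  # 500000.0 nanometers
print(paper_sheet_m * NM_PER_METER)    # 100000.0 nanometers, matching the figure above
```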
Many believe the development of nanotechnologies will forever change our world. Rather than taking what the planet provides, we can make what we want, beginning at the smallest scales. From microprocessors to minuscule organ-on-a-chip devices, the forefront of creation begins with some of the tiniest objects. With nano, a new era in capabilities is moving from the horizon to the now.
While the U.S. Defense Department struggles to connect tactical and strategic networks, industry has cracked the interoperability code. Commercial pressure to develop a digital ecosystem where any device delivers content across platforms and service providers has led to robust industry standards and intuitive application programming interfaces.
Increased interoperability and access, however, bring increased risk, which discourages the bridging of networks and enterprise services. Innovators must face these fears head-on. Strategic-tactical network integration requires a plan for analyzing risk, employing control measures, developing operating procedures and training across organizations.
Ever-expanding reviews and policies aren’t the only way to control enterprise information technology projects. Instead, management should establish clear standards and incentivize project managers to choose enterprise-friendly designs that streamline external reviews and eliminate the delays and costs associated with compliance.
Information technology projects have distinct requirements: cybersecurity, privacy and Section 508 compliance. These necessary requirements add a significant burden and can cause slowdowns and cost overruns. Other external challenges come from the budgeting process, procurement and configuration management.
G-Invoicing: Sounds interesting by name alone, right? Chatter among the U.S. Defense Department financial management communities and peripheral groups supporting government invoicing confirms that interest. Many of my colleagues and I want to know more, and I hope you do too, because it is changing the way intragovernmental transactions work. Over the last year or so, questions, discussions and, most recently, training sessions have been informing audiences about G-Invoicing.
Unmanned systems and robots are rapidly changing the character of warfare. As the U.S. Defense Department considers their increased use, the time is ripe to discuss both the opportunities and challenges these autonomous systems present on and off the battlefield for military communicators. Communicators deliver and protect command, control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) services. Unmanned systems rely on digital communication channels to execute tasks and share information. The more systems, the more links required.
The scope of managing these channels is set to explode.
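That scaling claim can be illustrated with a rough sketch. Assuming, purely for illustration, a worst case in which every unmanned system needs a direct link to every other (a full mesh, which is an assumption here, not doctrine), the number of channels grows quadratically with the number of systems:

```python
def full_mesh_links(n: int) -> int:
    """Point-to-point links in a fully meshed network of n nodes (n choose 2)."""
    return n * (n - 1) // 2

# Doubling the fleet roughly quadruples the links to manage.
for n in (10, 100, 1000):
    print(n, full_mesh_links(n))
# 10 systems need 45 links; 1,000 systems need 499,500.
```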
People worldwide are buzzing about digital currencies such as bitcoin and Ethereum. Blockchain is the technology that forms the backbone of some of these new currencies being marketed today.
Blockchain creates a digital decentralized ledger that records all transactions. There is no central point of ownership for the information on the ledger, and the information is transferred among disparate parties. Each time the ledger is updated or verified, a time stamp is assigned and linked back to the previous record. The result is an unchangeable chain of information consisting of blocks—hence the term blockchain.
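A minimal sketch of that linking mechanism in Python may help. This is illustrative only: a real blockchain adds consensus, digital signatures and distribution of the ledger across many parties, none of which appear here.

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a block whose hash covers its data, timestamp and the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def chain_is_valid(chain):
    """Recompute every hash and check each back-link; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain; each block links back to the one before it.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))

print(chain_is_valid(chain))  # True
chain[1]["data"] = "alice pays bob 500"  # tamper with an earlier record
print(chain_is_valid(chain))  # False: everything after the change is invalidated
```

The back-link is what makes the record effectively unchangeable: altering one block changes its hash, which no longer matches the `prev_hash` stored in the next block.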
The U.S. government is likely the largest combined producer and consumer of software in the world. The code to build that software is volatile, expensive and oftentimes completely hidden from view. Most people only see the end result: the compiled and packaged application or website. However, a massive worldwide community, the Open Source Initiative, centers on the exact opposite.
With modern society’s infatuation with selfies, facial recognition technology could easily be used to identify common physical traits of criminals, pinpoint communities dominated by potential offenders and then help determine where to focus crime-prevention programs.
As businesses, governments and militaries wrestle with artificial intelligence (AI) technologies, managing machines that learn is a challenge common to all.
AI will not merely displace blue-collar tasks; it will affect every management level. Managers will outsource many mundane, time-consuming, attention-taxing and less rewarding tasks. The bigger challenge, however, is integrating AI systems into their teams and determining how teams will collaborate with AI systems to increase insights, improve decision making and enhance leadership.
The U.S. Defense Department is implementing one of the world’s largest enterprise resource planning (ERP) systems, and the process could be going better. This is the case for many organizations that decide to adopt the software. After all, ERP software can cause network failures, resulting in significant lost opportunities and resources.
ERP software allows the integration of business management applications and automation of office functions. As a taxpayer and a steward of tax dollars, I have questioned the department’s choices of ERP software and implementation techniques. I have also studied a rarity—an ERP implementation success in a government organization.
In a few short decades, the world will be vastly different. The military environment is no exception, given that a force built for and in the industrial age will continue providing national security in an increasingly unstable and uncertain world. The dramatic and potentially unforeseen advances in technology will be countless. Leaders will need help figuring out how to conceptualize and capitalize.
This includes the Air Force. The force of 2050 will no longer be confined to air, space and cyberspace. Training, tools and tactics will change.
Agroterrorism, a subset of bioterrorism, is defined in a Congressional Research Service report as “the deliberate introduction of an animal or plant disease with the goal of generating fear, causing economic losses or undermining social stability.” The word is rarely used, and fortunately, an event is even more rare. Rarer still are common understanding and readiness among U.S. agencies facing this threat. However, recent legislation and a survey of the nation’s emergency management capabilities underscore the need to prepare even for low-probability but high-impact acts of agroterrorism.
A new era of computing, sensing, modeling and communicating will begin with the advent of viable quantum technologies, which will change everything about computers. Harnessing the characteristics of quantum mechanics is bound to unlock mathematical mysteries and enable profound applications.
Today’s military leaders must prepare now for the quantum future.
No one likes a snitch. Yet whistleblowers or leakers have been sharing sensitive national secrets and agitating government waters since the country’s founding, usually to the ire of those in power. Today, spilling secrets seems more pervasive than ever. Recent leaks radiating from the National Security Agency (NSA), the CIA, the U.S. Defense Department and the White House leave little doubt that investigators are poring over every detail.
Understanding why leakers leak is just as important as grasping how they do it. Determining the motives behind someone’s deliberate action to share government secrets requires concerted due diligence after the incident.
President Donald Trump recently signed a succinct but sweeping cybersecurity executive order fortifying the U.S. government’s role in thwarting cyber attacks, establishing a path toward protecting federal networks and critical infrastructure, and bolstering cybersecurity for the nation as a whole.
“Our nation’s economic and national security rely on a safe, secure and reliable cyberspace,” said U.S. Department of Homeland Security Secretary John Kelly of the order, titled Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure.
Warfare, as with technology, is changing quickly and dramatically. The U.S. Defense Department’s most recent Quadrennial Defense Review noted the link between this rapid evolution and “increasingly contested battlespace in the air, sea and space domains—as well as cyberspace—in which our forces enjoyed dominance in our most recent conflicts.”
These assertions have major implications for airpower in future contingencies that will call for the Air Force to emphasize cyber over its five core missions. Already, these missions have been tweaked in content and application—changes that leaders could use to set a course for future cyber dominance.
Entitled. Self-centered. Disaffected. These are just a few of the divisive and disparaging words used to describe millennials. The largest generation in U.S. history—an emerging consumer powerhouse—is making significant cultural changes centered around revolutionary, life-enhancing technologies. Tomorrow’s successes are sure to stem from millennials who are pushing the limits.
Perhaps few ecosystems can benefit more from this work force’s M.O. than cyberspace, experts shared during the recent debut of a Young AFCEAN panel at West 2017.
If you have been living in a cave, Malaysia’s Borneo rainforest or the 1950s, then you might be among the few people unfamiliar with the power of crowdsourcing.
The term, a convenient meshing of the words crowd and outsourcing, refers to tapping a group of people with similar skills or interests and offering them a venue through which they compete or collaborate to accomplish a particular task, job or goal. Typically, crowdsourcing is carried out by leveraging the ubiquitous connectivity of the Internet. (For more, see “Crowdsourcing Confronts Cyber Challenges.”)
Ask Siri to tell you a joke and Apple’s virtual assistant usually bombs. The voice-controlled system’s material is limited and profoundly mediocre. It’s not Siri’s fault. That is what the technology knows.
According to a knowledgeable friend, machines operate in specific ways. They receive inputs. They process those inputs. They deliver outputs. Of course, I argued. Not because I believed he was wrong, but because I held a loftier notion of what machines could be and what artificial intelligence (AI) could become.
In this era of e-commerce, a person can pay for a coffee by simply using a cellphone. Clearly, we have come a long way from trading goats and pelts for goods, but the global method of exchanging currency has advanced little. The world largely relies on the paper money system started by the Chinese Tang Dynasty, despite the enormous expense to maintain physical currency.
It is about time federal contractor employees received benefits equal to their in-house peers.
In November, the long-awaited final rule issued by the U.S. Department of Labor mandated that federal contractors provide paid sick leave to certain employees. The regulation covers both new federal contracts and replacements to expired contracts.
Although some cities and states require that employers offer paid sick leave, no federal law mandates the employment benefit across the board. The United States is the only industrialized nation without a federal paid leave mandate.
The U.S. intelligence community (IC) must transform its ability to discern threats from hundreds of millions of data points that flood databases each day and provide timely, actionable findings to warfighters and government officials. As it stands, agencies devote too much time, money and talent to reading data and must find new ways to keep their edge over adversaries. One way of addressing the problem is turning analysts’ thoughts into digital analytic models.
You read that correctly.
The burgeoning cyber domain as a battlefront has done more than shift the front lines for warfighters—it has virtually erased them. At the same time, traditional armies continue to threaten U.S. national security both at home and abroad. Given the scope of cyber and conventional warfare, how does the U.S. military balance its competing needs?
For decades, private companies have implemented enterprise resource planning (ERP) systems to improve their businesses. However, for all the promises of efficiencies behind the demanding approach, a majority of ERP system implementations fail. Even so, the U.S. government wants to pursue ERP systems.
Regardless of the bad press these systems have received, there is an emerging public-sector market interested in them—and for good reason. ERP systems offer the only holistic alternative to the inefficient legacy systems plaguing the federal government.
The U.S. Defense Department and the federal government could piggyback on the recent blockbuster popularity of Pokemon Go, the location-based augmented reality game that catapulted some couch potatoes from their sofas to the great outdoors, to transform cyber training. The mobile app, an overnight international sensation, combines the virtual world of Pokemon with the real world in which people live.
The gaming craze offers insights on how to excite people to partake in—and really learn from—cybersecurity training.
The time has arrived for the U.S. Defense Department to develop an enterprise solution for the coming wave of augmented reality (AR) systems. Unlike virtual reality (VR) systems that fully immerse users in computer-generated worlds, AR systems overlay virtual content on a user’s perceptual field of view using 2- and 3-D holograms. These images either remain fixed to a user’s perspective as he moves his body and head or stay anchored to georeferenced locations in his surroundings.
Rapidly evolving cyberthreats challenge all levels of government, and recent incidents such as the Office of Personnel Management data breach illustrate the importance of shielding public and private-sector organizations from such attacks.
How best to equip cyber warfighters—both at home and abroad—is an ongoing debate, complicated by a steady stream of improved and intriguing tools for cyber analysis, security and ethical hacking that make picking the “best tool,” or even “the right tool for the job,” very much a matter of opinion and preference.