Starting from the first recorded raid on the monastery of Lindisfarne in 793, Viking raids presented European rulers with an unprecedented challenge. Fast, sleek longships could stealthily deploy alongside the coasts of early medieval England and France, striking at wealthy, isolated targets and departing before local authorities could mount a response.
This is the first in a series of online articles written by Army Signal Corps officers.
As an armored formation, our lethality is not just derived from our firepower but also from our mobility—the speed that we can bring lethality into the fight.
This article is part of a series that explores zero trust, cyber resiliency and similar topics.
Over the past year or so, I’ve discovered the secret weapon that IT leaders of various U.S. government entities have deployed as they implement zero trust architectures. Their first step has been to create a comprehensive educational pathway for their workers. This is because no one can implement zero trust alone.
Zero trust: Only education can move you forward
The recently released federal zero-trust strategy from the Office of Management and Budget (OMB) and the Homeland Security Department’s Cybersecurity and Infrastructure Security Agency (CISA) has one action area that has raised a few eyebrows within the zero trust community: Go ahead and open your applications to the Internet. Wait… what?
More than just a technology focus, zero trust (ZT) is an invitation for all of us to think differently about cybersecurity. We are losing on the cybersecurity battlefield, and continued investment in more advanced versions of the same architecture patterns will not change that.
Open-source software components now often comprise at least 80 percent of modern software applications, according to the best available estimate. They run the web servers that allow you to read this article, form the core of the mobile apps you use, and even help stealthier corners of government accomplish their missions—supporting U-2 Dragon Lady missions, for example.
The novel 2034 by James Stavridis and Elliot Ackerman perpetuates a fundamental misunderstanding of how technology should be employed and managed in future conflicts.
The continuing narrative is that we should purposely degrade our systems in a conflict with a peer competitor because of the possibility of a degraded spectrum, cyber attacks, space-based detection and jamming. But if we preemptively degrade our technology in a peer conflict, we will lose.
In the novel, after a conflict with the Chinese Navy in which the U.S. technical systems were incapacitated, U.S. ships preemptively disabled “any interface with a computer, a GPS or [any interface] that could conceivably be accessed online.”
There’s little doubt that thanks to the influx of new government regulations around privacy and data security, requirements have become the primary area of focus for many defense industrial base and General Services Administration contractors.
In the ever-growing, increasingly complex ecosystem of the Internet of Things (IoT), demand for connectivity is stronger than ever and only bound to intensify. Statista predicts that by 2025, there will be 38.6 billion devices connected to the internet, which will put even more pressure on organizations to monitor their infrastructures.
For system administrators, there are several obstacles to keeping pathways clear and the flow of data smooth. Here are a few of the most common roadblocks when it comes to IoT monitoring, as well as a few ways to overcome them.
Roadblock #1: Managing different interfaces for different devices
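A common way past this roadblock is a thin adapter layer that presents every vendor's device through one interface, so the monitoring loop never has to care which API it is talking to. The sketch below is illustrative only: `ThermostatAPI`, `SensorHub` and the adapter classes are hypothetical stand-ins for real vendor SDKs.

```python
class ThermostatAPI:
    """Hypothetical vendor A: exposes temperature in Fahrenheit."""
    def read_temp_f(self):
        return 72.5

class SensorHub:
    """Hypothetical vendor B: generic query call returning Celsius."""
    def query(self, metric):
        return {"temperature_c": 21.0}

class ThermostatAdapter:
    def __init__(self, dev):
        self.dev = dev
    def get_metrics(self):
        # Normalize Fahrenheit to the common Celsius schema.
        return {"temperature_c": (self.dev.read_temp_f() - 32) * 5 / 9}

class HubAdapter:
    def __init__(self, dev):
        self.dev = dev
    def get_metrics(self):
        return self.dev.query("temperature")

# The monitoring loop sees one interface regardless of vendor.
fleet = [ThermostatAdapter(ThermostatAPI()), HubAdapter(SensorHub())]
for device in fleet:
    print(device.get_metrics())
```

The payoff is that adding a new device type means writing one small adapter, not touching the monitoring pipeline itself.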
Ask someone in federal IT what zero trust means and you’re likely to hear that it’s about access control: never granting access to any system, app or network without first authenticating the user or device, even if the user is an insider. The phrase “Never trust; always verify” has become a common way to express the concept, and it appears first in the Defense Information Systems Agency’s (DISA’s) explanation of zero trust.
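The "never trust; always verify" rule can be reduced to a deny-by-default policy check that every request must pass, insider or not. This is a minimal sketch of that idea; the class, field names and checks are illustrative, not any agency's actual policy engine.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # identity verified, e.g. via MFA
    device_compliant: bool     # device posture checked (patched, managed)
    resource_sensitivity: str  # "low", "medium" or "high"

def authorize(req: AccessRequest) -> bool:
    """Deny by default; grant only when every check passes."""
    if not req.user_authenticated:
        return False
    if not req.device_compliant:
        return False
    # High-sensitivity resources could demand additional checks here.
    return True

# Every request is evaluated fresh; no session is trusted implicitly.
print(authorize(AccessRequest(True, True, "low")))    # True
print(authorize(AccessRequest(True, False, "low")))   # False
```

The essential design choice is that there is no code path that grants access without the checks running; being "inside the network" never appears as a condition.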
When it comes to nefarious deeds, the COVID-19 pandemic has been a gold mine for bad actors. In addition to wreaking havoc on individuals and healthcare organizations, the pandemic has made federal agencies prime targets. Case in point: a portion of the Department of Health and Human Services’ (HHS) website was recently compromised, in what appears to be part of an online COVID-19 disinformation campaign.
In a time of heightened cyber risk and limited human and fiscal resources, how can agencies protect their networks from malicious actors by taking a page from the COVID playbook? They can diligently practice good (cyber) hygiene.
In fact, there is a direct correlation between personal and cyber hygiene.
I take no joy in writing this article, but it is a desperate plea for improvement.
From 1995 to 2001, I worked for the Department of the Army as a contract specialist procuring advanced communications and electronics systems, equipment and services.
The Department of Defense (DOD) is dramatically increasing its digital security expectations for defense contractors and subcontractors. Having been on both sides of the partnership between government and the public sector, I am happy to see DOD is not only raising the bar on cybersecurity but also providing guidance on the implementation of cybersecurity best practices within the defense industrial base.
Recently, I had the privilege of attending a ceremony and presenting an award to a local high school Junior Reserve Officer Training Corps (JROTC) cadet on behalf of another organization for this cadet’s superior performance and leadership. Looking around the stage, I noticed representatives from multiple organizations all eager to recognize the efforts of these amazing young leaders with their respective groups’ awards.
The rising prominence of the Cyber branch in the U.S. military, particularly in the Army, raises the question: What will the Cyber branch be used for? Citing the Defense Department’s plan for the Cyber branch, as well as the Signal branch’s shifting roles in the realm of cyberspace, the responsibilities of both branches are becoming clear. It is evident that as time goes on, the Cyber branch will become focused mainly on the defense of the military domain and cyberspace.
As people around the world practice self-isolation in an effort to reduce exposure and spreading of the COVID-19 virus, the need to maintain a strong cybersecurity posture arguably has never been higher. Millions of people have shifted their daily lives to an environment relying on telework, distance learning, Internet-enabled social engagement, streaming news and entertainment and other activities.
This “new normal” is facilitated by the robust capabilities of the Internet. Yet it presents a significant cyber risk. During the COVID-19 crisis, we’ve seen bad actors stepping up their game with increased incidents of phishing, disinformation, watering hole attacks and other criminal activity.
By now, federal agencies universally recognize that data is an asset with seemingly limitless value as they seek to reduce costs, boost productivity, expand capabilities and find better ways to support their mission and serve the public.
A mushroom cloud explosion in the New Mexico desert on July 16, 1945, forever changed the nature of warfare. Science had given birth to weapons so powerful they could end humanity. To survive, the United States had to develop new strategies and policies that responsibly limited nuclear weapon proliferation and use. Warfare is again changing as modern militaries integrate autonomous and semiautonomous weapon systems into their arsenals. The United States must act swiftly to maximize the potential of these new technologies or risk losing its dominance.
By 2030, artificial intelligence (AI) is projected to add $13 trillion to the global economic output. In government, AI applications promise to strengthen the federal workforce, safeguard our nation against bad actors, serve citizens more effectively and provide our warfighters the advantage on the battlefield. But this success will require collaboration and advancements from government and industry.
It’s easy to forget that in the midst of a catastrophe, physical safety isn’t the only thing that’s important. As technology’s role in disaster response and relief becomes more and more prevalent, cybersecurity becomes an essential part of the process. Here’s why.
Few people are more vulnerable than those impacted by a crisis. Whether a man-made attack or a natural disaster, the widespread destruction created by a large-scale emergency can leave countless individuals both destitute and in need of medical attention. Protecting these men, women and children requires more than a coordinated emergency response.
Last year was a banner year for cyber fraud. In just the first six months of 2019, more than 3,800 breaches exposed 4.1 billion records, with 3.2 billion of those records exposed by just eight breaches. The scale of last year’s data breaches underscores the fact that identity has become the currency of the digital world and data is the fuel that powers the digital economy. What’s also clear looking back on 2019 is that digital identities are continually being compromised on multiple levels.
There are certainly similarities between network resilience and cyber resilience. The foundation for both is the ability to maintain business or mission capabilities during an event, such as a backhoe cutting your fiber cables or a nation-state actively exploiting your network. But there are also significant differences.
Supply chain security has been of concern to government leaders for decades, but with attacks now originating in industrial control systems (ICS) from supply chain vulnerabilities and with an increasing reliance on the Internet of Things (IoT), Congress is stepping up its involvement. For example, legislators have promised that more stringent standards will soon be enforced.
Government agencies face similar challenges when it comes to understanding—and gaining intelligence from—foreign language content. They need to process, manage and gain insight from large volumes of content locked away in different formats, often across multiple languages. And they need to do all of this as quickly as possible. It’s no mean feat when you consider the mindboggling amounts of content being generated: 90% of the world’s content was created over the past two years alone.
When it comes to artificial intelligence (AI), the Department of Defense (DOD) has put a firm stake in the ground. The department’s AI strategy clearly calls for the DOD “to accelerate the adoption of AI and the creation of a force fit for our time.”
Anyone who has worked in the Pentagon or on almost any military installation can attest to wireless connectivity problems. Whether dealing with a dearth of cellular service, inadequate Wi-Fi or security blockers, service members and civilians have felt the frustration of not being able to access information or communicate effectively.
In every recent discussion I have had with government and defense leaders around IT modernization, the conversation quickly leads to cloud and its role in enabling agile ways of working for government. Many agencies have already developed cloud migration targets and are looking at how they can accelerate cloud adoption.
The U.S. Army is leading the charge on the military’s multidomain battle concept—but will federal IT networks enable this initiative, or inhibit it?
The network is critical to the Army’s vision of combining the defense domains of land, air, sea, space and cyberspace to protect and defend against adversaries on all fronts. As Gen. Stephen Townsend, USA, remarked to AFCEA conference attendees earlier this year, the Army is readying for a future reliant on telemedicine, 3D printing and other technologies that will prove integral to multidomain operations. “The network needs to enable all that,” said Townsend.
The Chief of Naval Operations (CNO) Adm. John Richardson’s repeated call to “pick up the pace” of developing and implementing breakthrough technologies for our warfighters has gone, in my opinion, largely unheeded.
This is not the result of a lack of innovative solutions. A myriad of research and development programs exists to support the development of new technologies or to adapt existing commercial technologies to defense applications. Rather, it’s the result of an arcane acquisition process that is burdensome, expensive and lacking vision. Acquisition reform is where we need to pick up the pace!
When the Department of Defense (DOD) launched its Everything Over IP initiative nearly 10 years ago, the focus was to bring traditional telecommunications technology—phone calls, streaming video and even faxes—to the digital world.
At that time, unified communications (UC), especially in the government workplace, was a relatively new concept. Remember, this was a time when voice over Internet Protocol (VoIP) phones were still seen as cutting edge. Now, though, UC has become not just a business tool, but a strategic offering that can connect employees in disparate locations, including the frontlines.
More than a year has passed since the Modernizing Government Technology (MGT) Act was signed into law, cementing the establishment of a capital fund for agencies to support their special IT projects. The MGT Act prompted defense and intelligence agencies to accelerate the replacement of legacy systems with innovative and automated technologies, especially as they explore new ways to mitigate security risks like those experienced all too often by their private sector counterparts.
The military continues to focus its efforts on developing the most sophisticated technologies and capabilities needed to sustain tactical advantage and achieve mission objectives. But the most critical component to success on the battlefield continues to lie with the warfighter.
Open source containers, which isolate applications from the host system, appear to be gaining traction with IT professionals in the U.S. defense community. But for all their benefits, security remains a notable Achilles’ heel for a couple of reasons.
First, containers are still fairly nascent, and many administrators are not yet completely familiar with their capabilities. It’s difficult to secure something you don’t completely understand. Second, containers are designed in a way that hampers visibility. This lack of visibility can make securing containers extremely taxing.
Layers upon layers
The U.S. defense industrial supply chain is vast, complex and vulnerable. Organic components, large-scale integrators, myriad commercial service providers, and tens of thousands of private companies sustain the Defense Department. According to the SANS Institute, the percentage of cyber breaches that originate in the supply chain could be as high as 80 percent.
Implementing a new system can be an exciting time, but the nagging questions and doubts about the fate of data you’ve literally spent years collecting, organizing and storing can dampen this excitement.
This legacy data often comes from a variety of sources in different formats maintained by a succession of people. Somehow, all the data must converge in a uniform fashion, resulting in its utility in the new solution. Yes, it is hard work and no, it is not quick. Fortunately, this scrubbing and normalization does not have to be a chaotic process replete with multiple failures and rework.
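The convergence step described above usually amounts to mapping each source's quirks onto one target schema before anything is loaded into the new system. This is a minimal sketch of that scrubbing-and-normalization idea under assumed record shapes; the field names and sample records are purely illustrative.

```python
# Legacy records arrive in inconsistent shapes from different sources.
RAW_RECORDS = [
    {"Name": "  Ada Lovelace ", "phone": "555-0100"},      # source A
    {"full_name": "Grace Hopper", "PHONE": "(555) 0101"},  # source B
]

def normalize(record):
    """Map one raw record onto the target schema: lowercase the keys,
    reconcile alternate field names, and scrub the values."""
    lowered = {k.lower(): v for k, v in record.items()}
    name = (lowered.get("name") or lowered.get("full_name", "")).strip()
    digits = "".join(ch for ch in lowered.get("phone", "") if ch.isdigit())
    return {"name": name, "phone": digits}

clean = [normalize(r) for r in RAW_RECORDS]
print(clean)
# Both records now share one schema, whatever their source looked like.
```

Writing the mapping as one pure function per source format is what keeps the process from becoming chaotic: a record that fails can be fixed and rerun without reworking everything around it.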
It comes as no surprise that U.S. adversaries continue to target and successfully exploit the security weaknesses of small-business contractors. A successful intrusion campaign can drastically reduce or even eliminate research, development, test and evaluation (RDT&E) costs for a foreign adversary. Digital espionage also levels the playing field for nation-states that do not have the resources of their more sophisticated competitors. To bypass the robust security controls that the government and large contractors have in place, malicious actors have put significant manpower into compromising small- and medium-sized businesses (SMBs).
Artificial intelligence can be surprisingly fragile. This is especially true in cybersecurity, where AI is touted as the solution to our chronic staffing shortage.
It seems logical. Cybersecurity is awash in data, as our sensors pump facts into our data lakes at staggering rates, while wily adversaries have learned how to hide in plain sight. We have to filter the signal from all that noise. Security has the trifecta of too few people, too much data and a need to find things in that vast data lake. This sounds ideal for AI.
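At its simplest, filtering the signal from that noise is a statistical question: flag the observations that sit far outside the baseline. The sketch below is a deliberately crude illustration of the idea using standard-deviation scoring; the threshold and the sample telemetry are assumptions, not a production detection rule.

```python
import statistics

def flag_anomalies(samples, threshold=2.5):
    """Flag values more than `threshold` standard deviations from the
    mean -- a crude way to pull signal out of a noisy telemetry stream."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # perfectly flat baseline: nothing stands out
    return [x for x in samples if abs(x - mean) / stdev > threshold]

# Login failures per minute: mostly routine noise, one burst worth a look.
counts = [3, 2, 4, 3, 2, 3, 4, 2, 3, 250]
print(flag_anomalies(counts))  # [250]
```

Real systems replace this with far more robust models, but the shape of the problem is the same: a cheap automated pass over the data lake so the few humans only look at what survives the filter.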
Every time federal information technology professionals think they’ve gotten in front of the cybersecurity risks posed by the Internet of Things (IoT), a new and unexpected challenge rears its head. Take, for instance, the heat maps used by GPS-enabled fitness tracking applications, which, the U.S. Department of Defense (DOD) warned, revealed the locations of military bases, or the infamous Mirai botnet attack of 2016.
Historically, the U.S. Department of Defense (DOD) has been the driver of technological innovation, inventing remarkable capabilities to empower warfighter mission effectiveness and improve warfighter safety. Yet over the past 25 years, a transformational shift has taken place in several key technology sectors, and technology leadership in these sectors is no longer being driven by the military, but rather by the private sector.
The need for next-generation networking solutions is intensifying, and for good reason. Modern software-defined networking (SDN) solutions offer better automation and remediation, and stronger response mechanisms in the event of a breach, than their traditional counterparts.
But federal administrators should balance their desire for SDN solutions with the realities of government. While there are calls for ingenuity, agility, flexibility, simplicity and better security, implementation of these new technologies must take place within constraints posed by methodical procurement practices, meticulous security documentation, sometimes archaic network policies and more.
Government IT professionals have clear concerns about the threats posed by careless and untrained insiders, foreign governments, criminal hackers and others. For the government, cyber attacks are a fact of life, and we must deal with them as a common occurrence.
Today’s battlefield is highly technical and dynamic. We are not only fighting people and weapons but also defending and attacking information at light speed. For mission success, the American warrior in the field and commanders up the chain need the support of highly adaptive systems that can quickly and securely establish reliable communications and deliver real-time intelligence anytime and anywhere.
Recently, Secretary of State Michael Pompeo, in response to Executive Order 13800, released recommendations to the President of the United States on the subject of cybersecurity. Included was an emphasis both on domestic policy and international cooperation to achieve several key diplomatic, military and economic goals. The specific focus on international cooperation is a big step in the right direction. The United States has a chance to demonstrate international leadership on a complex issue, while setting the groundwork necessary to protect national interests.
The U.S. Office of Management and Budget released a report this spring showing the abysmal state of cybersecurity in the federal government. Three-quarters of the agencies assessed were found to be “at risk” or “at high risk,” highlighting the need for a cyber overhaul. The report also noted that many agencies lacked “standardized cybersecurity processes and IT capabilities,” which affected their ability to “gain visibility and effectively combat threats.”
Never before has there been such an intense focus on data security and privacy. With data breaches increasing exponentially and the European Union’s recent implementation of the General Data Protection Regulation (GDPR), data security has been at the forefront of news stories over the past several months, with both businesses and consumers suddenly paying very close attention. With this increased attention has come an understanding that data continues to exist even when it is no longer needed or used. Due to this newfound understanding and GDPR’s “Right to be Forgotten,” the eradication of data has new urgency and has become critical to a successful data security program.
Fraud, waste, and abuse (FWA) remains a major challenge to the federal government. From 2012 to 2016, the 73 federal inspectors general (IGs), who are on the frontline of fighting FWA, identified $173 billion in potential savings and reported $88 billion in investigative recoveries and 36,000 successful prosecutions and civil actions.
In February 2018, the Department of Defense (DOD) Defense Digital Service (DDS) relaunched Code.mil to expand the use of open source code. In short, Code.mil aims to enable the migration of some of the department’s custom-developed code into a central repository for other agency developers to reduce work redundancy and save costs in software development. This move to open source makes sense considering that much of the innovation and technological advancements we are seeing are happening in the open source space.
It has become increasingly evident that artificial intelligence (AI) and machine learning (ML) are poised to impact government technology. Just last year, the General Services Administration launched programs to enable federal adoption of AI, and the White House encouraged federal agencies to explore all of the possibilities AI could offer. The benefits are substantial, but before the federal government can fully take advantage of advancements like AI, federal agencies must prepare their IT infrastructure to securely handle the additional bandwidth.
Wary that the Internet of Things (IoT) could be used to introduce unwanted and unchecked security risks into government networks, senators last year introduced legislation that places minimum security standards around IoT devices sold to and purchased by government agencies. The IoT Cybersecurity Improvement Act of 2017 specifically cites the need for regulation of “federal procurement of connected devices,” including edge computing devices, which are part of the IoT ecosystem.
Traffic on optical transport networks is growing exponentially, leaving cyber intelligence agencies in charge of monitoring these networks with the unenviable task of trying to sift through ever-increasing amounts of data to search for cyber threats. However, new technologies capable of filtering exploding volumes of real-time traffic are being embedded within emerging network monitoring applications supporting big data and analytics capabilities.
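The embedded filtering described above boils down to discarding the bulk of routine traffic in-stream so that only flows worth deeper inspection ever reach the analytics back end. This is a toy sketch of that idea; the packet fields, watched ports and size rule are all hypothetical stand-ins for real filter criteria.

```python
def filter_stream(packets, watch_ports=frozenset({22, 3389})):
    """Yield only packets worth deeper inspection, so downstream
    analytics see a small fraction of the raw traffic volume."""
    for pkt in packets:
        if pkt["dst_port"] in watch_ports or pkt["bytes"] > 1_000_000:
            yield pkt

traffic = [
    {"dst_port": 443, "bytes": 1500},      # routine web traffic: drop
    {"dst_port": 22, "bytes": 900},        # remote-shell traffic: keep
    {"dst_port": 80, "bytes": 5_000_000},  # unusually large flow: keep
]
print(list(filter_stream(traffic)))
```

Because the filter is a generator, it processes one packet at a time rather than buffering the stream, which is what makes this pattern workable at line rate.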