More than a year has passed since the Modernizing Government Technology (MGT) Act was signed into law, establishing a capital fund agencies can draw on for IT modernization projects. The MGT Act has prompted defense and intelligence agencies to accelerate the replacement of legacy systems with innovative, automated technologies, especially as they explore new ways to mitigate the kinds of security risks their private-sector counterparts experience all too often.
The military continues to focus its efforts on developing the sophisticated technologies and capabilities needed to sustain tactical advantage and achieve mission objectives. But the most critical component of success on the battlefield remains the warfighter.
Open source containers, which isolate applications from the host system, appear to be gaining traction with IT professionals in the U.S. defense community. But for all their benefits, security remains a notable Achilles’ heel for a couple of reasons.
First, containers are still fairly nascent, and many administrators are not yet completely familiar with their capabilities. It’s difficult to secure something you don’t completely understand. Second, containers are designed in a way that hampers visibility. This lack of visibility can make securing containers extremely taxing.
Layers upon layers
The U.S. defense industrial supply chain is vast, complex and vulnerable. Organic components, large-scale integrators, myriad commercial service providers, and tens of thousands of private companies sustain the Defense Department. According to the SANS Institute, the percentage of cyber breaches that originate in the supply chain could be as high as 80 percent.
Implementing a new system can be an exciting time, but nagging questions and doubts about the fate of the data you've spent years collecting, organizing and storing can dampen that excitement.
This legacy data often comes from a variety of sources in different formats, maintained over time by a succession of people. Somehow, all of it must be converged into a uniform format so it remains usable in the new solution. Yes, it is hard work, and no, it is not quick. Fortunately, this scrubbing and normalization does not have to be a chaotic process replete with failures and rework.
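The convergence step can be sketched in a few lines. The field names and date formats below are purely illustrative; in practice they come from profiling each legacy source. The idea is simply to map every source's quirks onto one target schema before loading:

```python
from datetime import datetime

# Hypothetical date formats observed across legacy source systems
DATE_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d-%b-%Y"]

def parse_date(raw):
    """Try each known legacy format until one parses."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

def normalize_record(record):
    """Map a legacy record onto one uniform schema.

    The alias table is hypothetical -- real mappings come from
    profiling each source system.
    """
    aliases = {
        "id": ["id", "record_id", "ID"],
        "created": ["created", "create_dt", "date_entered"],
    }
    out = {}
    for target, names in aliases.items():
        for name in names:
            if name in record:
                out[target] = record[name]
                break
    out["created"] = parse_date(out["created"])
    return out

# Records from two different legacy sources converge on one schema
print(normalize_record({"record_id": "42", "create_dt": "03/15/2009"}))
# → {'id': '42', 'created': '2009-03-15'}
print(normalize_record({"ID": "43", "date_entered": "2011-06-01"}))
# → {'id': '43', 'created': '2011-06-01'}
```

The point of structuring it this way is that every new quirk discovered during scrubbing becomes one more entry in a table, rather than another round of ad hoc rework.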
It comes as no surprise that U.S. adversaries continue to target and successfully exploit the security weaknesses of small-business contractors. A successful intrusion campaign can drastically reduce or even eliminate research, development, test and evaluation (RDT&E) costs for a foreign adversary. Digital espionage also levels the playing field for nation-states that do not have the resources of their more sophisticated competitors. To bypass the robust security controls that the government and large contractors have in place, malicious actors have put significant manpower into compromising small- and medium-sized businesses (SMBs).
Artificial intelligence can be surprisingly fragile. This is especially true in cybersecurity, where AI is touted as the solution to our chronic staffing shortage.
It seems logical. Cybersecurity is awash in data, as our sensors pump facts into our data lakes at staggering rates, while wily adversaries have learned how to hide in plain sight. We have to filter the signal from all that noise. Security has the trifecta of too few people, too much data and a need to find things in that vast data lake. This sounds ideal for AI.
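To make the "signal from noise" problem concrete, here is a minimal sketch using a simple z-score threshold over hypothetical hourly event counts. The data, threshold and metric are all illustrative; real security analytics use far richer models, which is exactly where AI is being pitched:

```python
import statistics

def flag_anomalies(values, z_threshold=3.0):
    """Return indices of values that deviate from the mean by
    more than z_threshold standard deviations."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]

# Hypothetical hourly counts of failed logins; one hour spikes
counts = [12, 9, 11, 10, 13, 8, 10, 11, 240, 12, 9, 10]
print(flag_anomalies(counts))  # → [8]
```

Even this toy filter illustrates the core tension: a static threshold catches the obvious spike, but an adversary "hiding in plain sight" generates values that never cross it, which is why the hope is that learned models can do better than hand-set rules.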
Every time federal information technology professionals think they’ve gotten ahead of the cybersecurity risks posed by the Internet of Things (IoT), a new and unexpected challenge rears its head. Take, for instance, the heat maps generated by GPS-enabled fitness tracking applications, which, the U.S. Department of Defense (DOD) warned, revealed the locations of military bases, or the infamous Mirai botnet attack of 2016.
Historically, the U.S. Department of Defense (DOD) has been the driver of technological innovation, inventing remarkable capabilities to empower warfighter mission effectiveness and improve warfighter safety. Yet over the past 25 years, a transformational shift has taken place in several key technology sectors, and technology leadership in these sectors is no longer being driven by the military, but rather by the private sector.
The need for next-generation networking solutions is intensifying, and for good reason. Modern software-defined networking (SDN) solutions offer better automation and remediation, and stronger response mechanisms in the event of a breach, than traditional network architectures.
But federal administrators should balance their desire for SDN solutions with the realities of government. While there are calls for ingenuity, agility, flexibility, simplicity and better security, implementation of these new technologies must take place within constraints posed by methodical procurement practices, meticulous security documentation, sometimes archaic network policies and more.
Government IT professionals have clear concerns about the threats posed by careless and untrained insiders, foreign governments, criminal hackers and others. For the government, cyber attacks are a fact of life and must be dealt with as a common occurrence.
Today’s battlefield is highly technical and dynamic. We are not only fighting people and weapons but also defending and attacking information at light speed. For mission success, the American warrior in the field and commanders up the chain need the support of highly adaptive systems that can quickly and securely establish reliable communications and deliver real-time intelligence anytime and anywhere.
Recently, Secretary of State Michael Pompeo, in response to Executive Order 13800, released recommendations to the President of the United States on the subject of cybersecurity. Included was an emphasis both on domestic policy and international cooperation to achieve several key diplomatic, military and economic goals. The specific focus on international cooperation is a big step in the right direction. The United States has a chance to demonstrate international leadership on a complex issue, while setting the groundwork necessary to protect national interests.
The U.S. Office of Management and Budget released a report this spring showing the abysmal state of cybersecurity in the federal government. Three-quarters of the agencies assessed were found to be “at risk” or “at high risk,” highlighting the need for a cyber overhaul. The report also noted that many agencies lacked “standardized cybersecurity processes and IT capabilities,” which affected their ability to “gain visibility and effectively combat threats.”
Never before has there been such an intense focus on data security and privacy. With data breaches increasing exponentially and the European Union’s recent implementation of the General Data Protection Regulation (GDPR), data security has been at the forefront of news stories over the past several months, with both businesses and consumers suddenly paying very close attention. With this increased attention has come an understanding that data continues to exist even when it is no longer needed or used. Due to this newfound understanding and GDPR’s “Right to be Forgotten,” the eradication of data has new urgency and has become critical to a successful data security program.
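A naive sketch of file-level eradication looks like the following: overwrite the contents, flush to disk, then delete. Note the heavy caveat in the comments; this is illustrative only, since on SSDs and copy-on-write or journaling filesystems, overwriting in place does not guarantee the old blocks are destroyed, which is part of why eradication belongs in a deliberate data security program rather than an afterthought:

```python
import os
import tempfile

def overwrite_and_delete(path, passes=1):
    """Overwrite a file's contents with zeros, then remove it.

    Illustrative only: on SSDs and on copy-on-write or journaling
    filesystems, overwriting in place does not guarantee that the
    underlying blocks are actually destroyed.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)

# Demo on a throwaway temp file
fd, tmp = tempfile.mkstemp()
os.write(fd, b"personal data subject to erasure")
os.close(fd)
overwrite_and_delete(tmp)
print(os.path.exists(tmp))  # → False
```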
Fraud, waste, and abuse (FWA) remains a major challenge to the federal government. From 2012 to 2016, the 73 federal inspectors general (IGs), who are on the frontline of fighting FWA, identified $173 billion in potential savings and reported $88 billion in investigative recoveries and 36,000 successful prosecutions and civil actions.
In February 2018, the Department of Defense (DOD) Defense Digital Service (DDS) relaunched Code.mil to expand the use of open source code. In short, Code.mil aims to enable the migration of some of the department’s custom-developed code into a central repository for other agency developers to reduce work redundancy and save costs in software development. This move to open source makes sense considering that much of the innovation and technological advancements we are seeing are happening in the open source space.
It has become increasingly evident that artificial intelligence (AI) and machine learning (ML) are poised to impact government technology. Just last year, the General Services Administration launched programs to enable federal adoption of AI, and the White House encouraged federal agencies to explore all of the possibilities AI could offer. The benefits are substantial, but before the federal government can fully take advantage of advancements like AI, federal agencies must prepare their IT infrastructure to securely handle the additional bandwidth.
Wary that the Internet of Things (IoT) could be used to introduce unwanted and unchecked security risks into government networks, senators last year introduced legislation that would set minimum security standards for IoT devices sold to and purchased by government agencies. The IoT Cybersecurity Improvement Act of 2017 specifically cites the need for regulation of “federal procurement of connected devices,” including edge computing devices, which are part of the IoT ecosystem.
Traffic on optical transport networks is growing exponentially, leaving cyber intelligence agencies in charge of monitoring these networks with the unenviable task of trying to sift through ever-increasing amounts of data to search for cyber threats. However, new technologies capable of filtering exploding volumes of real-time traffic are being embedded within emerging network monitoring applications supporting big data and analytics capabilities.