Sponsored Content: Securing DoD Cloud Platforms by Breaking Silos

June 1, 2021
By Henry S. Kenyon


Cloud-based service models offer challenges, benefits to Pentagon agencies.


As the Department of Defense migrates more mission-critical systems and software to cloud environments, it must also consider innovative ways of securing this new environment from potential cyber attacks.

It is up to DoD organizations like the Defense Information Systems Agency (DISA) to work out the details of such efforts and ensure the military’s considerable inventory of legacy equipment and systems can continue to interoperate smoothly with the latest technologies. But integrating different technologies is never an easy process.

One major issue DISA and the DoD face when it comes to cloud migration and security is backhauling traffic through legacy on-premises security stacks, says Beau Hutto, vice president of Federal at Netskope. Even for stacks as robust as the armed services’ Joint Regional Security Stacks (JRSS), he says backhauling can be painful and costly enough to compromise both mission responsiveness and budgets.

Legacy tech and cloud migration

Another major issue is siloed technologies operating in systems like JRSS. Hutto notes that these systems all use proprietary languages, and there are too many of them for personnel to become expert in each tech silo. This creates a reliance on tools such as Security Information and Event Management (SIEM) to help connect these disparate pieces of software and hardware. He believes the trend is to move away from siloed systems toward more open, interoperable applications.

As DISA manages the DoD’s migration to cloud-based services, the end goal for programs like JRSS is an environment resembling a private cloud but with the ability to deliver security controls closer to where the data actually lives. That data has been moving off-premises and closer to warfighters, Hutto says, adding that in the commercial sector more than 80 percent of data is now created offsite.

But while the world, including the federal government, continues to move to offsite solutions for cloud use and cybersecurity, most of the DoD’s investment still goes to on-premises security rather than to the network edge and the other places where data is rapidly expanding today, Hutto explains.

Operationally, the DoD uses a more closed service model compared to the commercial world, says Jason Ohs, Netskope’s director of Federal Systems Engineering. The military is adopting more cloud-based services as it transitions and more data is going offsite into the cloud, but it still isn’t at the level of the commercial world, Ohs says. He adds there are still many networks on the DoD’s Non-classified Internet Protocol Router Network “that are still running file servers and local SharePoint instances housing a huge number of documents locally.”

Security is the other major consideration and consists of two components, Ohs says. The first is that as data services migrate to the cloud, there must be a protective platform to secure them. The second is that those defensive systems must be sophisticated enough to protect the data by examining and understanding its context: who should have access and how it is used. Ohs describes “data” as an umbrella term for sensitive information, including cloud services and the assorted pieces of information, such as documents, residing on them. Data governance is the term he uses for the process of managing all these different parts and providing the proper access through a dynamic least-privilege access model.

Another issue with managing legacy systems is that over the last decade, the language of the internet has changed, says Krishna Narayanaswamy, Netskope’s chief technology officer. Older systems are geared around onsite, hard disk-based data storage and the semantics related to it, but contemporary government and commercial enterprises are cloud-based and commonly use application programming interfaces that exchange JavaScript Object Notation (JSON) objects. That makes security platforms that can understand this newer language between clients and servers an important step in protecting data, he explains.
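The difference Narayanaswamy describes can be illustrated with a small sketch: a platform that understands JSON-encoded API traffic can inspect the structure of a request body rather than treating it as an opaque file. This is a minimal, hypothetical example (the field names and sensitive-key list are illustrative, not any vendor's actual schema):

```python
import json

# Illustrative list of key names a policy might flag as sensitive.
SENSITIVE_KEYS = {"ssn", "password", "clearance_level"}

def find_sensitive_fields(payload: str) -> list:
    """Return dotted paths of sensitive keys found in a JSON request body."""
    found = []

    def walk(node, path):
        if isinstance(node, dict):
            for key, value in node.items():
                child = f"{path}.{key}" if path else key
                if key.lower() in SENSITIVE_KEYS:
                    found.append(child)
                walk(value, child)
        elif isinstance(node, list):
            for i, item in enumerate(node):
                walk(item, f"{path}[{i}]")

    walk(json.loads(payload), "")
    return found

request_body = '{"user": {"name": "jdoe", "ssn": "000-00-0000"}, "notes": []}'
print(find_sensitive_fields(request_body))  # -> ['user.ssn']
```

A disk-oriented tool scanning the same traffic would see only a byte stream; parsing the JSON is what makes the context (which field, under which object) available to policy.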

Security considerations

In the last several years, there’s been a major shift from network protection to data protection, especially in capabilities such as backup to cloud. While many civilian and DoD agencies aren’t there yet, Hutto notes they are rapidly transitioning to these kinds of systems. “Continuing to be the greatest superpower requires agility. Cyber may not be the mission, but it absolutely enables it,” he says.

As the DoD moves to a shared security model, either through DISA or other agencies, it will need to migrate applications and data into cloud services. This process requires working with large platforms like Microsoft Azure or service providers like Amazon Web Services, explains Ohs. Cybersecurity is folded in with those providers.

Paradoxically, this new approach is slightly less secure. In pre-cloud days, Ohs notes it was much easier to lock down a network by shutting off external ports, preventing adversaries from accessing the network or exfiltrating data. Also, with older systems, services and agencies could still operate in isolation to meet their mission requirements.

But in a data sharing environment, this is no longer possible because if forces lose access to incoming data, it degrades their mission, says Ohs. Because cutting off a network isn’t an option anymore, the services will have to figure out how to operate with compromised or degraded systems. “How do we mitigate some of the risks that are in this shared responsibility model?” he asks.

Once the services and agencies understand the limitations of these new architectures, they can apply additional authentication and encryption algorithms and methods to manage data residing in a cloud service model and a shared security model. One way to add to a layered network defense is for organizations to have a quick reaction plan to manage incidents when they occur. But besides using a software-as-a-service (SaaS) model to build trust in network security, more must be done to understand how cloud services will work when compromised.

“It’s only going to be a matter of time before some of those SaaS services do get compromised, and then we need to figure out how to remediate,” Ohs says.

This is where methods such as data governance become important. “If we don’t understand how data is being put into the cloud, you really have limitations to being able to understand how that data could be exported, moved out by an adversary and then ultimately used against us,” Ohs explains.

Zero trust issues

One way to provide enhanced security for cloud services is by adopting a zero trust security model, which assumes that the network is or will be compromised and seeks to defend data at rest and in transit. But while zero trust is a good concept, it’s often difficult to implement, Ohs says.

Part of the problem is setting access controls: processes shouldn’t have more access to data than the user who activated them. Another challenge is applying zero trust to a software application layer or to certain access models, Ohs says. Controlling access and limiting users’ lateral movement while giving them only the data they need are fundamental challenges in designing a zero trust system, he explains.

There is also a general shortcoming in what the security industry calls zero trust, says Narayanaswamy. He notes that most modern applications have long-lived sessions, which means zero trust enforcement must extend to every transaction on the network. A system that conducted security checks only when users log into a network or application, rather than monitoring every transaction while the session is active, would not be very secure.
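The per-transaction checking Narayanaswamy describes can be sketched as a policy gate evaluated on every request rather than once at login. This is a hypothetical illustration (the `Session` fields and role strings are invented for the example, not a real product API):

```python
from dataclasses import dataclass

@dataclass
class Session:
    user: str
    roles: set
    device_compliant: bool  # re-assessed continuously, not cached at login

def authorize(session: Session, resource: str, required_role: str) -> bool:
    """Zero-trust style check applied to each transaction in a session."""
    if not session.device_compliant:       # posture can change mid-session
        return False
    return required_role in session.roles  # least-privilege role check

s = Session(user="analyst1", roles={"read:logs"}, device_compliant=True)
print(authorize(s, "/logs/2021-06", "read:logs"))  # True
s.device_compliant = False                         # posture drifts mid-session
print(authorize(s, "/logs/2021-06", "read:logs"))  # False
```

The point of the sketch is the second call: a login-time check would still return True, because the session was valid when it began; re-evaluating on every transaction catches the change.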

As the DoD and DISA transition to using more cloud software, platforms and infrastructure as a service, and to hosting critical mission applications in the cloud, data security controls need to move closer to where the data resides, Hutto says. With more processing and services of all kinds happening off premises, he adds, there is a need for ubiquitous security controls in a single platform to protect mission-critical data.

Because the old practice of working with individual systems in their own silos doesn’t work in a cloud-based environment, a single security platform that protects data and works with current infrastructure offers several advantages. A key benefit is allowing agencies to transition away from costly onsite, siloed systems and software, Hutto says.

What distinguishes Netskope from other firms is that its platform is a single system designed for data security at the application level as opposed to several different technologies welded together, Hutto explains. Access and privilege controls flow through a single cloud-based security stack, which allows clients to control all communication on their networks. This is done through a single tenant, single platform that uses one language across the entire system.

With data moving to the cloud from onsite data centers, it’s important to protect the sensitive data used in cloud applications, says Narayanaswamy. This includes defending against threats to data in motion and threats created by and directed against cloud applications. This is where a platform like Netskope’s, which is designed for data access governance, can be helpful in keeping an organization’s data from being compromised, he says.

Leveraging automation

Another aspect of an effective cloud-based security system is automation. Narayanaswamy notes that it is key for tasks such as data loss prevention and threat detection, removing the need for human intervention in routine work.

To this end, Netskope’s platform extracts contextual metadata from the network and from data in motion and at rest. This metadata is made available to customers so they can feed automated data collection and monitoring into other security management tools to collate information, Narayanaswamy says.
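One common way such contextual metadata is shared with other tools is as structured event records, for example JSON lines that a SIEM can ingest. This is a generic sketch of that pattern; the field names are hypothetical, not Netskope's actual export format:

```python
import json

# Hypothetical contextual-metadata event, of the kind a cloud security
# platform might extract from traffic and hand to downstream tools.
event = {
    "timestamp": "2021-06-01T12:00:00Z",
    "user": "jdoe",
    "app": "SharePoint Online",
    "activity": "download",
    "file_type": "docx",
    "risk_score": 72,  # illustrative field names throughout
}

def to_siem_record(evt: dict) -> str:
    """Serialize one event as a deterministic JSON line for collation."""
    return json.dumps(evt, sort_keys=True)

record = to_siem_record(event)
print(record)
```

Emitting one self-describing record per event is what lets a SIEM or other management tool correlate activity across many sources without custom parsing for each.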

Netskope is one of the first companies in the cybersecurity industry to develop a free threat intelligence exchange platform, notes Narayanaswamy. The platform permits threat indicators to be exchanged and it recognizes government standard threat exchange formats.

Artificial intelligence (AI) and machine learning techniques are also key to Netskope’s use of automation for security monitoring and identification of sensitive data. One important aspect is being able to train software to look for sensitive information that might be exfiltrated or targeted by attackers without giving away what is being searched for, explains Ohs. For example, training an AI data protection model on a closed network allows searching the unclassified side or a cloud service for compromised sensitive or classified data. This technique does not use keywords of known sensitive subjects, which could potentially tip attackers off about what is being searched for.

To avoid alerting hackers, Netskope built a hunting mechanism to look for sensitive data via a one-way hash of devices and data “fingerprints.” This is then used to train the AI software to look for the tagged data, says Ohs, adding that this is a next-generation way to hunt for leaked data and build in data governance.
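The core idea of one-way fingerprinting can be shown in miniature: sensitive material is reduced to hashes on the closed side, and detectors carry only the hashes, never the plaintext keywords. This is a simplified sketch of the general technique (exact matching on normalized text), not Netskope's implementation, and the sample strings are invented:

```python
import hashlib

def fingerprint(text: str) -> str:
    """Normalize whitespace and case, then hash; the hash reveals nothing."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Built on the closed network from known-sensitive material (invented example).
sensitive_fingerprints = {fingerprint("Project Alpha launch codes")}

def is_leaked(observed_text: str) -> bool:
    """Check data seen on the unclassified side against the fingerprint set."""
    return fingerprint(observed_text) in sensitive_fingerprints

print(is_leaked("project alpha  LAUNCH codes"))  # True, matches after normalization
print(is_leaked("weekly status report"))         # False
```

Because SHA-256 is one-way, an attacker who captures the fingerprint set learns nothing about which subjects are being hunted; production systems extend this idea with chunked or fuzzy fingerprints so partial copies also match.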

For more information, go to www.netskope.com/solutions/government.
