
Enabling Operations With Cross-Domain Transfer and Access

Sponsored Content

The need to collect large volumes of data in real time to support military and humanitarian operations in rapidly shifting environments is pushing analysts, service members and their organizations to find faster and more efficient ways to access data and to share it securely across classification levels. It also calls for pulling open-source intelligence data from social media, traditional media streams and public websites.

But these differing needs create tension with security requirements. “You can’t connect to all those sources because they’re just too high-threat,” explained David Flanagan, vice president for secure consulting at Garrison Technology. He added that this access can be achieved with hardware-based security tools, such as those made by Garrison, which let organizations broaden the variety of systems they connect to and the information they can reach. Unlike software-based security tools, hardware-based tools are resistant to compromise, allowing users to safely access data from nonsecure sources.

Historically, access to and cross-domain transfer of data across clearance levels were complicated and fraught with security concerns. Flanagan noted that the transfer element itself introduced risks around timeliness and integrity, as well as the danger of malicious code in the payload making it past security filters.

Rather than accepting the restrictions associated with cross-domain transfer, he said, new hardware-based access capabilities allow organizations to work differently. This shift opens up new data sources and lets users work within remote environments to achieve the required business outcome without taking on the burdens imposed by security technologies that support data transfer.

Beyond Sneakernet

Once organizations and users can dip into these new data sources, the next challenge is moving across networks and security layers. Traditionally, this was managed by having several computers in a work area, each connected to a separate network. Because those systems couldn’t share information with each other, organizations often had to pass information manually between classification layers via “sneakernet.”

This access method is inefficient and cumbersome for rapidly shifting operations across all domains. Additionally, Flanagan notes, a key challenge remains: when data must be brought in from an unsecure source, it has to be processed by a transfer guard.

Such active processing introduces risks. One is the latency associated with processing a large volume of data. Another is the integrity of the processing: can organizations expect the information that emerges to match the quality of the raw data that arrived from the remote environment, with the assurance that no malicious code slipped in?

“If I haven’t got a filter for that thing, I can’t bring it in at all. I know it’s there. I know it is of value, but I just can’t bring it in because I’ve got no way of getting it through my transfer guard,” Flanagan said.

However, if organizations shift their focus from trying to bring in volumes of data to reaching out to where it resides, “then you get access to the original source data in its original source environment in real time. You’re addressing the timeliness issue, the integrity issue and the problem of not being able to transfer it at all,” Flanagan said.

The result is that organizations can access and work with broader data sets than they could previously. This freedom is magnified when dealing with high-volume data sets. Instead of attempting to ingest raw data, organizations can focus on edge processing and then bring in only the information they need.

“You can’t ingest social media sources. The volume and velocity is too high, so the only way to play with them is to reach out to them in their real environment, which is in their cloud environments, and then extracting information from the data. It’s this information that then gets transferred in,” Flanagan said.
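
Flanagan’s point about extracting information at the source, rather than moving raw data, can be illustrated with a short sketch. The example below is purely hypothetical: it assumes a cloud-side list of social media posts (each a dictionary with a “text” field) and a simple keyword-based summary, so that only the small distilled result, not the raw feed, would ever need to pass through a transfer gateway.

    # Hypothetical sketch: process open-source posts in their native cloud
    # environment and distill them into a small summary suitable for transfer.
    from collections import Counter

    KEYWORDS = {"flood", "evacuation", "outage", "protest"}

    def summarize_posts(posts):
        """Reduce a raw stream of posts to keyword counts plus a volume figure."""
        counts = Counter()
        for post in posts:
            text = post.get("text", "").lower()
            for word in KEYWORDS:
                if word in text:
                    counts[word] += 1
        # Only this small dictionary crosses the transfer gateway, not the posts.
        return {"total_posts": len(posts), "keyword_hits": dict(counts)}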

Real Time Versus Slower Time

Organizations require tools to quickly get at the specific high-value information they need, instead of wasting time processing volumes of less valuable data. Flanagan notes that transferring data is still the only way it can reach the specialized internal tools used by analysts and other experts to examine the information, but the question remains whether all the data is required or just enough of it.

In a transfer-based data model, a final step in the verification process often involves going back to the original sources of that information to determine whether its integrity is still good. This last verification step can be quite important, especially if an organization’s processing, exploitation and dissemination cycle is hours long, he said.

Using the example of a hypothetical humanitarian disaster relief operation, Flanagan explains that a relief organization pulls in a volume of cell tower data from the disaster site. In that data, it determines that many cellphones are clustered at a specific geographic point, indicating people who need rescue. Before launching a rescue mission, the organization quickly rechecks the raw cell tower data to make sure the phones are still transmitting to that same tower. “It’s that sort of last data verification before you commit that’s important because you can’t take a lot of those actions back,” he said.
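
As a rough illustration only, the sketch below shows how that cluster-then-recheck logic might look. The ping records, tower identifiers and the idea of a fresh “live” pull immediately before committing are all assumptions made for the example, not part of any real system.

    # Hypothetical sketch: find the tower with the largest cluster of phones,
    # then re-verify against a live pull of the source before dispatching help.
    from collections import Counter

    def busiest_tower(pings):
        """Return (tower_id, device_count) for the largest cluster of phones."""
        counts = Counter(p["tower_id"] for p in pings)
        return counts.most_common(1)[0]

    def confirm_before_dispatch(historical_pings, live_pings, threshold=50):
        """Commit to a rescue only if the cluster is still present in the live feed."""
        tower, count = busiest_tower(historical_pings)
        still_there = sum(1 for p in live_pings if p["tower_id"] == tower)
        return tower if count >= threshold and still_there >= threshold else None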

Balancing Access and Transfer

While access to data sources is highly desirable, it is not an end in itself: organizations can’t consume all the data available, either because they lack the capacity or because processing and analyzing it all takes too much time. A truly balanced environment uses a combination of access and transfer technology, potentially drawing on edge computing, open source or data as a service, Flanagan said.

“The result of that is that you distill the vast data set down to an information set that you can bring through your transfer gateways in order to enrich the decision support that you need to make in your high-side environment,” he explained.

However, if transfer capabilities are focused where they are needed and access is used where it can be, “then it gives you a whole different balance, a whole different cyber posture, a different tempo and reactive capability,” Flanagan said, adding that shifting the paradigm of how analysts and operators work gives them a better chance of overmatching adversaries’ decision loops.

One classic hypothetical example of balanced access and transfer capabilities is the need for warfighters in the field to know what is over the next hill before moving forward. Flanagan notes that if a cross-domain access solution lets an ISTAR controller see the unit’s chat session when it enters a latitude-and-longitude request and asks, “What’s behind the hill?”, the controller can task an asset to get the required image, generate potential target coordinates and send the data directly to the unit.
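
A minimal sketch of that workflow, assuming the controller sees the unit’s chat as plain text lines, might look like the following. The message format, the coordinate pattern and the tasking dictionary are all invented for the illustration and do not describe any actual system.

    # Hypothetical sketch: pick a latitude/longitude request out of a unit's
    # chat stream and turn it into a simple collection task for an ISR asset.
    import re

    COORD_PATTERN = re.compile(r"(-?\d+\.\d+)[,\s]+(-?\d+\.\d+)")

    def parse_request(chat_line):
        """Extract (lat, lon) from a line such as: What's behind the hill? 34.05, -117.18"""
        match = COORD_PATTERN.search(chat_line)
        return (float(match.group(1)), float(match.group(2))) if match else None

    def build_tasking(chat_line, asset_id="ISR-01"):
        """Turn a unit's chat request into a tasking record for an imagery asset."""
        coords = parse_request(chat_line)
        if coords is None:
            return None
        # The resulting image and any derived coordinates go straight back to the unit.
        return {"asset": asset_id, "collect_at": coords, "product": "image"}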

Avoiding Data Overload

Another way to access or use data in real time is through tipping and cueing techniques. Flanagan notes there are a variety of data-as-a-service providers offering customers cloud-based dashboards for processes such as sentiment analysis or event analysis. These tools can have a variety of military, civilian or law enforcement applications.

The ability to tip and cue allows organizations to access very large data sets and have that information differentiated to provide specific, corroborated alerts for situations such as where to send first responders, Flanagan said. Likewise, he noted, a slightly slower response cycle might be used to select the right people or teams to send to an emergency or to support first responders.

Flanagan adds that there are categories of classified desktops used by military, intelligence and law enforcement organizations. To varying degrees, they allow users to access information sources and distill that data set down to information, based on a defined set of questions or rules, to provide cueing and tipping functions. He notes that this is a more efficient way of working than trying to ingest and process large amounts of data.

“It’s being able to take vast data sets and use commercially available open-source analysis that gives you a clue that there’s something you need to take a closer look at,” Flanagan said.
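
As a simple, hypothetical illustration of that tip-and-cue idea, the sketch below raises an alert only when two independent open-source feeds corroborate the same kind of event in the same area within a short window. The event records, field names and 15-minute window are assumptions made for the example.

    # Hypothetical sketch: tip-and-cue rule that alerts only on corroborated events.
    from datetime import timedelta

    def corroborated_alerts(events, window=timedelta(minutes=15)):
        """events: dicts with 'source', 'kind', 'area' and 'time' (datetime) keys."""
        alerts = []
        for i, first in enumerate(events):
            for second in events[i + 1:]:
                if (first["source"] != second["source"]
                        and first["kind"] == second["kind"]
                        and first["area"] == second["area"]
                        and abs(first["time"] - second["time"]) <= window):
                    # Two independent sources agree: cue a closer look at this area.
                    alerts.append({"kind": first["kind"], "area": first["area"],
                                   "sources": [first["source"], second["source"]]})
        return alerts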

Changing the Transfer-Access Paradigm

A growing number of hardware-based cross-domain solutions from companies such as Garrison are appearing on the market, and hardware-based transfer products are also beginning to emerge to support access use cases, Flanagan said. These systems allow organizations to reach into an array of services without separate individual systems on users’ desktops and without users having to step away from their desks to access other information sources.

Recent developments mean the technology has caught up with the concept of classified desktops safely accessing multiple networks and classification levels: hardware-based access solutions are replacing older software-based models, which, among other things, are vulnerable to cyber attack, he said.

Hardware-based systems such as Garrison’s avoid the security issues of software-based cross-domain applications because they provide a very low-risk approach that completely isolates classified systems from high-threat networks and data. They also allow organizations to access data quickly and safely from high-side environments.

Flanagan noted that Garrison’s solution, by addressing access challenges, allows organizations to create better workflows.

“It’s no longer a thing that’s nice to have but that organizations can’t do. Now it’s a nice to have and they can do it,” he said.

Shifting toward a cross-domain, access-based paradigm, supported by cross-domain transfer where needed, addresses the challenges of carrying out efficient, effective multidomain operations in a timely, secure manner. Technology, policy and stakeholder awareness are coming together to give users routine access to information that was once out of reach. Missions require this new and innovative approach, and the solution is available to be evaluated and implemented.

For more information: https://www.garrison.com/en/cross-domain