Sharing The Wealth Key to Army Intelligence
Two U.S. Army infantrymen observe artillery fire in Afghanistan. Army intelligence analysts around the world, including those located in remote regions, will be able to access and process information far better than currently possible as a result of improvements under development at the U.S. Army Intelligence and Security Command (INSCOM).
Two key projects are defining U.S. Army intelligence efforts to improve its analytic capabilities. While their aim is the same—allowing analysts to process key intelligence information faster and more efficiently—they take opposite approaches to the common goal.
One project focuses on enabling easier access to information stored in different databases and in different formats. The other effort aims to tap unused processing capability seamlessly for use by analysts thousands of miles away. Both efforts are underway at the Army Intelligence and Security Command (INSCOM).
One project is the Defense Cross-Domain Analytic Capability (DCAC), a laboratory prototype designed to help analysts access information that currently resides in separate security domains. DCAC effectively is a single back-end database through which multiple security domains operate. The prototype would benefit users previously unable to gain access to important data, helping them consolidate key pieces of information by eliminating the problem of complementary data spread across databases in different security domains. It also would help eliminate duplicate information stored in different databases.
Col. Timothy P. Hill, USA, director of the INSCOM Futures Directorate, explains that information currently is controlled separately by the various security domains. Each domain has its own databases, which often include duplicate information. A single unclassified document could reside on most of the security domains because, carrying no classification, it poses no security concerns. Beyond being wasteful, that duplication is not guaranteed across all of the domains, so an analyst cannot count on finding the document in every domain. Consequently, the analyst might need to ask the same question on every domain he or she visits.
However, DCAC would eliminate that problem and simplify the search by presenting a single cross-domain database. A user need only ask a single question to receive an answer quickly and without any duplication. And, this single entry would permit collapsing all of the hardware, software and power infrastructure maintaining those databases on separate domains, Col. Hill points out.
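The concept can be illustrated with a short, hypothetical sketch. The Python below is not DCAC itself; it simply shows the idea of one cross-domain back end answering a single question across every domain a user is cleared for and collapsing duplicate documents into one result. All class, field and domain names are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    doc_id: str          # identifier shared by copies of the same document
    classification: str  # e.g. "UNCLASSIFIED", "SECRET"
    text: str

class CrossDomainStore:
    """Hypothetical single back end holding data labeled by security domain."""

    def __init__(self):
        self._records = []  # list of (domain, Record) pairs

    def ingest(self, domain: str, record: Record):
        self._records.append((domain, record))

    def query(self, keyword: str, cleared_domains: set[str]) -> list[Record]:
        """One question, one answer: search every domain the user is cleared
        for and collapse duplicate documents into a single result set."""
        seen, results = set(), []
        for domain, rec in self._records:
            if domain not in cleared_domains:
                continue                      # enforce the user's clearances
            if keyword.lower() not in rec.text.lower():
                continue
            if rec.doc_id in seen:
                continue                      # same document held on another domain
            seen.add(rec.doc_id)
            results.append(rec)
        return results

# An unclassified report ingested on two domains comes back exactly once.
store = CrossDomainStore()
store.ingest("DOMAIN-A", Record("rpt-001", "UNCLASSIFIED", "Bridge survey report"))
store.ingest("DOMAIN-B", Record("rpt-001", "UNCLASSIFIED", "Bridge survey report"))
print(store.query("bridge", cleared_domains={"DOMAIN-A", "DOMAIN-B"}))
```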
This effort does not aim to dissolve all the intelligence databases into a new single entity, however. It is designed to maintain the existing intelligence networks, but users would be able to access data transparently through their own familiar interfaces. The networks would remain separate.
DCAC currently employs existing applications already used by analysts within the Distributed Common Ground System–Army (DCGS–A) framework. The search application, the ESRI geospatial application and a named area of interest application serve as surrogates to replicate the user experience. Other commercial products move data into the database.
The intent was to provide a transparent means for a user to access information in the cross-domain environment. Col. Hill allows that a future iteration may feature new user applications, because DCAC’s approach brings more richness to the data. For now, however, users would find themselves in familiar territory when using DCAC.
“We only picked three applications as our initial testbed—they’re kind of bread and butter applications—and we didn’t change them in any significant way,” Col. Hill explains. “We simply built an interface to those applications.”
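A minimal sketch of that interface idea follows, assuming the existing application only calls a generic search() method on whatever back end it is handed. Nothing here reflects actual DCGS–A interfaces; the names are illustrative. The point is that the application itself is left unchanged while a thin adapter routes its requests to the cross-domain store.

```python
class StubCrossDomainStore:
    """Minimal stand-in for the cross-domain back end in this sketch."""
    def query(self, keyword: str, cleared_domains: set[str]):
        data = [("DOMAIN-A", "rpt-001", "Bridge survey report")]
        return [text for domain, _id, text in data
                if domain in cleared_domains and keyword.lower() in text.lower()]

class LegacySearchApp:
    """Stand-in for an existing analyst application; it only knows how to
    call backend.search(query, user) and display the hits."""
    def __init__(self, backend):
        self.backend = backend

    def run(self, query: str, user: str):
        for hit in self.backend.search(query, user):
            print(f"{user}: {hit}")

class CrossDomainAdapter:
    """Presents the interface the legacy application already expects
    while delegating the actual lookup to the cross-domain store."""
    def __init__(self, store, clearances: dict[str, set[str]]):
        self.store = store
        self.clearances = clearances   # user -> domains the user may access

    def search(self, query: str, user: str):
        return self.store.query(query, self.clearances.get(user, set()))

# The application code is unchanged; only the backend object it is handed differs.
app = LegacySearchApp(CrossDomainAdapter(StubCrossDomainStore(),
                                         {"analyst1": {"DOMAIN-A"}}))
app.run("bridge", "analyst1")
```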
Commercial technologies play a major role in this proof-of-concept prototype. These include Linux application servers, trusted Solaris clusters, an EMC SAN, Cisco network switches and firewalls and an Oracle database. Col. Hill describes the main component as the Oracle application CDSE, for cross-domain security enclave. It is the fundamental underlying technology that INSCOM is proving out in its efforts, he says.
Col. Hill explains that the goal of this effort is to prove that the enabling technology is accreditable. The technology has been extant for some time, and larger agencies have tried unsuccessfully to use it to build an accreditable system. But the DCAC approach is the first that can receive government certification to operate. The colonel allows that INSCOM examined the other attempts and learned where they ran into trouble, which helped it avoid those pitfalls and adopt different approaches.
“We really scoped down this project to focus fundamentally on the technology first,” he explains. “The key question we were asking was, ‘Can we prove that this technology can work and be accredited in this environment?’ We haven’t spent a lot of time on operationalizing the technology yet—that’s the next phase.”
Col. Hill allows that DCAC’s creators probably underestimated the difficulty of being on the bleeding edge of technology in terms of accreditation. If a technology has been certified 100 times before, a body of documentation already exists that can be tailored to suit a new program’s circumstances. But because no one had performed this particular certification, DCAC personnel had to write their own test, and they would be graded on that test as well as on how they took it. The accreditors also faced a learning process as they worked to place their stamp of approval on this new system. DCAC generated more than 500 pages of documentation to support the system, the colonel notes.
While DCAC is still a prototype, it is using operational networks, operational data and operational tools. DCAC completed and passed its beta 2 testing in June, and penetration testing began in July. Once the system has its authority to operate, INSCOM can implement lessons learned from everyday users to improve interfaces without further oversight from certifiers. This would affect mainly sustainers on the back end, the colonel notes.
Another component of the next phase will be to move the capability from a single-instance approach to establish it as an enterprise service. “The goal is for it to be packaged as a security service in a service-oriented architecture for any program of record,” Col. Hill reports. Currently, the DCAC effort is focusing on DCGS–A as its program of record.
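How such packaging might look can be sketched in a few lines. The service below is purely notional and built only on the Python standard library: a consuming program of record sends a query and the caller's cleared domains to a single endpoint and receives JSON results, which is the flavor of security service a service-oriented architecture would expose. The endpoint, port and data are invented for the illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Toy data set standing in for the cross-domain store in this sketch.
RECORDS = [
    {"doc_id": "rpt-001", "domain": "DOMAIN-A", "text": "Bridge survey report"},
]

class CrossDomainQueryService(BaseHTTPRequestHandler):
    """Illustrative enterprise service endpoint: any consuming system sends
    a query plus the caller's cleared domains and gets JSON results back."""

    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        keyword = params.get("q", [""])[0].lower()
        cleared = set(params.get("domain", []))
        hits = [r for r in RECORDS
                if r["domain"] in cleared and keyword in r["text"].lower()]
        body = json.dumps(hits).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Example request: GET /search?q=bridge&domain=DOMAIN-A
    HTTPServer(("localhost", 8080), CrossDomainQueryService).serve_forever()
```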
Col. Hill offers that DCAC’s base technology has application outside of the intelligence field. Most other disciplines and branches have systems that operate across multiple domains, he notes, and these all could benefit from the DCAC approach.
“You can get a greater capability into a theater, into a new environment, faster, cheaper and more efficiently and sustain it with less people on the ground,” he says.
INSCOM’s other major intelligence-sharing priority is its enterprise platform, which is known as the IEP. This effort focuses on advancing analytical capabilities provided to intelligence units and consumers.
Col. Hill notes that INSCOM’s theater intelligence brigades around the world have the DCGS–A system to support intelligence and warfighting operations. Each DCGS–A system has a certain capability that is not always portable to others, and each DCGS–A is limited by its hardware configuration.
Yet, the Army intelligence enterprise worldwide offers a large amount of capacity at any given moment—if for no other reason than many facilities operate at a slower pace at night. These temporarily underutilized distributed data and fusion centers are an untapped resource that could be useful to intelligence-intensive operations.
The IEP effort aims to unite these fusion, processing and storage systems into a single platform that can be leveraged by all of the Army intelligence elements that are using its resources. It involves building a series of grids—a data grid, a storage grid, and a service and application grid.
“To a certain degree it’s cloud technology, but we are applying something that’s less bleeding-edge,” the colonel notes. “We are applying grid technology to this problem set, even though to the outside user it will seem to be a compute cloud.”
This will allow INSCOM to move excess capacity from one site to another. An analytical job being run in theater, for example, may need greater processing power than is available on site. With the IEP, that theater site can tap excess processing power elsewhere in the enterprise to suit its needs. And, this can be accomplished without INSCOM needing to move new hardware and support personnel into theater.
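A simple, hypothetical scheduler illustrates the idea: a job submitted from a saturated theater site is placed on whichever enterprise node currently has the most idle capacity. The node names and capacity figures below are invented, and the actual IEP grid software is certainly more involved; this is only a sketch of the placement logic.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    total_cores: int
    busy_cores: int

    @property
    def idle_cores(self) -> int:
        return self.total_cores - self.busy_cores

def place_job(nodes: list[Node], cores_needed: int) -> Node | None:
    """Pick the node with the most spare capacity that can still fit the job.
    Returns None if no node in the enterprise can run it right now."""
    candidates = [n for n in nodes if n.idle_cores >= cores_needed]
    if not candidates:
        return None
    best = max(candidates, key=lambda n: n.idle_cores)
    best.busy_cores += cores_needed        # reserve the capacity for the job
    return best

# A theater site that is saturated borrows capacity from a quiet fixed site.
enterprise = [
    Node("theater-forward", total_cores=64,  busy_cores=60),
    Node("fixed-site",      total_cores=512, busy_cores=120),  # overnight lull
]
print(place_job(enterprise, cores_needed=32).name)   # -> fixed-site
```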
This also will allow new emerging analytic tools to be applied anywhere throughout the enterprise—even if those tools require more processing power than is available at a given site. And, this can be accomplished without buying new hardware, Col. Hill notes.
“We’re taking advantage of the existing investment that government already has made in these tools,” he explains. “We’re just squeezing all the capacity out of the system and making it available to run processors.
“Physically locating the number of processors at one site to run some of the advanced analytics that are coming out of the research and development world today is quite a significant expense.”
While providing more powerful analytics to the warfighter is the primary purpose of the IEP, the program offers other benefits. For example, data storage efficiency improves when data is stored across an enterprise. And when data is stored at more than one site, the enterprise approach avoids having to back it up separately at each location; each site effectively serves as a backup for the rest of the enterprise.
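One common way to get that effect, sketched below under invented site names, is to map every stored object to a primary site and to a replica at a different site, so that each site's holdings are automatically protected by the rest of the enterprise rather than by separate backup gear at each location. This illustrates the general technique, not the IEP's actual storage design.

```python
import hashlib

SITES = ["site-hq", "site-europe", "site-pacific"]

def primary_and_replica(object_key: str, sites: list[str]) -> tuple[str, str]:
    """Map each stored object to a primary site and one replica site elsewhere.
    Because the replica always lives at a different site, every site's data is
    backed up by the rest of the enterprise rather than by local backup systems."""
    digest = int(hashlib.sha256(object_key.encode()).hexdigest(), 16)
    primary = sites[digest % len(sites)]
    replica = sites[(digest + 1) % len(sites)]   # next site in the ring
    return primary, replica

for key in ("imagery/2010-07-01/frame-0001", "reports/report-4417"):
    print(key, "->", primary_and_replica(key, SITES))
```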
INSCOM is not trying to develop a new architecture; rather, it is adding versatility to the existing architecture. The existing resources of the enterprise would move based on user demands.
“If we were to build this from scratch, we probably would do some things differently,” Col. Hill states. “But one of the constraints that we put on ourselves was to use the existing systems that are out there that have pretty good hardware. There will be some adds—there will be minor tweaks to the hardware to enable the concept—but fundamentally, it is the same hardware and architecture.
“What we think we’ll be able to do is: one, get more capacity out there; and, two, extend the useful life of the existing systems—which also is a value to the government,” he says.
Again, this new approach would be transparent to the user, who would continue working with the tools available today and would not need to direct where his or her processing resources come from. INSCOM engineers are applying existing technology to the IEP.
The only difference apparent to the user would be that INSCOM could equip the user with new and more powerful tools as a result of the improved analytical and processing power coming with successful IEP implementation. The user’s response time also could be improved with that better processing power.
INSCOM has just completed the first spin of the IEP. This entailed establishing the three fundamental grids in a two-node configuration at INSCOM headquarters. The two-node construct allowed experts to demonstrate the fundamental capabilities and functions that make it work.
After functional testing is completed, the next step is to stabilize the platform over the next several months. As with the DCAC effort, the experimental IEP network will use operational data instead of simulated material.
The third step will be to expand the number and spread of the nodes. To the two side-by-side nodes at INSCOM headquarters will be added a third node at a remote site. Then, a fourth node will be established in another part of the world to test IEP performance over great distances. These platforms will be DCGS–A fixed-site units.
The U.S. Army Intelligence and Security Command (INSCOM) also is involved in joint capability technology demonstrations (JCTDs). One JCTD, known as the Counter Intelligence Human Intelligence Advanced Modernization Program/Intelligence Operations Now (CHAMPION), focuses on providing analytical capabilities and tools to counterintelligence human intelligence (HUMINT) operators. Col. Hill points out that INSCOM is the technical manager for that JCTD, which was named a Defense Department JCTD of the year in 2008. The effort already has provided a significant number of tools, he reports, and capabilities that have emerged from CHAMPION have been deployed and are in operational use.

INSCOM also is participating in a large data JCTD, although it does not have as large a role. The command is participating as a user base with a large data capability, as it has its own requirements for moving large data files. Col. Hill relates that this JCTD also is in its final year, with a user’s test remaining. INSCOM is providing insight on the benefit of this capability as a user, and the colonel allows that the command has seen a significant benefit to its analysts in terms of the time it takes to process large data files, in some cases as much as 10 hours when accessing national databases.