Visualizing Information Emerges As Major Element of Operations
Seeing is believing when trainees and operators view data as three-dimensional graphic imagery.
Data visualization, in which information is displayed as recognizable graphic elements, increasingly is moving into mainstream applications as a remedy for information overload. As computer users find growing gigabytes of data at their fingertips, system engineers are returning display perspectives to the familiar three-dimensional views that people comprehend faster and more readily.
This effect may be felt across the discipline of data processing. Data visualization, or its close relative information visualization, can be used in applications as simple as software tool representation or as involved as complex training routines. It can serve as a more efficient method of designing an architecture for tool development, after which it can transition to serve as a training and educational manual for users. It can provide complex information from multiple sources in an easily understandable form that also validates data.
The newest development in data visualization is its availability to low-end users, who increasingly find themselves prisoners of information overload. Once largely limited to high-end mainframe users, it now can be exploited on simple laptop computers.
Potential users could include virtually anyone facing burgeoning amounts of information. Network managers can observe the consequences of node or system failures in realistic form. Financial planners can examine the ramifications of a host of changes in economic variables. The intelligence community, with its range of different sensors and data, can extend the ability to view collated information graphically teamed with imagery down to individual desktop users.
“Visualization is the rage right now,” declares Gary Gomez of Booz•Allen & Hamilton Incorporated, McLean, Virginia. Gomez heads the data visualization team operating within the company’s concept visualization team in its national security technology group. The firm is working on both data visualization and information visualization.
Information visualization differs from data visualization in that it involves displaying additional elements such as landscapes. “Information visualization involves turning data into information,” Gomez explains. Its sources might be bulk material, such as networks or databases, that is turned into analyzable information by placing it in context or perspective. This allows users to view how each datum relates to the others and how it affects them.
“Information visualization doesn’t make decisions, but it really focuses efforts,” Gomez states.
A host of elements is factored in to establish proper information visualizations. For example, animation experts determine proper color combinations, motion, perspective and texture. Experts in computer-based training establish the ideal display functionality. And, above all, other specialists ensure that the information presented to the customer is both accurate and useful.
The current market for information visualization consists largely of higher-level decision makers. However, new approaches are teaming with hardware and software advances to open its advantages to lower-level users. The gulf between high end and low end, while still substantial, is shrinking.
For data visualization, useful applications include network monitoring, battlespace command and control, and logistics movement. Its use in collection and asset management can show the footprint of a sensor, including where it might be blocked or ineffective. Applying visualization to information fusion for decision makers could include validating the data before presentation.
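The underlying footprint computation is not spelled out here, but such a display ultimately rests on a line-of-sight test against terrain. Below is a minimal sketch of that test over a one-dimensional terrain profile; the heights and sensor parameters are invented for illustration, not drawn from any fielded system.

```python
# Minimal sketch: finding where a ground sensor's line of sight is blocked
# by terrain. The terrain heights and sensor parameters are hypothetical.

def visible(terrain, sensor_x, sensor_h, target_x):
    """Return True if the sensor can see the target cell over the terrain profile."""
    x0, h0 = sensor_x, terrain[sensor_x] + sensor_h
    x1, h1 = target_x, terrain[target_x]
    if x0 == x1:
        return True
    step = 1 if x1 > x0 else -1
    for x in range(x0 + step, x1, step):
        # Height of the sight line at column x (linear interpolation).
        t = (x - x0) / (x1 - x0)
        line_h = h0 + t * (h1 - h0)
        if terrain[x] > line_h:  # an intervening ridge blocks the view
            return False
    return True

terrain = [0, 2, 5, 3, 8, 4, 2, 1, 6, 2]   # heights along one terrain profile
sensor_x, sensor_h = 0, 3                   # sensor at column 0, mast height 3
footprint = [x for x in range(len(terrain)) if visible(terrain, sensor_x, sensor_h, x)]
print("visible cells:", footprint)          # cells behind the ridge drop out
```

Coloring each cell by the result of this test, rather than printing it, is what turns the calculation into the footprint picture an operator would see.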
As an example, an existing military service maintenance program may cover a component, such as an aircraft radio, that is used in a number of different platforms. Experts would need to know how changes or shortages in that radio alter other systems down the line, which is known as the network and waterfall effect. To accomplish this conventionally, each aircraft division head would be queried to compile lengthy reports that would feed textual “what if” speculations.
An information visualization routine would display, in a network format, the disparate types of aircraft in the entire inventory. The database would contain the systems that run in the aircraft, so a user could perform a screen pull-down to institute a radio shortage. The system would display the aircraft that are affected by the shortage. A second analysis would display how much of a factor this radio degradation would be to a major command’s combat status.
With this visual information, a user could seek alternative ways for deployed units to maintain combat status despite these radio problems. The system would visualize solutions such as redeploying radios or reconfiguring a combat information network. Gomez points out that a paper manual may define how mission-capable an aircraft is when specific systems are out of action, but this information can be programmed into a database so a user can view a wide range of ramifications visualized on screen. This can be achieved without developing an expensive modeling and simulation algorithm.
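The walkthrough above amounts to a query over a dependency network linking components to platforms and platforms to commands. A minimal sketch of that query follows; the aircraft types, component designator and command names are hypothetical illustrations, not the program's actual database.

```python
# Minimal sketch of the dependency query behind the display: which aircraft,
# and which major commands, a component shortage touches. All names are
# hypothetical examples.

fleet = {
    "F-16":  {"systems": {"AN/ARC-210", "APG-68"}, "command": "ACC"},
    "C-130": {"systems": {"AN/ARC-210"},           "command": "AMC"},
    "F-15":  {"systems": {"APG-63"},               "command": "ACC"},
}

def affected_by(component):
    """Return the aircraft types whose installed systems include the component."""
    return [t for t, info in fleet.items() if component in info["systems"]]

def command_impact(component):
    """Roll the shortage up to the major commands those aircraft belong to."""
    return {fleet[t]["command"] for t in affected_by(component)}

print(affected_by("AN/ARC-210"))    # ['F-16', 'C-130']
print(command_impact("AN/ARC-210")) # {'ACC', 'AMC'} in some order
```

Drawn as a network diagram instead of printed, the same two lookups become the pull-down shortage display and the command-status rollup the article describes.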
Other applications beckon. Shipboard radar operators and maintenance workers, for example, ordinarily can view the radar’s three-dimensional characteristics only in a book or on a flat screen. A training tool can show the radar’s capabilities in a three-dimensional space. A related effort underway would provide three-dimensional data visualizations of tactical aviation maneuvers.
The Defense Advanced Research Projects Agency’s (DARPA’s) command post of the future initiatives include showing the common operational picture in three dimensions. The agency recently awarded a number of contracts in the command post program. The U.S. Atlantic Command, as the executive agent for joint experimentation, also is active through its work on battlefield visualization.
Gomez’s company is developing one tool that would allow a user to plug into the Intelink for visualization. The operator would not need high-end software on a laptop to gain this access.
The technology also could be applied to air traffic control. Information visualization would portray aspects of the airspace in ways that are easily understandable to busy controllers. Controllers tend to think of the airspace in three dimensions, as do pilots, but control radarscopes continue to present information in two dimensions, with altitudes shown as numbers. The confined airspace of congested terminal approach phases could be displayed in three dimensions with terrain imagery. A controller could see a conflict developing even before program alarms sounded. This visualization capability could help reduce controller stress and lessen the possibility of a fatal error.
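The conflict a controller would watch develop corresponds to a simple separation test in three dimensions. A minimal sketch follows, using 5 nautical miles laterally and 1,000 feet vertically as illustrative minima rather than any specific system's configuration.

```python
import math

# Minimal sketch of the separation test behind a conflict alert. The minima
# below are typical illustrative values, not a specific system's settings.

LATERAL_MIN_NM = 5.0
VERTICAL_MIN_FT = 1000.0

def in_conflict(a, b):
    """a and b are (x_nm, y_nm, altitude_ft) positions; a conflict exists
    only when both the lateral and the vertical minima are violated."""
    lateral = math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = abs(a[2] - b[2])
    return lateral < LATERAL_MIN_NM and vertical < VERTICAL_MIN_FT

print(in_conflict((10.0, 4.0, 31000), (12.0, 5.0, 31400)))  # True: close on both axes
print(in_conflict((10.0, 4.0, 31000), (12.0, 5.0, 35000)))  # False: vertically separated
```

A flat radarscope shows only the lateral term of this test directly; the point of the three-dimensional display is that the vertical term becomes visible at a glance instead of being read from altitude tags.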
The finance industry already is a major beneficiary of data visualization, Gomez notes. Many different factors can change an asset’s yield and volatility, so financial planners can view their effects on a portfolio instantly. A brush-over also can allow a viewer to data mine a portfolio for more detailed information.
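One version of the instant what-if view Gomez describes can be grounded in the standard portfolio volatility formula, sigma_p = sqrt(w' * Sigma * w). The sketch below uses invented weights and covariances, not any planner's actual model.

```python
import numpy as np

# Minimal sketch of the recomputation behind such a view: portfolio
# volatility from weights w and covariance matrix Sigma. All numbers
# are hypothetical.

w = np.array([0.5, 0.3, 0.2])             # portfolio weights
sigma = np.array([[0.040, 0.006, 0.004],  # annualized covariance matrix
                  [0.006, 0.090, 0.010],
                  [0.004, 0.010, 0.160]])

vol = float(np.sqrt(w @ sigma @ w))
print(f"portfolio volatility: {vol:.1%}")   # about 16.8%

# Shock one asset's variance and recompute: the instant "what if"
# a visualization tool would redraw on screen.
sigma[1, 1] *= 1.5
print(f"after shock: {float(np.sqrt(w @ sigma @ w)):.1%}")  # about 18.0%
```

The speed of such a recomputation is what lets a display redraw the whole portfolio picture as fast as the planner moves a slider.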
Gomez’s company is employing these technologies for concept visualization, which can be used to represent a desired tool as well as to display its performance. Most of the company’s efforts are geared toward a personal computer-based laptop that requires no special graphics capabilities.
Tool or processor architecture development often involves understanding a system that serves as the core of that project. This system usually originates in graphic format, after which it is converted into words. Putting it back into graphic format often results in some loss of the original information. Gomez explains that understanding these visualization requirements from the start is essential.
Gomez’s team focuses on two applications. One is to use three-dimensional animation, computer-based training and advanced graphics to develop presentations. These presentations can be used to train people on tool usage.
The other is to employ visualization techniques to actually help develop a software tool. This effort begins with a paper model that is put together before any code writing, software development or prototyping. Gomez characterizes this paper model as a storyboard that spells out the client’s requirements for the data visualization tool.
His team avoids development of electronic prototypes because of their high start-up costs. In addition to their financial impact, these costs tend to inhibit change because of a reluctance to throw out work that results from significant investments. “A prototype tends to pick up a steam of its own,” Gomez says, while paper modeling allows advanced development before expensive code generation.
Tool development is relatively vendor driven, he maintains. No good mechanism exists for first identifying a client’s visualization requirements, such as who the users will be and what the client plans to do with the information. These requirements must be identified to build a tool irrespective of any vendor.
After the paper model is created, the concept visualization requirements are incorporated into an evolutionary document development. This activity carries elements of the tool development over into training and user manuals. In effect, the documentation used to build the system is identical to the guides that the customer uses to operate it.
The biggest challenge in developing data visualization tools is to think in a three-dimensional mode, Gomez says. “We are three-dimensional creatures; we actually think in depth. We sense movement and change,” he explains. Sensing change in a picture is immediate and perceptual, but sensing change in a number requires deliberate mental processing. Most computer users are accustomed, however, to seeing data in X/Y graphs and spreadsheets, even though these forms are counterintuitive. An information visualization display that presents thousands of bits of information can prove difficult for the viewer, especially in tasks such as relating it to other information and fusing data.
This becomes vital for intelligence applications. Operators increasingly must have systems that fuse data from signals, measurement and signature, imagery and other intelligence disciplines. Any form of network management that involves diverse data sources strains the design of visual information displays. “Once people get past that [three-dimensional adjustment], it’s almost like the floodgates have opened,” Gomez relates.
Most of his team’s efforts focus on visualization systems featuring medium cost, high impact and fairly quick turnaround, which can be a matter of weeks, Gomez says. Enough high-end tools exist that produce good graphics, he acknowledges, but many potential users of information visualization lack the expensive platforms necessary to run them.
“There’s nothing in visualization now to speak of,” says Gomez of broad-based systems. High-end companies with powerful machines are leading the way in visualization technology, but this leaves behind many organizations whose systems could deliver up to 90 percent of those high-end capabilities at only 10 percent of the cost.
“Ten years from now, there will be fabulous systems,” he predicts. “In the interim, there is no planned segue to get people working with visualization routinely and also using a lot of the available tools, which can be implemented within weeks at a fraction of the cost.” If not enough effort is made to increase the spread of visualization, then the advanced tools that appear 10 years hence will represent an introduction to the concept for many people, which Gomez says “will not be healthy.”
He suggests that users need to employ visualization now to drive the technology. The result will be not a slowdown in technology development, but a focusing of it, he adds.