
Defense Portal Will Make Supercomputers Accessible With a Mouse Click

August 19, 2010
By George I. Seffers, SIGNAL Online Exclusive

Years from now, engineers and scientists across the U.S. Defense Department may double-click an icon on their desktop computer screens and access the phenomenal processing power of the Army Research Laboratory (ARL) Defense Department Supercomputing Resource Center (DSRC) at Aberdeen Proving Ground, Maryland. Those computers are currently capable of processing 350 trillion calculations per second. 

The ARL DSRC supercomputers are used to research a wide variety of issues, such as weapons development, projectile design, computational chemistry, nanoscience, bioscience and network science. For example, the center houses a virtual aerodynamics range to conduct simulated testing of aerodynamic designs before equipment is built, saving time and money during the development process. Scientists also use the center’s processing power to create new materials and to model ad hoc networking on the future battlefield. In the past, the computers were instrumental in coming up with the up-armored design for military vehicles deployed to Iraq and Afghanistan.

Charles Nietubicz, director of the ARL DSRC, is leading an effort to develop a Defense Research Engineering portal that will expand access to the center’s supercomputers far beyond the small number of computational scientists currently using the machines. Other researchers often need computing power vastly superior to that of their desktop computers to help solve some of the military’s most complex challenges. For now, though, anyone wanting to access the supercomputers must fill out several forms and then learn how to operate the machines—a daunting and time-consuming task in itself, Nietubicz explains—and many give up before even getting started.

“Part of my job here, I think, is to never be satisfied with what we have and to find out what we need to change to make things better, so I’ve been working on a concept for about a year, maybe a year and a half, that gets high-performance computing out to engineering scientists so they can use it on a regular basis,” says Nietubicz. “My goal is to put high-performance computing at the fingertips of our engineers and scientists throughout DOD.”

Nietubicz believes supercomputer usage is close to a tipping point—essentially the point at which something becomes a part of everyday life, as personal computer usage did in the 1990s. He says he has been enamored with the idea since reading Malcolm Gladwell’s book “The Tipping Point: How Little Things Can Make a Big Difference,” and he wants to nudge supercomputing toward that threshold.

“The average engineer doesn’t use high-performance computing. Part of the reason is that it’s too hard. I’m working to develop a tipping point for high-performance computing,” Nietubicz says.

His best estimate is that it may take another 10 years to make the center’s supercomputers easily accessible across the department. First, he and his team must solve some daunting challenges, including writing the software, creating the link between desktops and supercomputers, and resolving authentication and firewall issues.

“If it were easy, it would have been done already,” Nietubicz states.

The ARL DSRC is one of six Defense Department Supercomputing Resource Centers across the country. The others are the Air Force Research Laboratory, Wright-Patterson Air Force Base, Ohio; Arctic Region Supercomputing Center, Fairbanks, Alaska; Army Engineer Research and Development Center, Vicksburg, Mississippi; Navy Supercomputing Resource Center, Stennis Space Center, Mississippi; and Maui High Performance Computing Center, Maui, Hawaii.

Last year, the ARL DSRC added a third supercomputer, an SGI Altix Integrated Compute Environment 8200 Linux cluster, the largest of its kind ever deployed at the center. It offers 10,752 Intel Xeon 5500 series processor cores and more memory and bandwidth than the center’s other high-performance computers. The other two systems are the Tow, which has 6,656 cores with 52.2 terabytes of system memory and 400 terabytes of storage, and the TI-09 cluster, a test and development system with 96 cores that supports the two larger machines.

The center receives a supercomputer upgrade every couple of years through the Department of Defense High Performance Computing Modernization Program, initiated in 1992 with the aim of keeping the military on the supercomputing cutting edge. Nietubicz traces the center’s roots back to the 1940s, when the Army needed a faster way to calculate firing tables, which are used to aim artillery and ensure the shells fall where intended. At that time, the process could take months, so the Army’s Ballistics Research Lab—the forerunner of the Army Research Lab—and the University of Pennsylvania created the Electronic Numerical Integrator and Computer (ENIAC), which is widely recognized as the world’s first computer. That first computer was capable of about 5,000 calculations per second.

The commercial sector could also influence the rate at which a supercomputing tipping point occurs. Nietubicz cites one company’s recent announcement that it is making high-performance computing services available to organizations that cannot afford the hardware. Microsoft Corporation also has been investing in supercomputer technology in recent years, and in May, the company announced its Technical Computing Initiative, which is intended to unleash the power of pervasive, accurate, real-time modeling for scientific research. The company vowed to invest in three key areas: simplifying parallel development to more efficiently use the massive processing power of today’s computers; developing new, easier-to-use technical tools and applications to automate acquisition, modeling, simulation, visualization, workflow and collaboration; and bringing technical computing power to scientists, engineers and analysts through cloud computing, so that high-performance computing users will be able to augment their on-premises systems with cloud resources.

Regardless of the challenges involved, Nietubicz insists supercomputers will become more widely available. “It’s going to take a while because a big part of it has to do with the software. But I believe it’s going to come, and it’s going to be another tipping point,” he says. “It’s an exciting time for me because I think I can see another revolution coming.”