
December 2012

Update on the Asia-Pacific

December 4, 2012
By Rita Boland

Military activities in the Asia-Pacific region have become more focused since the release, a few months ago, of a defense strategy that places renewed attention on the region. Through U.S. Pacific Command’s (PACOM’s) recent theater campaign plan, leaders are directing the subordinate military-service components to report back in a year on how efforts are working and to deconflict duplicate programs.

Implementing the Defense Department Cloud Computing Strategy Poses New Challenges

December 1, 2012
By Paul A. Strassmann

A few staff experts can formulate new strategies in a short time. Over the years, the U.S. Defense Department has accumulated a large collection of long-range planning documents. However, none of the plans was ever fully implemented, as new administrations kept changing priorities.

The just-announced Defense Department Cloud Computing Strategy presents a long list of radically new directions. Ultimately, it will take hundreds of thousands of person-years to accomplish what has just been outlined. Several points stand out.

In one, individual programs would not design and operate their own infrastructures to deliver computer services; users would develop only applications. This approach will require tearing apart more than 3,000 existing programs. A pooled environment will be supported by cloud computing that depends on different processing, storage and communications technologies. Small application codes then can be managed separately, relying exclusively on standard interfaces. The challenge will be how to manage more than 15 years’ worth of legacy software, valued at about half a trillion dollars, in completely different configurations. Making such changes will require huge cost reductions in an infrastructure that currently costs $19 billion per year.

Another point is that cloud computing will reduce the costs of the existing computing infrastructure. The Defense Department will have to virtualize close to 100,000 servers and integrate that construct with 10,000 communication links. The department will end up with a small number of enterprise-level pooled and centrally managed operations. This is a short-term multibillion-dollar effort that can be financed only from rapid savings, because no new funding will be available.
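
The financing constraint lends itself to a back-of-the-envelope check. The Python sketch below illustrates the self-financing arithmetic under purely hypothetical assumptions: only the $19 billion annual infrastructure cost comes from the strategy as described here, while the migration outlay and savings rate are invented for illustration.

```python
# Hypothetical back-of-the-envelope check: when does consolidation pay for itself?
# Only the $19 billion/year baseline comes from the article; the migration cost
# and savings rate below are illustrative assumptions, not official estimates.

BASELINE_COST = 19e9          # current annual infrastructure cost
MIGRATION_COST = 4e9          # assumed one-time virtualization/migration outlay
ANNUAL_SAVINGS_RATE = 0.15    # assumed fraction of the baseline saved each year

cumulative_savings = 0.0
year = 0
while cumulative_savings < MIGRATION_COST:
    year += 1
    cumulative_savings += BASELINE_COST * ANNUAL_SAVINGS_RATE
    print(f"Year {year}: cumulative savings ${cumulative_savings / 1e9:.2f}B "
          f"vs. migration cost ${MIGRATION_COST / 1e9:.2f}B")

print(f"Under these assumptions, the effort becomes self-financing in year {year}.")
```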

In This Period of Change and Challenge, Engagement Is Key

December 1, 2012
By Kent R. Schneider

In our professional lives, most of us have not seen an economic environment or a budget climate such as those we face today. We are approaching the ramp-down of the longest period of continuous conflict in U.S. and allied history. Technology is changing at an unprecedented pace, and to help address budget declines, we are relying on some of these technological advances—enterprise networking and service approaches, cloud computing, data center consolidation, more effective cybersecurity and better use of mobility solutions. The U.S. defense strategy is changing with a rebalance to the Asia-Pacific region. NATO is undergoing the most fundamental restructuring in its history, impacting headquarters structure, force structure and agency reform. Most of us do not remember a time when the government information technology budget was not growing year over year. This paradigm changed in 2012, and it will change more as new budgets reflecting debt reduction take effect. If sequestration occurs, an additional 11 percent will be cut from every budget line item, further aggravating the problem.

Since its start in 1946, AFCEA has been committed to an effective and ethical dialogue among governments, industry and academia to ensure that critical decisions are informed and that options are understood. It seems to me that the requirement for focused information exchange is greater today than ever before, because more uncertainty exists than ever before. Many of you have heard me say that we need to step up engagement at every level in order to understand the need fully and to respond effectively.

Intelligence Concerns Shift on Both Sides of the Atlantic

December 1, 2012
By Kent R. Schneider

Similarities outnumber differences as allies compare challenges.

The past 11 years have seen a sea change in intelligence operations and challenges in both Europe and North America, as longtime allies have had to confront a new era in global security issues. Both the United States and European NATO members have discovered that they face many of the same challenges, some of which must be addressed together by all members of the Atlantic alliance.

These issues were at the core of discussions at the first AFCEA Global Intelligence Forum, held September 20-21, 2012, in Brussels, Belgium. High-level speakers with unique perspectives on global security intelligence issues focused on changes in the intelligence community that have taken place on both sides of the Atlantic since 9/11. Discussions examined changes in the threat, how the cast of characters has shifted, the growing role of open source intelligence, how the cyberdomain has increased demands on the entire intelligence community, and the balance now needed between defense and security requirements.

A key perspective on the trans-Atlantic intelligence community was offered by the Right Honourable Lord Robertson of Port Ellen KT GCMG Hon FRSE PC. Lord Robertson served as the United Kingdom’s secretary of state for defence from 1997 through 1999 and as the secretary general of NATO and chairman of the North Atlantic Council from 1999 through 2003. A veteran of the highest level of government leadership, Lord Robertson provided a sense of the intelligence community from the perspective of a senior decision maker. “Those who work and live in the world of secret intelligence rarely fully trust the ultimate customers of their product,” he said, adding, “I often had the feeling that I was only getting the most sensitive secrets on sufferance, and that it was high risk to tell me—unvetted as I was—what they were doing and discovering.”

Book Review: Project Azorian, the CIA and the Raising of the K-129

December 1, 2012
Reviewed by Dr. R. Norris Keeler

By Norman Polmar and Michael White (U.S. Naval Institute Press, Annapolis, Maryland, 2010, 238 pages)

In 1974, the United States attempted to raise a sunken Soviet submarine from a depth of 16,000 feet in the Pacific Ocean north of Hawaii. The submarine had been lost in March 1968. The effort was camouflaged as an ocean-bottom mining operation carried out by the Hughes Glomar Explorer, a ship specially constructed for that purpose. As the Soviet general staff later admitted, the deception was excellent; they did not believe recovery from such a depth could be accomplished.

In thoroughly describing this ambitious effort, the book begins with the story of how the news media, specifically the Los Angeles Times, published an article describing U.S. attempts to raise a Russian submarine, the K-129, from a depth of 16,000 feet. The publication at least partially compromised the operation. The authors then describe the role of the USS Halibut, which found and localized the K-129. By coincidence, the Halibut, like the K-129, also was a strategic-missile-launching submarine. The Halibut’s missiles were the Regulus, an air-breathing cruise missile launched from the surfaced submarine.

The K-129’s missiles were of the “Serb” designation, underwater-launched ballistic missiles, three in the sail aft, with thermonuclear warheads. The Halibut, with its large spaces available for Regulus missiles, had ample room for cameras and other sensors with the missiles removed. These sensors were deployed while submerged in the search for the K-129.

Laboratory Research Twists Antenna Technology

December 1, 2012
By Robert K. Ackerman

Scientists bend, not break, the laws of physics.

Faced with limitations imposed by physics, laboratory researchers are generating antenna innovations by tweaking constructs to change the rules of the antenna game. Their efforts do not seek to violate long-held mathematical theorems or laws of physics. Instead, they are seeking lawful ways to work around limitations that long have inhibited the development of antennas that would suit user needs with fewer tradeoffs.

Currently, many types of antennas can be made small enough to fit in a tight area. Yet, they suffer performance drawbacks or are extremely limited in their application. Conversely, the type of antenna suitable for high-bandwidth links may prove detrimental to a use that requires low observability.

Laboratories in industry and academia are pursuing different approaches for future antenna technology breakthroughs. These efforts involve materials, architectures and network topologies. If successful, this research could lead to unobtrusive panels that replace large antennas as well as new capabilities for antenna-bearing platforms.

Howard Stuart, technical staff member at LGS Innovations, explains that the art of building smaller antennas comes up against the laws of physics. The issue is not one of miniaturization but of signal performance when antennas are built below a certain size.

“You can’t keep making antennas smaller and smaller,” Stuart points out. “There are fundamental physical limitations, and beyond that, [the antenna] is just not going to work anymore. Or, you’re going to have to give up something, such as gain.”
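
Stuart’s point about size versus gain reflects a well-known bound on electrically small antennas, commonly cited as the Chu limit, which ties an antenna’s minimum radiation Q, and therefore its maximum usable bandwidth, to its electrical size. The article does not name the limit, so the short Python sketch below is offered only as an illustration of the kind of tradeoff he describes; the 300 MHz frequency and the radii are arbitrary choices.

```python
import math

def chu_q_limit(radius_m: float, freq_hz: float) -> float:
    """Minimum radiation Q from the Chu limit for a lossless, single-mode
    antenna that fits inside a sphere of the given radius."""
    c = 3e8
    k = 2 * math.pi * freq_hz / c   # free-space wavenumber
    ka = k * radius_m               # electrical size of the enclosing sphere
    return 1 / ka**3 + 1 / ka

# Shrinking a 300 MHz antenna: usable fractional bandwidth scales roughly as 1/Q,
# so each halving of the size makes the antenna sharply more narrowband.
for radius in (0.05, 0.025, 0.0125):   # metres
    q = chu_q_limit(radius, 300e6)
    print(f"radius {radius * 100:5.2f} cm -> Q >= {q:8.1f}, "
          f"max fractional bandwidth ~ {100 / q:6.3f}%")
```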

Multi-Antenna Research Overcomes Frequency Shortages

December 1, 2012
By Max Cacas

Beamforming could help increase capacity of cellphone networks to meet the demands of data-hungry smartphones and tablets.

Multi-antenna technology that could increase data capacity and maximize existing spectrum use for cellular network providers is in the early stages of development. Although widespread use of this technology will require new devices and possible network changes, the concept has shown the potential to ease mobile device congestion from smartphones and tablets. This research is underway at a time when wireless carriers worldwide are scrambling to keep up with demand for mobile data and, in some cases, are attempting to obtain additional electromagnetic spectrum.

Dubbed Argos, after a creature with 100 eyes from Greek mythology, the multiple antenna technology is being developed primarily by the Electrical and Computer Engineering Department at Rice University in Houston, Texas. Researchers at Alcatel-Lucent/Bell Labs and Yale University also are participating in the Argos program.

While Argos is being developed for electromagnetic frequencies used by smartphones and tablets, the underlying technology theoretically could be used for any device that requires an antenna, according to Clayton Shepard, a Rice graduate student who constructed the experimental Argos antenna array, which has been successfully tested. “The frequency really doesn’t matter, and eventually, we would like to make this for Wi-Fi, or any wireless application,” he says. Technology limitations of size and cost are the primary reasons why Argos is being developed for cellular network frequencies, Shepard adds.
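
Published descriptions of Argos center on multi-user beamforming, in which a base station with many antennas uses knowledge of each user’s channel to send several data streams on the same frequency at once. The sketch below is a generic zero-forcing illustration of that idea, not Rice’s implementation; the 64-antenna, four-user setup and the random channel matrix are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 64, 4        # base-station antennas, single-antenna users

# Assumed flat-fading downlink channel matrix H (K x M), one row per user.
H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)

# Zero-forcing precoder: W = H^H (H H^H)^-1, then normalize each user's column.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)
W /= np.linalg.norm(W, axis=0, keepdims=True)

# The effective channel H @ W is diagonal (up to numerical precision): each user
# hears only its own stream, which is how one frequency serves several users.
effective = H @ W
print(np.round(np.abs(effective), 3))
```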

Antenna Experiments Yield Military Benefits

December 1, 2012
By Rita Boland

Academic investigations are establishing the future of transmission technology for troops and civilians.

Improving antennas for defense or commercial purposes has as much to do with mathematics as it does with hardware. Researchers in the Wireless Networking and Communications Group at the University of Texas at Austin are exploring algorithms along with other properties that should improve communications systems on the battlefield.

A key focus of the work has homed in on multiple input, multiple output (MIMO) technology, which features many transmit and receive antennas. A large portion of that effort involves studying limited feedback, the idea that if receivers can send information back to the original transmitter, that transmitter can better configure the link. This process should reduce interference and has applications for MIMO and cellular communications. Dr. Robert W. Heath Jr., director of the Wireless Networking and Communications Group (WNCG), says the research has advanced beyond single point-to-point links to examine how base stations can connect to many users and how to coordinate multiple base stations together to reduce interference.
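
A minimal sketch of codebook-based limited feedback, one common form of the idea described above, might look like the following. The codebook, its size and the channel are illustrative assumptions rather than anything drawn from the WNCG’s research.

```python
import numpy as np

rng = np.random.default_rng(1)
Nt, bits = 4, 3     # transmit antennas, feedback bits per channel report

# Illustrative random unit-norm codebook of 2**bits beamforming vectors,
# known to both transmitter and receiver ahead of time.
codebook = rng.standard_normal((2**bits, Nt)) + 1j * rng.standard_normal((2**bits, Nt))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

# Receiver: estimate the channel, pick the codeword best aligned with it,
# and feed back only the index (a handful of bits, not the full channel).
h = rng.standard_normal(Nt) + 1j * rng.standard_normal(Nt)
gains = np.abs(codebook @ h.conj()) ** 2
index = int(np.argmax(gains))

# Transmitter: beamform with the codeword the receiver selected.
f = codebook[index]
print(f"fed-back index: {index}")
print(f"beamforming gain: {gains[index]:.2f} vs. ideal {np.linalg.norm(h) ** 2:.2f}")
```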

Team members are looking into the fundamental limitations of such systems, and their results demonstrate that interference cannot be eliminated completely through the coordination of base stations alone. “That’s been something that’s surprising,” Heath states. Graduate students under his direction also are studying new analysis techniques in which they try to understand system performance and how antennas would play a role in situations with randomly located base stations. On the cellular side, experiments are underway to see how antennas can improve facets of functionality. Team members are exploring how distributing antennas throughout the cell instead of locating them all at the base station impacts performance.
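
The role of randomly located base stations can be pictured with a simple simulation in the same spirit: drop base stations at random, connect a user to the nearest one, and see how much interference remains as progressively more nearby cells coordinate. The density, path-loss exponent and window size below are assumptions, and the model is far cruder than the group’s analysis; it is included only to show why coordinating a handful of neighbors never removes interference entirely.

```python
import numpy as np

rng = np.random.default_rng(2)
density = 1e-6            # base stations per square metre (assumed)
area_side = 20_000.0      # simulate a 20 km x 20 km window
alpha = 3.5               # path-loss exponent (assumed)

# Poisson-distributed base stations scattered uniformly over the window.
n_bs = rng.poisson(density * area_side**2)
bs_xy = rng.uniform(-area_side / 2, area_side / 2, size=(n_bs, 2))

# A user at the origin is served by the nearest base station.
dist = np.linalg.norm(bs_xy, axis=1)
order = np.argsort(dist)
signal = dist[order[0]] ** -alpha

for cluster_size in (1, 2, 4, 8):
    # The serving cell and its (cluster_size - 1) nearest neighbours coordinate,
    # so they contribute no interference; every cell outside the cluster still does.
    interference = np.sum(dist[order[cluster_size:]] ** -alpha)
    sir_db = 10 * np.log10(signal / interference)
    print(f"cluster of {cluster_size} coordinated cells -> SIR ~ {sir_db:5.1f} dB")
```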

Researchers Whip Up Antenna Technology

December 1, 2012
By George I. Seffers

U.S. Army officials seek to replace the commonly used device.

For decades, the U.S. Army has relied on the ubiquitous whip antenna for an array of air and ground communications, but those antennas often interfere with one another and are plainly visible to enemy soldiers in search of a target. Now, service researchers are using a wide range of technologies that could begin replacing the pervasive whip, providing more efficient, effective and reliable combat communications. Options include antennas embedded with vehicle armor, transparent antennas integrated into windshields and smart antenna technology capable of determining the optimal direction to focus transmission power.
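
One way to picture the smart antenna idea mentioned above is a phased array that applies per-element phase weights to point its main beam in a chosen direction. The sketch below is a textbook uniform-linear-array illustration, not a CERDEC design; the element count, spacing and steering angle are arbitrary.

```python
import numpy as np

def steering_vector(n_elements: int, spacing_wavelengths: float, angle_deg: float) -> np.ndarray:
    """Element phases seen by a uniform linear array for a wave arriving from angle_deg."""
    n = np.arange(n_elements)
    phase = 2 * np.pi * spacing_wavelengths * n * np.sin(np.radians(angle_deg))
    return np.exp(1j * phase)

def array_gain(weights: np.ndarray, spacing_wavelengths: float, angle_deg: float) -> float:
    """Power response of the weighted array toward a given direction."""
    response = steering_vector(len(weights), spacing_wavelengths, angle_deg)
    return float(np.abs(np.vdot(weights, response)) ** 2) / len(weights)

# Steer an 8-element, half-wavelength-spaced array toward 30 degrees and compare
# the power delivered in that direction with two other directions.
weights = steering_vector(n_elements=8, spacing_wavelengths=0.5, angle_deg=30.0)
for direction in (0.0, 30.0, 60.0):
    print(f"gain toward {direction:4.1f} deg: {array_gain(weights, 0.5, direction):6.2f}")
```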

The rule of thumb on the battlefield is that the more antennas sticking off of a vehicle, the more likely the vehicle is a high-value target with a high-ranking occupant. But this situation could change as officials at the Communications-Electronics Research, Development and Engineering Center (CERDEC), Aberdeen Proving Ground, Maryland, investigate technologies for short- and long-term replacements for whip antennas, whether for dismounted, mounted or airborne communications.

“The antenna is the intermediary between the radio and the network. You can have a state-of-the-art radio and a very substantive network, but if you don’t have the antenna, the whole thing falls apart,” says Mahbub Hoque, acting director of CERDEC’s Space and Terrestrial Communications Directorate. “We have programs in this directorate—in the antenna division—starting from basic research to develop prototypes to technology ready to transition to the program managers and program executive officers.”

Technologies Advance the Art of Antenna Science

December 1, 2012
By George I. Seffers

U.S. Air Force researchers use 3-D printers and other cutting-edge concepts to create the next innovations.

There is no Moore’s Law for antennas because size reduction and performance improvement will always be subject to the limitations imposed by electromagnetic physics and material properties. But steady advances in computer technologies, such as electromagnetic modeling and simulation and 3-D printing, enable antenna technology researchers to push the limits of possibility on behalf of the warfighters.

Scientists and engineers at the U.S. Air Force Research Laboratory (AFRL), Antenna Technology Branch, Wright-Patterson Air Force Base, Ohio, are taking advantage of these technological advances to develop next-generation antennas. Experts say metamaterials show great promise for military antennas, but the technology is not yet at a point where it is being manufactured widely. To help overcome that challenge, Air Force researchers use a 3-D printer to prototype antenna metamaterials that potentially could advance technology beyond the more conventional microstrip antenna. Small, lightweight, low-cost microstrip antennas, which were invented about four decades ago, are used in military aircraft, missiles, rockets and satellite communications as well as in the commercial sector.
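
For context on the conventional baseline, the dimensions of a rectangular microstrip patch follow standard textbook design equations relating resonant frequency, substrate permittivity and substrate thickness. The sketch below applies those equations; the 2.4 GHz frequency and FR-4-like substrate are illustrative choices, not anything specific to the AFRL work.

```python
import math

def microstrip_patch(freq_hz: float, eps_r: float, h_m: float) -> tuple[float, float]:
    """Standard textbook dimensions (width, length) of a rectangular microstrip
    patch resonant at freq_hz on a substrate of permittivity eps_r and thickness h_m."""
    c = 3e8
    W = c / (2 * freq_hz) * math.sqrt(2 / (eps_r + 1))
    # Effective permittivity accounts for fields fringing into the air above the patch.
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (1 + 12 * h_m / W) ** -0.5
    # Fringing fields make the patch look electrically longer by dL at each end.
    dL = (0.412 * h_m * (eps_eff + 0.3) * (W / h_m + 0.264)
          / ((eps_eff - 0.258) * (W / h_m + 0.8)))
    L = c / (2 * freq_hz * math.sqrt(eps_eff)) - 2 * dL
    return W, L

# Illustrative 2.4 GHz patch on an FR-4-like substrate (eps_r = 4.4, 1.6 mm thick).
W, L = microstrip_patch(2.4e9, 4.4, 1.6e-3)
print(f"patch width  ~ {W * 1000:.1f} mm")
print(f"patch length ~ {L * 1000:.1f} mm")
```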

“It allows us a capability in rapid prototyping that we didn’t have before,” says David Curtis, the AFRL’s Antenna Technology Branch chief. “It’s yielding some interesting things. It’s creating new ground planes for antenna elements.”
