
SIGNAL Online Exclusives

Google Glass Sharpens View of Wearable Computer Future

August 27, 2013
By Rachel Lilly

Cutting-edge consumer technology that once seemed possible only in science-fiction films now is in the hands of experts and innovators working to solve government challenges. From wearable mobile devices to a sensor that lets you control your screen with the wave of a hand or lift of a finger, these tools could one day be key to serving soldiers in the field.

Thermopylae Sciences and Technology, based in Arlington, Virginia, is one defense contractor pushing the technology envelope to apply commercial solutions to government problems. By participating in the Google Glass Foundry and Explorer programs, which give early adopters access to the technology, the company recently acquired several Google Glass devices. Through this program and partnerships with other technology companies, Thermopylae now is experimenting with how wearable computers could integrate with its current and future products.

“What we’re able to do is to start prototyping and working with these devices in conjunction with the tools and technology we’re developing for actual government programs,” John-Isaac Clark, chief innovation officer for Thermopylae, says. As part of the Google Glass effort, “We can get early access to some of these technology improvements and then wonder … how might this technology let me interact with the user in a different way altogether?”

Clark, brother of Thermopylae President A.J. Clark, is a self-proclaimed geek and wears the glasses up to eight hours per day to explore the capabilities. While Google focuses on the commercial and consumer product spaces, Clark believes Google Glass sheds light on the potential for wearable computers in the military realm. “Eventually, while it is incredibly unlikely to be Google Glass, a soldier will have something like this.”

Don’t Blink! Eyes Provide Long-Term Identification

August 26, 2013
By Rita Boland

Iris scans are a legitimate form of biometric identification over the long term, a new study from the National Institute of Standards and Technology confirms.

White House Cyber Policy Focuses on Internal Consolidation, External Engagement

August 21, 2013
By Henry Kenyon

As a part of its ongoing efforts to protect critical national infrastructure, the Obama administration has been actively working on making government computer networks more robust and resistant to cyber attack. To do this, the White House has looked internally at federal agencies to put into place new metrics and policies to improve their security stance and externally, reaching out to foreign governments to set up international accords on cyber espionage, a top administration official said.
 
The administration has several major priorities for its cyberspace policy: protecting critical infrastructure, securing the government, engaging internationally, and shaping the future, explained Andy Ozment, the White House’s senior director for cybersecurity.
 
Speaking at the USENIX Security Symposium in Washington, D.C., on August 15, he said that as part of its overall cyberspace goals, the Obama administration is actively pursuing international engagement and cooperation. This is a necessity because most cyberspace intrusions originate overseas, he said, adding that the issue also touches on diplomacy. The term “attack” carries political implications that can potentially lead to direct conflict with a nation. Intrusions, on the other hand, fall under the category of espionage, an area with well-established protocols for working with other nations, he said.
 

Cyber Threats Abound, but Their Effects Are Not Certain

July 31, 2013
By Robert K. Ackerman

Protecting the nation from cyber attack entails deterring or preventing marauders from carrying out their malevolent plans. But, while government and the private sector endeavor to fight the menace jointly, evildoers constantly change their approaches and learn new ways of striking at vulnerable points. So many variables have entered the equation that even the likelihood of attacks—along with their effects—is uncertain.
 

Change Ahead for Global Governance of the Internet

July 19, 2013
By Max Cacas

In October 2014, the International Telecommunication Union (ITU) is scheduled to convene its Plenipotentiary Conference in Busan, South Korea. The conference, held every four years, is a venue for ITU member nations, including the United States, to discuss matters pertaining to the management of the planet’s telecommunications infrastructure. One of the most contentious issues dealt with in previous meetings of this type is global governance of the Internet. And now, one of the people who was “present at the creation” and helped develop the initial technical protocols for what would become the Internet is speaking out on the governance issue and the possibility of continued contention at next year’s ITU meeting.

Vint Cerf is vice president and chief Internet evangelist with Google in Reston, Virginia. Because the Internet as it is known today is largely the result of the American military’s investment in its predecessor, the ARPANET, Cerf describes the U.S. role in the governance of the global network as “seminal.” As the Internet has evolved from being a resource for academics and government scientists, to a fast-growing network for commercial business, and now, a global communications tool, so too has the governance of what Cerf calls “the Internet ecosystem.”

Cerf, who co-designed the TCP/IP protocols for transmitting data packets, says that the “distributed” nature of the Internet, which he calls “an engine of permissionless invention,” is also at the heart of the challenges of governance.

Spectrum Management System Deploying to Afghanistan

July 11, 2013
By George I. Seffers

The U.S. Army is currently delivering a new and improved Coalition Joint Spectrum Management and Planning Tool (CJSMPT) to divisions scheduled for deployment in Afghanistan. The software automates the spectrum management process, dramatically reducing the amount of time and paperwork associated with spectrum allocation and mission planning in a tactical environment.

For operational security reasons, Army officials cannot reveal exactly which divisions will be receiving the systems or when, but for the next few months, they will be working to get the system out to Afghanistan.

Warfighters are continually confronted with an increasingly crowded radio spectrum—too many devices transmitting on a limited range of frequencies and interfering with one another. Poor spectrum availability can have a devastating effect on operations, and spectrum management normally is a complex and time-consuming process involving frequency access requests that must be approved at multiple levels. “There’s a lot of paperwork associated with the spectrum management process. There are thousands of these [requests] that have to be prepared, submitted, received and reconciled down at the brigade level. Normally, this could take days or even weeks in preparation for a mission or deployment, and CJSMPT can do this in a matter of hours. It provides automation to the spectrum manager to reduce the complexity of his tasks,” says Bob Shields, chief of the Spectrum Analysis and Frequency Management Branch, Space and Terrestrial Communications Directorate, U.S. Army Communications-Electronic Research, Development and Engineering Center (CERDEC), Aberdeen Proving Ground, Maryland.

Citing Cost, Innovation and Flexibility, Navy Awards NGEN Contract to HP Group

June 27, 2013
By Robert K. Ackerman

The U.S. Navy has programmed change into its $3.45 billion Next-Generation Enterprise Network (NGEN) contract.

 

U.S. Army Welcomes Two New Draft Horses to Supercomputing Stable

June 21, 2013
By Max Cacas

The U.S. Army Research Laboratory (ARL) at Aberdeen Proving Ground, Maryland, has unveiled two new supercomputers that are among the fastest and most powerful devices of their kind. The devices are part of a recently opened supercomputing center that is the new locus of the service’s use of high-speed computing not only for basic scientific research and development, but also for addressing warfighter needs using the latest available technologies.

“The Army Research Lab is the largest user of supercomputing capacity,” says Dale Ormond, director, U.S. Army Research Development and Engineering Command (RDECOM). “To have a supercomputer there gives us a huge advantage as we move forward in our research and engineering mission,” he adds.

At the heart of the new Army supercomputer center are two IBM iDataPlex systems that are among the most powerful of their kind on the planet. “We have the ‘Pershing,’ which is the 62nd fastest computer in the world, and another one called ‘Hercules,’ which is the 81st (fastest),” he explains. The Pershing contains 20,160 central processing units (CPUs), 40 terabytes of memory, and operates at 420 teraflops. The Hercules has 17,472 CPUs, 70 terabytes of memory, and operates at 360 teraflops.

The $5 million center also features state-of-the-art electrical supply systems designed to support supercomputing, as well as special cooling systems designed to manage the heat generated by the CPUs that make up both supercomputers. The new facility has more than 20,000 square feet of space and will eventually house as many as six large supercomputing systems by 2016.

Pershing and Hercules join other Army supercomputers run by the U.S. Army Corps of Engineers in Vicksburg, Mississippi, along with supercomputers operated by the Navy and Air Force.

Streamlining Coalition Mission Network Participation

June 17, 2013
By George I. Seffers

NATO and eight coalition nations participating in the Coalition Warrior Interoperability eXploration, eXperimentation, eXamination eXercise (CWIX) are working to reduce the amount of time it takes to join coalition networks in the future. On average, it took a year or more for a nation to join the Afghan Mission Network, but officials hope to trim that to a matter of weeks, says Lt. Col. Jenniffer Romero, USAF, the CWIX Future Mission Network focus area lead.

“On average, it was taking a year, maybe 18 months, for a nation to join the Afghan Mission Network, and usually we don’t have that much time,” says Col. Romero, who also serves as the chief, cyber assessments for the U.S. Joint Staff J6 Command, Control, Communications and Computers Assessments Division.

The network for future operations will be a federated network modeled after the Afghan Mission Network, for which NATO offered the core infrastructure that participating nations could connect with using their own networks. Col. Romero explains that the goal is to have core services up and running on “day zero,” which she defines as the day pre-deployment orders drop. “Our goal is for the lead nation or lead organization to have the core up and running on that day and for people to be able to join within weeks as opposed to months and months,” she says.

To streamline the process, officials are creating templates of instructions for joining future coalition networks, which NATO officials refer to as the Future Mission Network and U.S. officials dub the Mission Partner Environment. For the CWIX exercise, which runs from June 3-20, they have built a mission network that includes core services such as voice, chat, email and document handling. “We’re assessing those core enterprise services on a future mission network that was built for CWIX 13 specifically for that purpose,” the colonel states.

Cyber Commander Calls for Consolidated Activities

June 12, 2013
By Robert K. Ackerman

In the midst of a raging controversy over widespread National Security Agency (NSA) monitoring, the head of the NSA and U.S. Cyber Command defends cyber surveillance efforts and calls for greater consolidation of cyber activities among diverse organizations.
