
Cyber

Government Seeks New Identity Markers

September 1, 2013
By Max Cacas

 

In the next few years, usernames and passwords could gradually fade from popular use as a way to conduct business online. A public/private coalition is working on a new policy and technical framework for identity authentication that could make online transactions less dependent on these increasingly compromised identity management tools. A second round of federal grants from the group, expected this fall, will fund continued work on what is envisioned to become a private sector-operated identity management industry.

“The fact is that the username and password are fundamentally broken, both from a security standpoint as well as a usability standpoint,” says Jeremy Grant, senior executive adviser for identity management with the National Institute of Standards and Technology (NIST), an agency of the Department of Commerce. As a result of such security weaknesses, cybercrime costs individuals and businesses billions of dollars every year. According to NIST, the federal agency tasked with setting cybersecurity standards, an estimated 11.7 million Americans were victims of some form of identity theft, including online identity theft, over a recent two-year period.
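One direction such a framework can take is to replace shared secrets with credentials that can be verified cryptographically. As a rough sketch of that idea only, and not the NSTIC framework or any specific product, the Python fragment below (which assumes the third-party cryptography package; the names are invented for illustration) shows a challenge-response login built on an Ed25519 key pair. The relying party stores only a public key and issues a one-time challenge, so there is no password to steal, guess or reuse.

```python
# Minimal sketch of password-less, challenge-response authentication.
# Illustrative only: not the NSTIC framework; assumes the third-party
# "cryptography" package, and the variable names are hypothetical.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- enrollment: the server stores only the user's public key ---
user_key = Ed25519PrivateKey.generate()    # stays on the user's device
server_record = user_key.public_key()      # stored by the relying party

# --- login: the server issues a one-time random challenge ---
challenge = os.urandom(32)

# The user's device signs the challenge; no shared secret crosses the wire.
signature = user_key.sign(challenge)

# The server verifies the signature against the stored public key.
try:
    server_record.verify(signature, challenge)
    print("authenticated")
except InvalidSignature:
    print("rejected")
```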

Transforming NATO's Information Technology Architecture

September 1, 2013
By George I. Seffers

 

NATO officials are laying the groundwork for a centralized enterprise networking architecture with invitations to bid expected to be released by year’s end. The new approach is expected to offer a number of benefits, including cost savings, improved network reliability, enhanced cybersecurity and greater flexibility for warfighters.

Officials at the NATO Communications and Information (NCI) Agency kicked off the alliancewide effort in August of last year shortly after the agency was created. The initial goal was simply to examine the alliance’s information technology infrastructure, how it could be modernized, where efficiencies could be gained and how to make the business case for modernization. The NCI Agency partnered with the Network Centric Operations Industry Consortium (NCOIC) for the study. “We didn’t want to take just an academic view or an internal belly-button look. We wanted to get industry involved and find out what is within the realm of possibility today,” says Peter Lenk, chief, Capability Area Team Seven, NCI Agency.

The result will be a historical transition for the alliance. “We are for the first time, or one of the first times in NATO, looking at things as an enterprise. We’re starting to try to consolidate things across traditional boundaries,” Lenk says. “Through the creation of the NCI Agency, which has a mandate across all of the components of NATO, now we have within our grasp the ability to do this, and we can clearly see the advantages.”

Army Signal Expands Its Reach

September 1, 2013
By Robert K. Ackerman

The U.S. Army Signal Corps is expanding the work its personnel conduct while dealing with technology and operational challenges that both help and hinder its efforts. On the surface, Army signal is facing the common dilemma afflicting many other military specialties—it must do more with fewer resources.

Bringing Together Signal and Cyber

September 1, 2013
By Paul A. Strassmann

In his June interview with SIGNAL Magazine, Gen. Keith B. Alexander advocated bringing together the signal community, signals intelligence and the cyber community. In that interview, he said, “We need to think of ourselves not as signals, not as intelligence, not as cyber, but instead as a team that puts us all together.” Yet, that goal raises several questions. How can these concepts be achieved? How can a combination of more than 15,000 system enclaves from the U.S. Army, Navy, Marine Corps and Air Force become interoperable? What technologies are needed in the next five years while insufficient budgets make consolidations difficult?

Open Data Initiative: Providing Fresh Ideas on Securely Sharing Information

August 30, 2013
By Paul Christman and Jamie Manuel

For years, the Defense Department took a “do it alone” posture when it came to sharing information and protecting its networks and communications infrastructure from cyber attacks. Now, in an interconnected world of reduced budgets and ever-increasing security risks, the DOD is fundamentally changing the way it approaches information sharing and cybersecurity.

White House Cyber Policy Focuses on Internal Consolidation, External Engagement

August 21, 2013
By Henry Kenyon

As a part of its ongoing efforts to protect critical national infrastructure, the Obama administration has been actively working to make government computer networks more robust and resistant to cyber attack. To do this, the White House has looked internally, directing federal agencies to put in place new metrics and policies to improve their security posture, and externally, reaching out to foreign governments to set up international accords on cyber espionage, a top administration official said.
 
The administration has several major priorities for its cyberspace policy: protecting critical infrastructure, securing the government, engaging internationally, and shaping the future, explained Andy Ozment, the White House’s senior director for cybersecurity.
 
Speaking at the USENIX Security Symposium in Washington, D.C., on August 15, he said that as part of its overall cyberspace goals, the Obama administration is actively pursuing international engagement and cooperation. This is a necessity because most cyberspace intrusions come from overseas, he said, and it also touches on diplomatic issues: the term “attack” carries political implications that can potentially lead to direct conflict with a nation. Intrusions, on the other hand, fall under the category of espionage, an area where there are well-established protocols for working with other nations, he said.
 

Tech Transfer Thrives

August 20, 2013

 

Investors, integrators and information technology companies this week will see eight government-developed emerging cybersecurity technologies ready for transition into the commercial sector. Capabilities to be unveiled include intrusion detection, removable media protection, software assurance and malware forensics. The technology demonstration day, which takes place in San José, California, on August 22, gives investors and the business sector the opportunity to view laboratory prototypes of the cybersecurity products in action.

Michael Pozmantier, program manager, Transition-to-Practice, Science and Technology Directorate, U.S. Department of Homeland Security, and technology developers from the U.S. Department of Energy’s national laboratories will be on hand to discuss the capabilities and their potential.

Transition-to-Practice events are held several times a year at locations around the United States.

Incentivizing Companies to Manage Cyber Risks Better

August 9, 2013

 

The White House is developing a core set of practices to help organizations manage cybersecurity risk. This Cybersecurity Framework will be available in draft form in October and finalized in February 2014. At that time, officials will create the Voluntary Program to encourage critical infrastructure companies to adopt the framework. Until then, the government is looking at ways to incentivize companies to participate. Some recommended incentives can be adopted quickly, while others will require legislative action and additional work. The White House is collaborating with the appropriate agencies now to move forward and to prioritize incentive areas, including cybersecurity insurance, grants, process preference, liability limitation, streamlined regulations, public recognition, rate recovery for price-regulated industries and cybersecurity research. For more detailed information, visit the White House Blog.

 

 

AFCEA Answers: GSA’s McClure Cites Two Factors for Security in the Cloud

August 5, 2013
By Max Cacas

When it comes to cloud computing, two items are top of mind for Dave McClure, associate administrator with the General Services Administration (GSA) in Washington, D.C.
 
“One is boundaries. Where does a cloud service provider’s authorization and control begin and end?” he notes during a recent edition of the “AFCEA Answers” radio show. McClure goes on to explain that while an infrastructure provider might have a given set of controls and responsibilities, there are software applications that, as he puts it, “sit on top of that infrastructure. Who owns the apps, and who is responsible for security in the application space?”
 
McClure, who has had a long career in information technology in both the private sector and government, suggests that the other challenging security area in today’s cloud computing environment deals with defining the business side of cloud. “There’s some confusion between security controls and contractual terms that deal with access issues, location issues, and usage, some of which are contract, more than straight security concerns. Getting all of that right—the boundaries, the authentication piece, the contract piece—there’s definitely a lot to pay attention to in the cloud space.”
 
Edwin Elmore, cloud computing business development manager with Cisco Systems in Washington, sees the challenge of security in the cloud as one of “taking the physical world and moving it to the virtualized world. When you look at cloud computing, it’s a heavily virtualized environment, so the same controls you have around a physical perimeter in your physical data center, now you have to extend it to the virtualized world.” And that, he says, includes applying the same security protocols when it comes to virtual machines exchanging data with each other.
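Elmore's point, carrying perimeter-style controls over to traffic between virtual machines, is often enforced as an allow-list applied in the hypervisor or virtual network rather than at a physical firewall. The Python sketch below is a toy illustration of that idea only, not any Cisco or cloud-provider feature; the zone names, ports and virtual machine names are invented for the example.

```python
# Toy illustration of extending "perimeter" rules to inter-VM traffic.
# Zones, ports and VM names are hypothetical, not a vendor's product.
ALLOWED_FLOWS = {
    ("web", "app"): {443},    # web tier may call the app tier over TLS
    ("app", "db"): {5432},    # app tier may reach the database
}

VM_ZONES = {"vm-web-01": "web", "vm-app-01": "app", "vm-db-01": "db"}


def permit(src_vm: str, dst_vm: str, port: int) -> bool:
    """Apply the same allow-list logic to VM-to-VM traffic that a
    physical firewall would apply at the data-center perimeter."""
    flow = (VM_ZONES.get(src_vm), VM_ZONES.get(dst_vm))
    return port in ALLOWED_FLOWS.get(flow, set())


print(permit("vm-web-01", "vm-app-01", 443))    # True
print(permit("vm-web-01", "vm-db-01", 5432))    # False: web may not reach the database directly
```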
 

Software Increases Unmanned Craft Survivability

August 1, 2013
By George I. Seffers and Robert K. Ackerman

 

The U.S. Defense Advanced Research Projects Agency is developing new control software to reduce the vulnerability of unmanned systems to cyber attack. This effort is relying on new methods of software development that would eliminate many of the problems inherent in generating high-assurance software.

Unmanned vehicles suffer from the same vulnerabilities as other networked information systems. But, in addition to their data being co-opted, unmanned systems can be purloined if adversaries seize control of them. This problem also applies to human-crewed systems with computer-controlled components.

If the research program is successful, then unmanned vehicles will be less likely to be taken over by an enemy. Warfighters could trust that the unmanned vehicle on which they are relying will not abandon its mission or become a digital turncoat.

This security would extend to other vulnerable systems as well. Networked platforms and entities ranging from automobiles to supervisory control and data acquisition (SCADA) systems could benefit from the research. The vulnerability of SCADA systems is well-established, but only recently has research shown that automobiles can be co-opted through their computer-controlled systems. The program’s goal is to produce high-assurance software for military unmanned vehicles and then enable its transfer to industry for commercial uses.

The Defense Advanced Research Projects Agency (DARPA) program is known as High-Assurance Cyber Military Systems, or HACMS. Kathleen Fisher, HACMS program manager, says the program is aiming to produce software that is “functionally correct and satisfying safety and security policies.

“It’s not just that you’re proving the absence of a particular bad property from the security perspective,” she explains. “You’re actually positively proving that the software has the correct behavior.”
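Fisher's distinction, between showing that a particular flaw is absent and proving that the code behaves correctly for every input, is the essence of formally verified software. The fragment below is a toy illustration of that idea written for the Lean proof assistant; it is not HACMS code or tooling, and the function and theorem names are invented for the example.

```lean
-- Toy illustration of "positively proving" a property; not HACMS code.
-- A hypothetical throttle-limiting helper for a flight controller:
def clampThrottle (x : Nat) : Nat :=
  if x ≤ 100 then x else 100

-- A test could only check a few inputs; this theorem proves that the
-- output never exceeds 100 for every possible input.
theorem clampThrottle_le (x : Nat) : clampThrottle x ≤ 100 := by
  unfold clampThrottle
  split
  · assumption
  · exact Nat.le_refl 100
```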
