AFCEA Answers

AFCEA Answers: A Plan for Protecting the Nation's Infrastructure

February 24, 2014
By Max Cacas

It's important for the government, working with industry, to have a plan in place to protect the nation’s critical infrastructure, according to Suzanne Spaulding, acting undersecretary for the National Protection and Programs Directorate with the U.S. Department of Homeland Security, who spoke during a recent episode of AFCEA Answers.

AFCEA Answers: Cybersecurity Training Must Encompass All Work Force Needs

November 22, 2013
By Max Cacas

For more than a decade, experts have been forecasting a shortage in trained cybersecurity professionals. And the demand for those experts continues as government and industry note an uptick in the number and nature of cyberthreats. Experts weigh in for the latest episode of AFCEA Answers.

AFCEA Answers: The Next Generation of Identity Assurance

November 8, 2013
By Max Cacas

When it comes to cybersecurity, one of the biggest challenges is verifying the identity of the end user, whether it's for an e-commerce site or a secure government database. Experts weigh in during the latest episode of AFCEA Answers.

AFCEA Answers: Frustrating Wait to Acquire Emerging Information Technology

September 27, 2013
By Max Cacas

The chief information officer for the U.S. Marine Corps says that in an era when he and his colleagues in the American military would like tactical radios to be a cross between a walkie-talkie and a smartphone, there is a big challenge to be overcome. And no, it has nothing to do with bandwidth, storage or even the device itself, although all of those are important considerations.

Instead, Brig. Gen. Kevin Nally, USMC, director, command, control, communications and computers, and chief information officer (CIO), U.S. Marine Corps, says, “I am frustrated by waiting too long to get current, emerging information technology into the infrastructure.” Gen. Nally made his comments during a recent edition of the AFCEA Answers radio program.

The general says this frustration on his part is compounded by the reality that because his service is part of the Department of the Navy, he does not have the same kind of acquisition authority that CIO colleagues in other services and most civilian agencies enjoy.

“At the speed of cyber, I need it now, I don’t need it tomorrow. I think with the handheld devices that we will use in the future, that we are developing right now with DISA [the Defense Information Systems Agency] are good, they continue to improve, DISA has been very helpful. But it has to be secure in a cyber environment,” he says.

Gen. Nally also notes that young Marines just entering the service within the last few years are easier to train using new handheld devices, especially those based on commercial smartphone technology, because they grew up using similar devices.

AFCEA Answers: The Five "Vs" of Big Data

September 13, 2013
By Max Cacas

In considering how best to manage the challenges and opportunities presented by big data in the U.S. Defense Department, Dan Doney, chief innovation officer with the Defense Intelligence Agency (DIA), says the current best thinking on the topic centers on what he calls “the five Vs.”

Appearing on a recent episode of the AFCEA Answers radio program, Doney says it’s important to always consider “volume, velocity, variety, veracity and value” when trying to manage and take advantage of big data.

“Volume gets the most attention,” he says, noting that most people focus on datasets measured in terabytes and petabytes. “In fact, though, that’s the one in which we’ve made the most progress.”

When it comes to “velocity,” or the rate at which large datasets often pour into servers, Doney notes that many algorithms originally designed for static databases now are being redesigned to handle datasets that require disparate types of data to be interconnected with metadata to be useful.

Doney goes on to say that “variety” remains one of the last three challenges when it comes to big data for his agency because of the DIA’s mandate to create a “big picture” that emerges from all that information. And he says that solutions have so far not caught up with the DIA’s needs.

Doney says “veracity,” or the “ability to put faith behind that data,” becomes a challenge when one needs to attach equivalent amounts of context to disparate data types to add important detail to that “big picture.”
 

Brian Weiss, vice president, Autonomy/HP, says that when it comes to the “value” of big data, some of the most exciting innovation lies in how to distinguish and sort important information from within huge datasets.
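
To make the five-V framework a bit more concrete, here is a minimal Python sketch. It is purely illustrative and not drawn from the program; the record fields, source weights and threshold are hypothetical. It tags each incoming record with simple stand-ins for variety, veracity and velocity, then filters for value.

# Illustrative only: a toy "five Vs" pass over a batch of records.
# Field names, source weights and the scoring rule are hypothetical.
from dataclasses import dataclass, field
import time

@dataclass
class Record:
    source: str          # variety: where the data came from
    payload: dict        # the data itself
    confidence: float    # veracity: 0.0 (unverified) to 1.0 (corroborated)
    received: float = field(default_factory=time.time)  # velocity: arrival time

def value_score(rec: Record) -> float:
    # Hypothetical rule: corroborated records from priority sources score higher.
    priority = {"sensor": 0.5, "report": 0.8, "open_source": 0.3}
    return rec.confidence * priority.get(rec.source, 0.1)

def triage(records, threshold=0.4):
    # Volume: count everything that arrived; value: keep only what clears the bar.
    kept = [r for r in records if value_score(r) >= threshold]
    print(f"received {len(records)} records, kept {len(kept)}")
    return kept

triage([Record("report", {"msg": "corroborated sighting"}, confidence=0.9),
        Record("open_source", {"msg": "unverified post"}, confidence=0.2)])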

Szykman: Turning Big Data Into Big Information

August 30, 2013
By Max Cacas

 
Current efforts to deal with big data, the massive amounts of information resulting from an ever-expanding number of networked computers, storage and sensors,  go hand-in-hand with the government’s priority to sift through these huge datasets for important data.  So says Simon Szykman, chief information officer (CIO) with the U.S. Department of Commerce.
 
He told a recent episode of the “AFCEA Answers” radio program that the current digital government strategy includes initiatives related to open government and sharing of government data. “We’re seeing that through increased use of government datasets, and in some cases, opening up APIs (application programming interfaces) for direct access to government data.  So, we’re hoping that some of the things we’re unable to do on the government side will be done by citizens, companies, and those in the private sector to help use the data in new ways, and in new types of products.”
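
As a rough illustration of the kind of direct API access Szykman describes, the short Python sketch below pulls a dataset over HTTP using the requests library. The endpoint, dataset identifier, parameters and API key are placeholders, not a real agency interface.

# Hypothetical example of consuming an open government data API.
# The endpoint, parameters and response fields are placeholders.
import requests

def fetch_dataset(dataset_id: str, api_key: str):
    url = f"https://api.example.gov/v1/datasets/{dataset_id}"  # placeholder URL
    response = requests.get(url, params={"api_key": api_key, "format": "json"},
                            timeout=30)
    response.raise_for_status()
    return response.json()

# Example call (only meaningful against a real endpoint and key):
# records = fetch_dataset("commerce-trade-stats", api_key="DEMO_KEY")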
 
At the same time, the source of all that data is itself creating big data challenges for industry and government, according to Kapil Bakshi, chief solution architect with Cisco Public Sector in Washington, D.C.
 
“We expect as many as 50 billion devices to be connected to the internet by the year 2020.  These include small sensors, control system devices, mobile telephone devices.  They will all produce some form of data that will be collected by the networks, and flow back to a big data analytics engine.”  He adds that this forthcoming “internet of things,” and the resultant datasets, will require a rethinking of how networks are configured and managed to handle all that data. 
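
To picture the data path Bakshi outlines, here is a minimal, standard-library-only sketch of a sensor posting one telemetry reading to a back-end analytics collector. The collector URL and the reading format are assumptions made for illustration.

# Toy sensor-to-analytics pipeline; the collector endpoint is hypothetical.
import json
import time
import urllib.request

COLLECTOR_URL = "https://analytics.example.gov/ingest"  # placeholder

def post_reading(sensor_id: str, temperature_c: float):
    reading = {"sensor": sensor_id, "temp_c": temperature_c, "ts": time.time()}
    req = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status  # 200 means the analytics engine accepted the reading

# Example call (requires a reachable collector):
# post_reading("field-sensor-17", temperature_c=21.4)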
 

DeVries: A New Definition for Enterprise Services

August 16, 2013
By Max Cacas

Now that the Joint Information Environment (JIE) has become one of the top priorities for Department of Defense information technology officials (see Gaining Consensus on the JIE, June 2013), it is more important than ever to make sure that this new paradigm of extending voice, data and multimedia to the warfighter integrates well with existing enterprise services within the military. That’s according to David DeVries, the deputy chief information officer for information enterprise with the Defense Department, speaking on a recent edition of the AFCEA Answers radio program.

“The Joint Information Environment is about moving to more enterprise level things. But just because I start to consolidate, and I reduce the number of my data centers, and I may reduce the number of my databases, I’m going to rely more and more on technology innovation to help me to achieve the economies of scale and size,” he says. In one example, he explains that the challenge is to take existing databases used to support military mission requirements and adapt them to operate in a cloud computing environment, which in turn would allow those databases to be used for other, related purposes.

AFCEA Answers: GSA’s McClure Cites Two Factors for Security in the Cloud

August 5, 2013
By Max Cacas

When it comes to cloud computing, there are two items that are top of mind for Dave McClure, associate administrator with the General Services Administration (GSA) in Washington, D.C.
 
“One is boundaries. Where does a cloud service provider’s authorization and control begin and end?” he noted on a recent edition of the “AFCEA Answers” radio show. McClure goes on to explain that while an infrastructure provider might have a given set of controls and responsibilities, there are software applications that, as he puts it, “sit on top of that infrastructure. Who owns the apps, and who is responsible for security in the application space?”
 
McClure, who has had a long career in information technology in both the private sector and government, suggests that the other challenging security area in today’s cloud computing environment deals with defining the business side of cloud. “There’s some confusion between security controls and contractual terms that deal with access issues, location issues, and usage, some of which are contract, more than straight security concerns. Getting all of that right—the boundaries, the authentication piece, the contract piece—there’s definitely a lot to pay attention to in the cloud space.”
 
Edwin Elmore, Cloud Computing Business Development Manager with Cisco Systems in Washington, sees the challenge of security in the cloud as one of “taking the physical world and moving it to the virtualized world. When you look at cloud computing, it’s a heavily virtualized environment, so the same controls you have around a physical perimeter in your physical data center, now you have to extend it to the virtualized world.” And that, he says, includes applying the same security protocols when it comes to virtual machines exchanging data with each other.
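
As a simplified picture of what extending perimeter controls into the virtualized world can look like, the sketch below models security-group style rules that decide whether one virtual machine may talk to another. The groups, ports and default-deny rule set are invented for illustration and do not represent any particular product.

# Toy model of virtual "security group" rules between VMs.
# Rules, group names and ports are illustrative, not a real product's API.
ALLOW_RULES = {
    # (source_group, dest_group, port): allowed
    ("web", "app", 8443): True,
    ("app", "db", 5432): True,
}

def may_connect(src_group: str, dst_group: str, port: int) -> bool:
    # Default-deny, just as a firewall at the physical data center edge would be.
    return ALLOW_RULES.get((src_group, dst_group, port), False)

assert may_connect("web", "app", 8443)
assert not may_connect("web", "db", 5432)  # no direct path to the database tier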
 

Halvorsen: DISA Cloud Contract Will Simplify Things

July 18, 2013
By Max Cacas

In the coming months, the Defense Information Systems Agency (DISA) is expected to issue its multiaward contract for cloud computing services.

Terry Halvorsen, chief information officer with the Department of the Navy, believes that the contract will make his life simpler. “Much as we have done with our software enterprise licensing program, where we have bundled up the requirements, and we can go to the marketplace, DISA will be able to bundle the requirements, and we’ll be able to go in, and we will represent a much bigger share of the market, more money on the table, and that will get us much more competitive pricing,” he says in a recent episode of the new AFCEA Answers radio program.

In addition, the military services will have even more options for using cloud computing to perform their missions, according to Henry Fleischman, chief technologist, Federal Cloud Solutions, Hewlett-Packard in Washington.

“It will provide a mechanism to consume cloud services that are certified for government use [and] safe for the end users at different agencies to consume, and I think it will open up an ecosystem of cloud service providers that agencies can have a direct relationship with,” he explains.

Another outcome, says Fleischman, is that it will “drive a level of standardization in the cloud offerings for government, in that, by being a certified DISA cloud provider, it will allow agencies to be more direct about what we want today.”

He also believes that in the long run, a successful multiaward contract for cloud computing services will offer the services more competitive pricing, and at the same time make the contract more valuable to the cloud service providers chosen to offer commercial cloud services to the military.

Dubsky: Continuous Monitoring Aids Cybersecurity Effort

July 5, 2013
By Max Cacas

There’s nothing new about the idea of continuous monitoring in information technology systems. But the ever-growing and changing cyber threat landscape explains new mandates that it become an integral part of all new federal IT systems, according to Lance Dubsky, chief information security officer with the National Geospatial-Intelligence Agency (NGA).

“As the Intelligence Community has begun implementing ICD-503, and the NIST Risk Management Framework, every new system will have a continuous monitoring strategy,” he says during the latest edition of the new radio program AFCEA Answers.

He goes on to say that the big challenge is to also continually review security controls tied to the strategy, to make sure those controls always match the changing security risk.

Al Kinney, director of cybersecurity capabilities with Hewlett-Packard, feels that the tools are available now to readily integrate continuous monitoring into most federal systems.

“You’ll have a solid system based on a standardized process and a standardized kit of tools, so that you have the opportunity to understand fully what’s happening on your networks,” he says.
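
To illustrate the basic mechanics, here is a generic sketch of a continuous monitoring loop; it is not NGA's or Hewlett-Packard's tooling, and the control checks and interval are placeholders. The loop simply re-evaluates a set of security controls on a schedule and flags any that fall out of compliance.

# Generic continuous-monitoring loop; the checks and interval are placeholders.
import time

def patch_level_current() -> bool:
    return True   # stand-in for querying a patch-management system

def no_unexpected_open_ports() -> bool:
    return True   # stand-in for a scan of listening services

CONTROLS = {
    "patching": patch_level_current,
    "open_ports": no_unexpected_open_ports,
}

def monitor(interval_seconds: int = 3600, cycles: int = 1):
    for _ in range(cycles):
        failures = [name for name, check in CONTROLS.items() if not check()]
        if failures:
            print("controls out of compliance:", ", ".join(failures))
        else:
            print("all monitored controls passing")
        time.sleep(interval_seconds)

monitor(interval_seconds=1, cycles=1)  # run one short cycle for demonstration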
Is continuous monitoring a key plank of cybersecurity strategy in your federal agency or company? And: is it making a difference?
Got a question? Got an answer? AFCEA Answers wants to hear from you – join in the conversation!
 
