cloud computing

FedRAMP May Replace Defense Department Cloud Classification Process

April 4, 2014
By George I. Seffers

The Federal Risk and Authorization Management Program (FedRAMP) may ultimately eliminate the need for an information security classification process specific to the U.S. Defense Department, according to Teri Takai, Defense Department chief information officer. FedRAMP seeks to provide a governmentwide, standardized approach to security assessment, authorization and continuous monitoring for cloud products and services.

Space Command Helps Coordinate Network Modernization Efforts

March 18, 2014
By Henry S. Kenyon

The U.S. Air Force Space Command is helping the service put its joint modernization plans into place. As the command responsible for handling cyberspace, communications and information missions, it is the Air Force’s instrument in meeting major Defense Department technology goals, such as establishing the Joint Information Environment.

Partnership Promises to Prevent Cloud Computing Problems

March 1, 2014
By George I. Seffers

Software developed by university researchers accurately predicts cloud computing issues before they occur, enhancing reliability, cutting costs, potentially improving cybersecurity and saving lives on the battlefield.
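
The article does not describe the researchers' method, but as a rough, hedged sketch of what predicting a cloud problem before it occurs can look like, the toy example below extrapolates a monitored metric and flags when it is on course to cross a limit. The function name, readings and thresholds are invented for illustration.

def predict_exhaustion(samples, limit, horizon):
    """Return the number of intervals until 'limit' is crossed, based on a
    simple linear trend over recent readings, or None if no upward trend
    or the crossing falls outside the forecast horizon. Illustrative only."""
    if len(samples) < 2:
        return None
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)  # crude linear trend
    if slope <= 0:
        return None
    steps = (limit - samples[-1]) / slope
    return steps if 0 <= steps <= horizon else None

readings = [61, 64, 68, 71, 75]  # percent memory used, sampled once a minute (made-up data)
eta = predict_exhaustion(readings, limit=95, horizon=30)
if eta is not None:
    print(f"memory pressure expected in about {eta:.0f} minutes")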

Readying for Third-Generation Defense Systems

January 1, 2014
By Paul A. Strassmann

The U.S. Defense Department now is advancing into the third generation of information technologies. This progress is characterized by migration from an emphasis on server-based computing to a concentration on the management of huge amounts of data. It calls for technical innovation and the abandonment of primary dependence on a multiplicity of contractors.

Interoperable data now must be accessed from most Defense Department applications. In the second generation, the department depended on thousands of custom-designed applications, each with its own database. Now, the time has come to view the Defense Department as an integrated enterprise that requires a unified approach. The department must be ready to deal with attackers who have chosen to corrupt widely distributed defense applications as a platform for waging war.

When Google set out to index the world’s information, a task that could not yet be achieved with the technology of the day, the company had to invent ways to manage its global data platform uniformly across millions of servers in more than 30 data centers. The Defense Department has embarked on creating a Joint Information Environment (JIE) that will unify access to logistics, finance, personnel resources, supplies, intelligence, geography and military data. When huge amounts of sensor data are included, the JIE will face challenges two to three orders of magnitude greater in organizing the third generation of computing.

JIE applications will have to reach across thousands of separate databases that support the diverse needs of an interoperable joint service. Third-generation systems will have to support millions of desktops, laptops and mobile devices, responding to potentially billions of inquiries whose answers must be assembled rapidly and securely.
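
As a rough illustration only, and not any actual JIE design, the sketch below shows how a single inquiry might fan out across separate domain repositories and be assembled into one answer. The names DataDomain and JointQueryService and the sample records are hypothetical.

from concurrent.futures import ThreadPoolExecutor

class DataDomain:
    """One domain-specific repository, e.g. logistics or personnel."""
    def __init__(self, name, records):
        self.name = name
        self.records = records  # a list of dicts standing in for a real database

    def query(self, key, value):
        # In a real system this would be an authenticated call to a remote store.
        return [r for r in self.records if r.get(key) == value]

class JointQueryService:
    """Fans one inquiry out to every domain and merges the results."""
    def __init__(self, domains):
        self.domains = domains

    def query(self, key, value):
        with ThreadPoolExecutor() as pool:
            return dict(pool.map(lambda d: (d.name, d.query(key, value)), self.domains))

domains = [
    DataDomain("logistics", [{"unit": "3BCT", "supplies": "low"}]),
    DataDomain("personnel", [{"unit": "3BCT", "strength": 4200}]),
    DataDomain("intelligence", [{"unit": "3BCT", "threat": "moderate"}]),
]
print(JointQueryService(domains).query("unit", "3BCT"))

The point of the pattern is that each domain keeps its own store while a thin joint layer handles the fan-out and assembly, which is roughly the unification problem the JIE faces at vastly larger scale.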

The U.S. Navy Looks to the Cloud

December 5, 2013
By Robert K. Ackerman

The move to the cloud offers great potential for U.S. Navy information technology efforts. Yet other aspects, such as applications and integrated capability sets, must still work their way into the sea service’s cyber realm.

Is Big Data the Way Ahead for Intelligence?

October 1, 2013

Another Overhyped Fad

By Mark M. Lowenthal

Director of National Intelligence Lt. Gen. James R. Clapper, USAF (Ret.), once observed that one of the peculiar behaviors of the intelligence community is to erect totem poles to the latest fad, dance around them until exhaustion sets in, and then congratulate oneself on a job well done.

One of our more recent totem poles is big data. Big data is a byproduct of the wired world we now inhabit. The ability to amass and manipulate large amounts of data on computers offers, to some, tantalizing possibilities for analysis and forecasting that did not exist before. A great deal of discussion has taken place about big data, which in essence means the possibility of gaining new insights and connections from the reams of new data created every day.

Or does it?

A Longtime Tool of the Community

By Lewis Shepherd

What do modern intelligence agencies run on? They are internal combustion engines burning pipelines of data, and the more fuel they burn the better their mileage. Analysts and decision makers are the drivers of these vast engines; but to keep them from hoofing it, we need big data.

The intelligence community necessarily has been a pioneer in big data since inception, as both were conceived during the decade after World War II. The intelligence community and big data science always have been intertwined because of their shared goal: producing and refining information describing the world around us, for important and utilitarian purposes.

Committed to Cloud Computing

October 1, 2013
By George I. Seffers

Recent insider security breaches have put increased scrutiny on the U.S. intelligence community’s cloud computing plans. But cloud computing initiatives remain unchanged as the technology is expected to enhance cybersecurity and provide analysts with easier ways to do their jobs in less time.

With cloud computing, reams of data reside in one location rather than in a variety of repositories. Combining data leads to greater efficiencies for intelligence analysts, but in the view of some, it also means greater vulnerabilities. “There’s a school of thought that says if you co-locate data, you actually expose more of it in case of an insider threat than if you keep it all in separate repositories by data type,” explains Lonny Anderson, National Security Agency (NSA) chief information officer. “The onus is on us to convince the rest of the community, the rest of the Defense Department, that we can secure their information in the cloud in a way that they simply can’t secure it today.”

Anderson acknowledges that the recent insider leaks have increased doubts within the intelligence community about cloud computing, but he expresses confidence that the agency and the intelligence community are on the right path. “I think everybody is a little more nervous and a little more security conscious.

“Everything we’ve learned so far of [NSA leaker Edward Snowden’s] activities has reinforced for us that the path we’re already on is the right path. The lesson we’ve learned is the need to share information but to share selectively, only with those with a need to know,” Anderson says. “The leaks actually reinforced the need to move to the cloud and move there more quickly.”
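
As a minimal sketch of the “share selectively, only with those with a need to know” idea, and not the NSA’s actual mechanism, the example below releases a co-located record only when the requester holds every access tag attached to it. The tags and records are invented for illustration.

RECORDS = [
    {"id": 1, "tags": {"SIGINT", "OP-ALPHA"}, "body": "..."},
    {"id": 2, "tags": {"HUMINT"}, "body": "..."},
]

def releasable(record, requester_tags):
    # Release only when the requester's accesses cover every tag on the record.
    return record["tags"].issubset(requester_tags)

def query(requester_tags):
    return [r["id"] for r in RECORDS if releasable(r, requester_tags)]

print(query({"SIGINT", "OP-ALPHA", "HUMINT"}))  # -> [1, 2]
print(query({"HUMINT"}))                        # -> [2]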

ICITE Builds From the Desktop Up

September 9, 2013
By Robert K. Ackerman

As the intelligence community moves into the cloud, it launches the first step at the desktop level.

Army Signal Expands Its Reach

September 1, 2013
By Robert K. Ackerman

The U.S. Army Signal Corps is expanding the work its personnel conduct while dealing with technology and operational challenges that both help and hinder its efforts. On the surface, Army signal is facing the common dilemma afflicting many other military specialties—it must do more with fewer resources.

NATO Seeks Umbrella Communications

September 1, 2013
By Robert K. Ackerman

NATO is adopting an enterprise approach to networking so it can take advantage of new defense information system capabilities as well as recent developments gleaned from Southwest Asia operations. This approach would allow different countries participating in alliance operations to network their own command, control and communications systems at the onset of an operation.
