data management

NIST Releases Latest Catalog of Security and Privacy Controls for Federal Systems

May 3, 2013
By Max Cacas

A government-wide task force led by NIST has released the latest catalog of security and privacy controls for federal information systems, including new thinking on addressing insider threats with measures that go beyond technology.

Secretary Hagel Commits to Resolving Medical Record Interoperability Issues

April 16, 2013
By George I. Seffers

Defense Department will decide on a path forward within 30 days.

Secretary of Defense Chuck Hagel told members of Congress on April 16 that he is personally committed to solving the database interoperability problems between the Defense Department (DOD) and the Department of Veterans Affairs (VA) that have left thousands of veterans waiting months while benefits claims are processed.

According to VA officials, the agency has been breaking records in the number of claims processed, yet it now takes an average of 273 days to process a claim. The VA has fallen increasingly behind as veterans return from Iraq and Afghanistan, and that backlog is expected to increase as the drawdown in Afghanistan continues.

Part of the issue is that the VA uses an electronic processing system known as the Veterans Health Information Systems and Technology Architecture (VistA), while the Defense Department uses the Armed Forces Health Longitudinal Technology Application (AHLTA) for processing medical records.

Introduced in 1996, VistA offers an automated environment that supports day-to-day operations at local VA health care facilities. It is built on a client-server architecture, which ties together workstations and personal computers with graphical user interfaces at various VA facilities, as well as software developed by local medical facility staff. The system also includes the links that allow commercial off-the-shelf software and products to be used with existing and future technologies.

Change Is Challenge

March 1, 2013
By George I. Seffers

Homeland Security Conference 2013 Show Daily, Day 3

Although many in government are moving as quickly as possible to adopt new technologies, such as cloud computing and mobile devices, individual agencies still face cultural challenges that sometimes prevent them from moving forward, according to officials speaking as part of the Chief Information Officer Council at the AFCEA Homeland Security conference in Washington, D.C.

Richard Spires, chief information officer for the Homeland Security Department (DHS), reminded the audience that DHS was created by joining many disparate agencies, all of which owned individual networks. While the department is working to integrate the information technology infrastructure and consolidate data centers, officials still meet some resistance at the individual agency level. “There’s still a lot of duplication, and in some ways duplication is holding us back. I’d like to say we’re making progress, but I’ll let others grade us on that,” Spires said.

Other officials agreed that they meet resistance as well. Robert Carey, deputy chief information officer for the Defense Department, cited the difficulty of changing culture and said a constrained budget environment can be a powerful catalyst for action in moving toward a more centralized environment.

Cybersecurity itself can present challenges, according to Luke McCormack, chief information officer for the Justice Department. “Cyber’s hard. The individual pieces of that can be very difficult,” he said. He also cited the need to bring people together on emerging technologies, such as cloud-as-a-service, as a challenging issue.

A New Chip Thinks Like a Brain

March 1, 2013
By Max Cacas

An Army research team develops a device that could assist warfighters' decision making.

A U.S. Army scientist and his colleagues, working in the nascent field of neural computing and quantum physics, have earned a patent for a powerful quantum neural dynamics computer chip. The device, which has been tested in a laboratory, and the advanced mathematical computations that make it work may one day lead to systems that could help warfighters sift through huge datasets and make important tactical decisions in the field. The chip also holds promise for civilian applications requiring the rapid analysis of big data, and it could represent a bridge to the next generation of computing.

“The patent covers different ways to make computer chips,” states Ron Meyers, a computer scientist with the Army Research Laboratory (ARL) who is the principal investigator for the neural chip project. “We developed a type of mathematics that allows for quick function-changing and also emulates some of the processes of neural intelligence that the human brain uses. We combined those together, and we made a new type of computer chip that incorporates those functions. It’s qualitatively different. It doesn’t do the same kinds of computations as traditional computer chips.”

The chip, and its underlying operating system based on newly developed mathematical formulas, will make possible faster and more powerful computers. “We’re talking about the ability to compute that exceeds exponentially millions of times greater than any of the computers that exist today or are on the drawing boards using conventional approaches,” Meyers explains.

Cloud Industry Group Issues Mobile Computing Guidelines

March 1, 2013
By Max Cacas

When it comes to popular smartphones and tablets, security can be a many-layered and necessary endeavor.

The growing use of advanced mobile devices, coupled with the increase in wireless broadband speed, is fueling demand by employees to bring their own devices to the job. This situation has opened a new set of security challenges for information technology staff, especially when it comes to the use of apps.

As the popularity and capability of mobile devices expands, standards are necessary to ensure that personal devices can function securely on enterprise networks. To address this need, the Cloud Security Alliance (CSA) organized its Mobile Working Group last year. The group recently released guidance to members on how enterprise administrators can successfully integrate smartphones and tablets into their work environment. The CSA is a not-for-profit organization of industry representatives focused on information assurance in the cloud computing industry.
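
The flavor of such guidance can be sketched as a simple device-posture check before a personal device joins the enterprise network. The criteria below are generic mobile-security hygiene invented here for illustration; they are not the CSA Mobile Working Group's actual recommendations.

```python
# Hypothetical BYOD admission check -- illustrative only, not the
# CSA Mobile Working Group's actual guidance.

MIN_OS_VERSION = (6, 0)  # assumed minimum platform version

def device_compliant(device: dict) -> bool:
    """Admit a personal device to the enterprise network only if it
    meets a baseline security posture."""
    return (
        device["os_version"] >= MIN_OS_VERSION  # patched platform
        and device["storage_encrypted"]         # data at rest protected
        and device["passcode_set"]              # screen lock enforced
        and not device["jailbroken"]            # unmodified firmware
    )

phone = {
    "os_version": (6, 1),
    "storage_encrypted": True,
    "passcode_set": True,
    "jailbroken": False,
}
print(device_compliant(phone))  # True -> device may join the network
```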

Top Information Technology Officials Peer into the Future

February 28, 2013
By George I. Seffers

Top information technology officials from a variety of government agencies identified cloud computing, mobile devices and edge technologies as critical to accomplishing their missions in the future.

Luke McCormack, chief information officer, Justice Department, cited cloud-as-a-service as vital to the future. He urged industry to continue to push the barriers of stack computing, and he mentioned edge technology as an emerging technology. “Edge is going to be really critical to perform missions,” he said. He cited the Google Glass project as an indicator of what the future will bring.

Mobility could be the future of training and simulation, suggested Sandy Peavy, chief information officer for the Federal Law Enforcement Training Center within the Department of Homeland Security (DHS). She revealed that her office is putting together a pilot program introducing tablet computers into the training environment, and ideally, she would like trainees to be able to access simulation on the mobile device of their choice. Peavy also reported that the Bureau of Alcohol, Tobacco, Firearms and Explosives is providing special agents with iPhones and experimenting with other devices. “If I’m going to be able to provide just-in-time training, then mobile technology is the key.”

Richard Spires, chief information officer for the Department of Homeland Security, also cited mobility as a key future trend and brought up metadata tagging, saying that it helps in understanding the data itself and in establishing rules for who sees what information. Metadata tagging is especially important as the department grapples with privacy concerns.
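
As a rough sketch of how metadata tags can drive "who sees what" rules, consider filtering records against a viewer's clearances. The tag names, records and policy below are hypothetical, not the department's actual scheme.

```python
# Hypothetical tag-based visibility filter -- the tags and records are
# invented for illustration, not DHS's actual metadata scheme.

records = [
    {"id": 1, "summary": "travel manifest",  "tags": {"law_enforcement"}},
    {"id": 2, "summary": "visa application", "tags": {"privacy_sensitive"}},
    {"id": 3, "summary": "press release",    "tags": set()},
]

def visible_to(clearances: set, records: list) -> list:
    """Return only the records whose every tag the viewer is cleared for."""
    return [r for r in records if r["tags"] <= clearances]

print(visible_to({"law_enforcement"}, records))  # records 1 and 3
print(visible_to(set(), records))                # record 3 only
```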

Researchers Organize to Share Data, Speed Innovation

February 1, 2013
By Max Cacas

To meet the challenge of implementing big data, a new international scientific organization is forming to facilitate the sharing of research data and speed the pace of innovation. The group, called the Research Data Alliance, will comprise some of the top computer experts from around the world, representing all scientific disciplines.

Managing the staggering and constantly growing amount of information that makes up big data is essential to the future of innovation. The U.S. delegation to the alliance’s first plenary session, being held next month in Switzerland, is led by Francine Berman, a noted U.S. computer scientist, with backing from the National Science Foundation (NSF).

Meeting the challenges of how to harness big data is what makes organizing and starting the Research Data Alliance (RDA) so exciting, Berman says. “It has a very specific niche that is very complementary to a wide variety of activities. In the Research Data Alliance, what we’re aiming to do is create really tangible outcomes that drive data sharing, open access, research data sharing and exchange,” all of which, she adds, are vital to data-driven innovation in the academic, public and private sectors. The goal of the RDA is to build what she calls “coordinated pieces of infrastructure” that make it easier and more reasonable for people to share, exchange and discover data.

“It’s really hard to imagine forward innovation without getting a handle around the data issues,” emphasizes Berman, the U.S. leader of the RDA Council, who, along with colleagues from Australia and the European Union, is working to organize the alliance. Ross Wilkinson, executive director of the Australian National Data Service, and John Wood, secretary-general of the Association of Commonwealth Universities in London, are the other members of the council.

U.S. Government Bets Big on Data

January 1, 2013
By George I. Seffers

The Texas Advanced Computing Center has supported research to develop next-generation hurricane models. Environmental science and technology is one area of research that could benefit from big data initiatives.

A multi-agency big data initiative offers an array of national advantages.

U.S. government agencies will award a flurry of contracts in the coming months under the Big Data Research and Development Initiative, a massive undertaking involving multiple agencies and $200 million in commitments. The initiative is designed to unleash the power of the extensive amounts of data generated on a daily basis. The ultimate benefit, experts say, could transform scientific research, lead to the development of new commercial technologies, boost the economy and improve education, all of which would make the United States more competitive with other nations and enhance national security.

Big data is defined as datasets too large for typical database software tools to capture, store, manage and analyze. Experts estimate that in 2013, 3.6 zettabytes of data will be created, and the amount doubles every two years. A zettabyte is equal to 1 billion terabytes, and a terabyte is equal to 1 trillion bytes.
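
A quick back-of-the-envelope check of those units and the growth estimate (the 3.6-zettabyte figure and the two-year doubling are the estimates cited above):

```python
# Decimal storage units referenced above.
TERABYTE = 10**12   # 1 trillion bytes
ZETTABYTE = 10**21  # 1 billion terabytes

assert ZETTABYTE // TERABYTE == 10**9  # a zettabyte is a billion terabytes

# Projecting the cited estimate: 3.6 ZB created in 2013, doubling every two years.
volume = 3.6 * ZETTABYTE
for year in (2013, 2015, 2017, 2019):
    print(year, round(volume / ZETTABYTE, 1), "ZB")
    volume *= 2
```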

When the initiative was announced March 29, 2012, John Holdren, assistant to the president and director of the White House Office of Science and Technology Policy, compared it to government investments in information technology that led to advances in supercomputing and the creation of the Internet. The initiative promises to transform the ability to use big data for scientific discovery, environmental and biomedical research, education and national security, Holdren says.

Currently, much of the data generated is available only to a select few. “Data are sitting out there in labs—in tens of thousands of labs across the country, and only the person who developed the database in that lab can actually access the data,” says Suzanne Iacono, deputy assistant director for the National Science Foundation (NSF) Directorate for Computer and Information Science and Engineering.

Too Much Information Imperils Big Data

January 1, 2013
By Rita Boland

It causes problems from the battlefield to the doctor’s office, but leaders are fueling the competitive fire to find an answer. 

Government experts on big data are taking a lesson from the commercial sector to introduce a novel means of finding solutions to some of their most daunting challenges. Using an open innovation approach, thought leaders believe they can generate new ideas while also reducing costs, speeding processes and soliciting responses from outside the usual cast of characters.

The National Science Foundation (NSF), the U.S. Department of Energy and NASA ran what they called the Ideation Challenge, focused on big data, between October and December of last year. The challenge comprised three different contests. Organizers deliberately scheduled the events close together, and officials incorporated results from earlier competitions into the postings for later ones to help generate buzz. They hoped that participants would see what others had accomplished and want to build or improve on those results.

Organizers focused the contests on the fields of earth science, health and energy. The series of competitions was hosted through the NASA Tournament Lab, which is a collaboration among the space agency, Harvard University and TopCoder, a competitive community for software development and digital creation with more than 400,000 members. Competitors submitted their ideas through TopCoder’s technologies.

Reading, Writing and Big Data Basics

January 1, 2013
By Max Cacas

Ben Connors is director of Big Data University and worldwide head of alliances with Jaspersoft Incorporated.

An industry-supported online school provides a good grounding in the science and application of very large datasets.

A virtual school, developed by a team of leading software and hardware companies, is providing readily accessible education in the use of large information datasets. The classes range from entry-level sessions on the essentials of big data for managers to practical instruction for veteran programmers who are accustomed to managing more traditional relational databases.

The mission of Big Data University is to provide training and broaden the expertise of the big data community, explains Ben Connors, director of Big Data University and worldwide head of alliances with Jaspersoft Incorporated, San Francisco. The uses of big data are expanding, whether for improving the health of children, facilitating the search for clean sources of energy or analyzing intelligence data from unmanned aerial vehicles. As a result, managers are realizing the potential that may be hidden within large information files whose size is measured in petabytes and exabytes, Connors explains.
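
The shift such courses address can be illustrated with a toy contrast between row-at-a-time relational thinking and the partition-and-aggregate style used on petabyte-scale data. This sketch is purely illustrative and is not drawn from the school's curriculum.

```python
# Toy "map" and "reduce" over sharded data -- the kind of aggregation
# that replaces a single SQL GROUP BY when data outgrows one database.
from collections import Counter
from functools import reduce

partitions = [
    ["sensor", "uav", "sensor"],          # data shard 1
    ["uav", "energy", "sensor", "uav"],   # data shard 2
]

# "Map": count terms independently within each shard (parallelizable).
partial_counts = [Counter(shard) for shard in partitions]

# "Reduce": merge the per-shard counts into one global result.
total = reduce(lambda a, b: a + b, partial_counts)
print(total)  # Counter({'sensor': 3, 'uav': 3, 'energy': 1})
```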
