Northrop Grumman Systems Corp., McLean, Virginia, has been awarded a $93,000,000 indefinite-delivery/indefinite-quantity contract. This requirement is a follow-on to continue performance of highly specialized technical services in support of product data systems, data management, migration processes and transformation initiatives. Work will be performed at Robins Air Force Base, Georgia, and is expected to be completed by May 9, 2024. This award is the result of a sole-source acquisition. The first order obligates fiscal 2020 operations and maintenance and working capital funds in the amount of $19,847,079. Air Force Sustainment Center, Robins AFB, Georgia, is the contracting activity (FA8571-20-D-0006).
The U.S. Space Force is pursuing a comprehensive data strategy, designed to harness data for strategic advantage. This next-generation data management effort is meant to be more of a precise engineering discipline—rather than an ad hoc organizational effort—and as such, includes the establishment of an associated governing body.
Unlike the other services, the military’s newest service, the U.S. Space Force, is starting with a chief data officer in place on day one of its existence. With an executive in place to guide how the service will administer its information, and with support from its top leadership, the service aims to turn its data into a strategic advantage.
As part of the Department of the Navy’s aggressive effort to improve the data environment in its information infrastructure, the department appointed Tom Sasala, a member of the Senior Executive Service (SES), to oversee its data management, establishing the policies and governance around the department’s data fabric.
The Department of the Navy, or DON, was already on a path to improve its data management when Congress passed the Open Government Data Act in January. The measure required cabinet-level agencies, including the military departments, to create a chief data officer position.
The U.S. Department of Defense is seeing the nation’s adversaries use capabilities better than the American military, but change is underway. In particular, the Army recognizes that it must dust off some of its aging procurement processes and leverage commercial technology to regain an advantage over its enemies, said Lt. Gen. Bruce Crawford, USA, the Army’s chief information officer/G-6, at the MILCOM 2017 conference in Baltimore.
Data stored “in silos” is not providing a fluid, agile stream of information that the DOD needs to perform everyday missions. While data may successfully be generated, getting the needed information “in the right hands at the right time” is a challenge the DOD is facing.
We’ve all seen those scenes in spy movies where intelligence analysts scrutinize a photograph, looking at the buildings, the plants, vehicles or visible wildlife to deduce where the picture was taken because, of course, doing so is critical to discovering the bad guy’s whereabouts and saving the planet. In real-life, the Intelligence Advanced Research Projects Activity (IARPA) Finder program, detailed in my August article "Tag Teaming Big Data," is designed to help analysts locate non-geotagged imagery, whether photographs or video.
U.S. Defense Department data will be invading the commercial world as the department moves its unclassified information out of its own hands. Terry Halvorsen, acting Defense Department chief information officer, described the upcoming move at the Wednesday luncheon of the AFCEA International Cyber Symposium, being held June 24-25 in Baltimore.
Researchers are working on a computer that could soon do the thinking for humans. Computers today, while advanced, still mostly perform calculating functions using a central processing unit and memory that stores both a program and data, drawing instructions from the program and data from memory to function. Sandia National Laboratories researchers are developing “neuro-inspired” computing systems that work basically like human brains, detecting patterns and anomalies to arrive at computing solutions.
The late Vice Adm. Arthur K. Cebrowski, USN (Ret.), looks over my shoulder as I work in my home office. His picture graced the May 2003 cover of SIGNAL Magazine, highlighting an article Clarence A. Robinson Jr. wrote based on an interview with the admiral. I was lucky enough to escort SIGNAL’s freelance photographer to take the photo of Adm. Cebrowski when he led the charge for change as the director of the U.S. Defense Department’s Office of Force Transformation. I received a cover photo plaque that hangs in my home office for my effort, though it really wasn’t necessary.
Following the terrorist attacks on September 11, 2001, government agencies came under widespread criticism for failing to share information and "connect the dots." By contrast, law enforcement agencies were almost universally praised following the Boston Marathon bombing and the shooting at the Navy Yard in Washington, D.C., both of which took place last year, panelists pointed out at the AFCEA Homeland Security Conference in Washington, D.C., on Monday.
The National Weather Service is the granddaddy of open source data, according to Adrian Gardner, chief information officer, Federal Emergency Management Agency (FEMA). And, the National Oceanic and Atmospheric Administration (NOAA) was "into big data before big data was cool," added David McClure, a data asset portfolio analyst within the NOAA Office of the Chief Information Officer. The two officials made their comments during a panel on big data analytics at the AFCEA Homeland Security Conference in Washington, D.C.
North Carolina (NC) State University has announced a new partnership with the National Security Agency (NSA) to create the Laboratory for Analytic Sciences (LAS) on the university’s Centennial Campus. The lab will bring together personnel from government, academia and industry to address the most challenging big data problems and will be a cornerstone of the emerging advanced data innovation hub at NC State.
Gray Research Inc. of Huntsville, Ala., is being awarded an indefinite-delivery/indefinite-quantity (IDIQ) contract modification under contract W9113M-05-D-0003. The total ceiling award value is increased by $21,886,024 from $222,609,913 to $244,495,937. Under this IDIQ contract, the contractor will continue performing work under its previously competitively awarded cost-plus-award-fee contract, W9113M-05-D-0003, providing data management services for the Missile Defense Data Center Program. The Missile Defense Agency, Huntsville, Ala., is the contracting activity.
Top information technology officials from a variety of government agencies identified cloud computing, mobile devices and edge technologies as the technologies that will be critical for accomplishing their missions in the future.
Luke McCormack, chief information officer, Justice Department, cited cloud-as-a-service as vital to the future. He urged industry to continue to push the barriers of stack computing, and he mentioned edge technology as an emerging technology. “Edge is going to be really critical to perform missions,” he said. He cited the Google Glass project as an indicator of what the future will bring.
Claraview, a division of Teradata, Reston, Virginia, was awarded an indefinite-delivery/indefinite-quantity contract with a not-to-exceed ceiling price of $27,016,520 for continuation of system life cycle support services for data management/enterprise data warehouse/business intelligence environments. The Defense Information Technology Contracting Organization, Scott Air Force Base, Illinois, is the contracting activity.
Jacobs Technologies Incorporated, Tampa, Florida, is being awarded a $139 million contract for information technology service management in support of U.S. Special Operations Command. The contractor will provide information technology support services to assist the command's information technology management office with the management of the enterprise networks, data management, distributed computing, specialty services, application management, communications, evolutionary technology insertion and other aspects of the command information enterprise.
As organizations migrate more data into public clouds, demands for a different type of security are emerging. A specialized option is available now for Amazon Web Services that aims to mitigate threats more quickly by finding them faster and suggesting methods of remediation.
Known as the Evident Security Platform for Amazon Web Services (ESP for AWS), the technology offers a solution expressly designed for the Amazon environment. It has a rapid deployment of five minutes or less and gives a dashboard view of identified threats. In the first week it launched, 50 companies of various sizes signed on for the platform, including several large, multinational corporations.
Where human analysis might fail in the intelligence community, technological solutions are at the ready to fill the void. Companies are ginning up software programs that can prove to be key for intelligence analysts as they track the bad guys, so to speak—be they insider threats or an outside enemy.
The amount of data produced in the increasingly connected and virtual world makes it difficult for human beings to scour, catalog and process the mounting information and produce actionable intelligence. So industry is devising technological workarounds or complementary programs to ease the workload and make analysts’ efforts more effective.
Mining big data for salient information points presents a plethora of challenges, but in Europe a different issue has emerged as a concern. Regulations prohibiting researchers and others from searching through the data in certain documents are putting countries on the continent at a competitive disadvantage in a number of fields, studies are revealing. With several economies there already in dire straits, the legal encumbrances could add to difficulties in improving financial situations.
Two closely related science and technology programs aim to improve image location and search capabilities, saving intelligence analysts significant time and effort.
U.S. intelligence analysts often must wade through enormous amounts of imagery—both photographs and videos—to uncover the exact information needed. To make matters worse, data often does not contain geolocation tags, which indicate where the images were taken.
Technology innovations, new roles and expanding missions are shaping the move toward big data in the National Geospatial-Intelligence Agency. A mix of tradecraft and technology is ensuing as the agency evolves from an organization that always has worked with voluminous imagery files to one in which big data represents a goal that promises to change many aspects of intelligence.
U.S. Army officials envision a future in which ground and air platforms share data and where soldiers at a remote forward-operating base easily can access information from any sensor in the area, including national satellites or reconnaissance aircraft flying overhead. To achieve this big data vision, the service has initiated three pilot projects designed to provide Google-style access in a tactical environment to the lowest echelon without overwhelming soldiers with unnecessary data.
Virtualization and cloud implementation are critical components of information technology planning, acquisition and management going forward. Cloud implementations are important to security, efficiency, effectiveness, cost savings and more pervasive information sharing, particularly among enterprises. Cloud architectures also are extremely important for more effective use of mobile technologies. Mobility increasingly is important, particularly for the military, which needs a full range of information technology services while on the move. Yet increased movement to the cloud, along with traditional uses of spectrum, is putting unprecedented demands on every part of the spectrum.
The Defense Department drive toward its Joint Information Environment is picking up speed as it progresses toward its goal of assimilating military networks across the warfighting realm. Individual services are developing solutions, some of which are targeted for their own requirements, that are being applied to the overarching goal of linking the entire defense environment.
Early successes in Europe have advanced Joint Information Environment (JIE) efforts elsewhere, including the continental United States. Some activities have been accelerated as a result of lessons learned, and they have been implemented ahead of schedule in regions not slated to receive them for months or even years.
Homeland Security Conference 2014 Online Show Daily, Day 2
It is not surprising that cybersecurity would dominate the discussion on the second day of the AFCEA Homeland Security Conference in Washington, D.C. But the depth and breadth and variety of topics surrounding cybersecurity and information protection in all its forms indicates the degree to which the information security mission has engulfed every department and agency at all levels of government.
Homeland Security Conference Show Daily, Day 1
Information sharing and interoperability have come a long way since the terrorist attacks of September 11, 2001, but challenges still remain, agreed speakers and panelists on the first day of the AFCEA Homeland Security Conference in Washington, D.C.
The emergence of big data combined with the revolution in sensor technology is having a synergistic effect that promises a boom in both realms. The ability to fuse sensor data is spurring the growth of large databases that amass more information than previously envisioned. Similarly, the growth of big data capabilities is spawning new sensor technologies and applications that will feed databases’ ever-increasing and diverse types of information.
This rarely happens, but for 2014, defense and technology analysts are in agreement that big data and cybersecurity are the two drivers in planning and investment for information technology, both in government and in industry. Most everything else will be enabling these two key capabilities. While much attention has been focused on the threats and work being done globally on cybersecurity, I want to focus on big data.
The increasing presence of news sources on the Internet offers an unprecedented opportunity to access open-source intelligence for a variety of purposes. Researchers from several U.S. universities have collaborated to take advantage of these resources, creating a big data collection and distribution process applicable to disciplines ranging from social research to national security.
The U.S. Defense Department now is advancing into the third generation of information technologies. This progress is characterized by migration from an emphasis on server-based computing to a concentration on the management of huge amounts of data. It calls for technical innovation and the abandonment of primary dependence on a multiplicity of contractors.
The move to the cloud that is gripping all elements of government and industry offers great potential for the U.S. Navy, according to its chief information officer. Terry Halvorsen told the breakfast audience on the final day of TechNet Asia-Pacific 2013 in Honolulu, Hawaii, that the move to the cloud is one of the best areas for gaining effect in Navy information technology.
However, other elements must fall into place for this move to be successful. Halvorsen said it must be coupled “with how you look at and structure applications,” adding that the Navy has too many applications.
U.S. Army researchers are developing a software program that will provide signal corps officers with an improved common operating picture of the network, enhance the ability to manage the plethora of electronic systems popping up on the modern battlefield, advance information sharing capabilities and allow warfighters to make more informed and more timely decisions. In short, the system will assist in planning, building, monitoring and defending the network.
Another Overhyped Fad
By Mark M. Lowenthal
Director of National Intelligence Lt. Gen. James R. Clapper, USAF (Ret.), once observed that one of the peculiar behaviors of the intelligence community is to erect totem poles to the latest fad, dance around them until exhaustion sets in, and then congratulate oneself on a job well done.
In the next few years, usernames and passwords could gradually fade from popular use as a way to conduct business online. A public/private coalition is working on a new policy and technical framework for identity authentication that could make online transactions less dependent on these increasingly compromised identity management tools. A second round of federal grants from the group, expected this fall, will lead to continued work on what is expected to become a private sector-operated identity management industry.
Scientists at the U.S. Defense Department’s top research and development agency are seeking the best new ideas to provide a larger-scale mobile network to support an increasing array of bandwidth-hungry mobile computing devices for warfighters.
The Defense Advanced Research Projects Agency (DARPA) has issued a Request for Information (RFI) for new technical approaches that would expand the number and capacity of Mobile Ad Hoc Network (MANET) nodes available in the field.
The U.S. Navy’s Next-Generation Enterprise Network will introduce a host of new capabilities for users when it is implemented. These improvements will become apparent over time as the system’s flexibility allows for technology upgrades and operational innovation on the part of its users.
The network’s overall goals remain the same despite a protest over the contract award. However the protest is resolved, the program is designed to provide networking at less cost and with more flexibility to adjust for changes that emerge as a result of operational demand or technology improvements. These new capabilities could range from greater use of mobile technologies to virtual desktops dominating user environments.
Recent government initiatives to trim the number of data centers in the federal government have been beset by unforeseen delays in meeting target goals. Key among these challenges is the realization that the number of data centers is actually much larger than originally thought. Testifying before the House Committee on Oversight and Government Reform on July 25, the heads of several federal oversight agencies discussed why ongoing efforts have faltered and disagreed with the committee’s interpretation of the situation.
Rear Adm. Robert Day Jr., USCG, assistant U.S. Coast Guard commandant for command, control, communications and information technology, sees the Joint Information Environment as an opportunity to resolve some of the most pressing information technology problems in the years to come as he faces a future with more challenges and fewer resources. He says a military-wide common operating environment will establish “enterprisewide mandates that programs cannot ignore.”
From handheld to the cloud, new technologies are driving new approaches to data assurance.
The increasing use of readily available and inexpensive commercial technologies by the military is changing the way the Defense Information Systems Agency provides information assurance. As these technologies are integrated into the Defense Department information infrastructure, the agency is adjusting its approaches to providing security for its networks and the data that reside on them.
The U.S. Army Research Laboratory (ARL) at Aberdeen Proving Ground, Maryland, has unveiled two new supercomputers that are among the fastest and most powerful devices of their kind. The devices are part of a recently opened supercomputing center that is the new locus of the service’s use of high-speed computing not only for basic scientific research and development, but also to solve basic warfighter needs using the latest available technologies.
The Air Force encounters turbulence of the digital kind when it underestimates the complexity of moving the service to a single network.
The U.S. Air Force’s migration to a new enterprise network known as AFNET will be at least two years late in completion because the project turned out to be more complicated than planners anticipated.
The U.S. intelligence community will be relying to a greater degree on commercial technologies to meet its current and future requirements, including some that formerly were the purview of government laboratories. And, because much of the community’s research is applied research, it will select its budgeting priorities based in part on how well the commercial sector can fill in some technology gaps on its own.
The revision reflects efforts of government-wide joint task force.
Managers of information technology systems for the federal government have new mandatory guidance on security and privacy controls used to manage and protect those systems from cyber attack.
Defense Department will decide on a path forward within 30 days.
Secretary of Defense Chuck Hagel told members of Congress on April 16 that he is personally committed to solving the database interoperability problems between the Defense Department (DOD) and the Department of Veterans Affairs (VA) that have left thousands of veterans waiting months while benefits claims are processed.
An Army research team develops a device that could assist warfighters' decision making.
Homeland Security Conference 2013 Show Daily, Day 3
Although many in government are moving as quickly as possible to adopt new technologies, such as cloud computing and mobile devices, individual agencies still face cultural challenges that sometimes prevent them from moving forward, according to officials speaking as part of the Chief Information Officer Council at the AFCEA Homeland Security conference in Washington, D.C.
When it comes to popular smartphones and tablets, security can be a many-layered and necessary endeavor.
The growing use of advanced mobile devices, coupled with the increase in wireless broadband speed, is fueling demand by employees to bring their own devices to the job. This situation has opened a new set of security challenges for information technology staff, especially when it comes to the use of apps.
To meet the challenge of implementing big data, a new international scientific organization is forming to facilitate the sharing of research data and speed the pace of innovation. The group, called the Research Data Alliance, will comprise some of the top computer experts from around the world, representing all scientific disciplines.
Managing the staggering and constantly growing amount of information that composes big data is essential to the future of innovation. The U.S. delegation to the alliance’s first plenary session, being held next month in Switzerland, is led by Francine Berman, a noted U.S. computer scientist, with backing from the National Science Foundation (NSF).
A multi-agency big data initiative offers an array of national advantages.
U.S. government agencies will award a flurry of contracts in the coming months under the Big Data Research and Development Initiative, a massive undertaking involving multiple agencies and $200 million in commitments. The initiative is designed to unleash the power of the extensive amounts of data generated on a daily basis. The ultimate benefit, experts say, could transform scientific research, lead to the development of new commercial technologies, boost the economy and improve education, all of which makes the United States more competitive with other nations and enhances national security.