If the pursuit of DNA-based data storage is a race, it is probably more of a long, arduous, challenge-laden Tough Mudder than a quick, straightforward 50-yard dash. Or it may be a tortoise and hare situation with data growing at an extraordinary pace while science moves steadily along in hopes of gaining the lead.
U.S. Army officials expect in the coming weeks or months to release a data strategy that will be closely aligned with the service’s existing cloud strategy, and they are also building an enterprise cloud office, according to Gregory Garcia, the Army’s deputy chief information officer/G-6 and chief data officer.
Garcia made the remarks during an address and fireside chat on the second morning of the AFCEA TechNet Augusta conference in Augusta, Georgia.
“We have a data strategy that’s going to be processed in the next weeks and months. That’s going to get after making sure data is visible, accessible, understandable and interoperable,” he said.
“The whole business of being a CTO has changed,” said Yuvi Kochar, managing director, technology and operations, CAQH, a nonprofit alliance creating shared initiatives to streamline the business of healthcare.
During his keynote address at the AFCEA-GMU C4I and Cyber Center Symposium, the former chief technology officer (CTO) of The Washington Post discussed how he first became a CTO in 2000 for a small startup in Boston. “My first job was all about building technology and operating it. And that was good enough,” Kochar said.
Over time, though, he’s seen the job transform into a more business-centric role. “Technology is taking more and more of a backseat,” he related.
The Defense Security Service (DSS) and Defense Information Systems Agency (DISA) have awarded nearly $75 million to Perspecta Enterprise Solutions LLC of Herndon, Virginia, to help reform and modernize the security clearance personnel vetting processes and develop the National Background Investigation Service (NBIS) information technology system.
Lt. Gen. Bruce Crawford, USA, chief information officer/G-6, U.S. Army, suggests the possibility of an Internet of Strategic Things in addition to the Internet of Tactical Things.
“We’ve had some really good discussions about the Internet of Things. That was a thing a couple of years ago. And then we started talking about the Internet of Tactical Things. I think what’s on the horizon is more of a discussion of the Internet of Strategic Things,” Gen. Crawford told the audience on the second day of the AFCEA TechNet Cyber 2019 conference in Baltimore.
Trident Juncture 2018, a large-scale NATO military exercise, wrapped up late last year. But in the weeks since, the alliance has been doing something it has never done before: using big data science to help inform lessons learned from the exercise.
The National Science Foundation’s Directorate for Computer and Information Science and Engineering is working to create a big data ecosystem. As part of that effort, the NSF, as it is known, is expanding the National Network of Big Data Regional Innovation Hubs, first created three years ago. The hubs, with one location for each U.S. Census region—the Midwest, Northeast, South and West—grew out of the need to aid the development of big data research and to help solve complex societal problems. The hubs are having a positive impact on the growth of machine learning, increasing the access to data, methods, networks and expertise, experts say.
Burgeoning computer capabilities often are unreliable, or brittle, at first. Capabilities that work successfully in one instance may fail miserably when applied to another area. At the moment, machine learning is no different, experts say, and the government and private industry are endeavoring to get past the limitations to improve its use.
The U.S. Navy is in the nascent stages of a plan to revolutionize readiness through the use of artificial intelligence, machine learning and data analytics. The plan also may include the establishment of two new offices: a chief readiness office and an analytics office.
The U.S. Coast Guard is pursuing digital solutions to support its unique set of military, law enforcement, humanitarian, regulatory and diplomatic responsibilities. It is no small feat to provide information technology to its workforce of 87,570, as well as to its cutters, boats, and aircraft that move along the coastline and inland waterways protecting the United States.
The U.S. Defense Department lags the hype cycle for artificial intelligence, machine/deep learning and implementations like natural language processing by years. It needs to uncover the root causes contributing to this delay and create winning strategies to overcome institutional obstacles to get ahead of industrial partners and adversaries who are further along the adoption curve.
Possessing technology is neither deterministic nor decisive when waging war. The effective employment and deliberate application of technologies to enhance warfighting capabilities confers an advantage over an adversary when suitably coupled with offensive and defensive tactics.
Later this month a team of researchers plans to release an online wargame that will use machine learning and data analytics to study nuclear conflict escalation and the strategic stability of nations in an artificial world.
Artificial intelligence and machine learning are two of the many technologies that will change the way the military operates, according to a panel of experts. However, despite the revolutionary innovations that lie ahead, humans always will need to be the controlling factor in any operation.
These experts offered their views of the future on the second day of AFCEA’s TechNet Asia-Pacific 2018, held November 14-16 in Honolulu. In a panel sponsored by the Young AFCEANs, the five experts presented a younger generation’s perspective on the advantages and pitfalls of a data-centric battlespace.
Implementing a new system can be an exciting time, but the nagging questions and doubts about the fate of data you’ve literally spent years collecting, organizing and storing can dampen this excitement.
This legacy data often comes from a variety of sources in different formats maintained by a succession of people. Somehow, all the data must converge in a uniform fashion so it can be put to use in the new solution. Yes, it is hard work, and no, it is not quick. Fortunately, this scrubbing and normalization does not have to be a chaotic process replete with multiple failures and rework.
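As a rough illustration of the convergence step described above (not a method from the article — the field names and sources here are invented), mapping legacy records from different sources into one uniform schema might look like this:

```python
# Hypothetical sketch: converging legacy records from two sources,
# each with its own field names and formatting quirks, into one schema.

def normalize(record):
    """Map a legacy record, whatever its source format, to a common schema."""
    return {
        # Source A uses "full_name", source B uses "name"; both get trimmed
        # and title-cased so the new system sees one consistent format.
        "name": (record.get("name") or record.get("full_name") or "").strip().title(),
        # Likewise "e_mail" vs. "email"; lowercased for uniformity.
        "email": (record.get("email") or record.get("e_mail") or "").strip().lower(),
    }

legacy = [
    {"full_name": " ada lovelace ", "e_mail": "ADA@EXAMPLE.ORG"},  # source A
    {"name": "Alan Turing", "email": "alan@example.org"},          # source B
]

clean = [normalize(r) for r in legacy]
```

Real migrations, of course, involve far messier mappings, validation rules and exception handling, but the pattern — one normalization function every record passes through — is what keeps the process from becoming chaotic.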
Artificial intelligence can be surprisingly fragile. This is especially true in cybersecurity, where AI is touted as the solution to our chronic staffing shortage.
It seems logical. Cybersecurity is awash in data, as our sensors pump facts into our data lakes at staggering rates, while wily adversaries have learned how to hide in plain sight. We have to filter the signal from all that noise. Security has the trifecta of too few people, too much data and a need to find things in that vast data lake. This sounds ideal for AI.
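The “filter the signal from all that noise” problem can be sketched, purely as an illustration (the data and threshold below are invented, and real security analytics are far more sophisticated), with a simple statistical filter that flags hours whose event volume deviates sharply from the norm:

```python
import statistics

def flag_anomalies(counts, z_threshold=2.0):
    """Return indices of event counts whose z-score exceeds the threshold."""
    mean = statistics.mean(counts)
    stdev = statistics.stdev(counts)  # sample standard deviation
    return [i for i, c in enumerate(counts)
            if stdev > 0 and abs(c - mean) / stdev > z_threshold]

# Invented data: steady hourly sensor traffic with one suspicious spike.
hourly_counts = [100, 98, 102, 101, 99, 500, 100, 97]
suspicious = flag_anomalies(hourly_counts)  # flags the 500-event hour
```

Even this toy filter shows both the promise and the fragility the article describes: it surfaces an obvious spike without human triage, but an adversary who hides in plain sight — keeping activity inside normal volume — passes straight through it.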
The Securities and Exchange Commission issued several Qualitative Research and Analytical Data Support (QRADS) indefinite delivery, indefinite quantity (IDIQ) contracts to TCG. The company, an information technology solutions and advisory services provider based in Washington, D.C., is now supporting four lines of business at the SEC—classified as Channels 2, 3, 4 and 5—according to recent company statements. TCG is one of several companies that received work under the multi-award IDIQs, with other winners on each channel.
Researchers at North Carolina (NC) State University have developed a new computational model that draws on normally incompatible data sets, such as satellite imagery and social media posts, to answer questions about what is happening in targeted locations. The model identifies violations of nuclear nonproliferation agreements.
Traffic on optical transport networks is growing exponentially, leaving cyber intelligence agencies in charge of monitoring these networks with the unenviable task of trying to sift through ever-increasing amounts of data to search for cyber threats. However, new technologies capable of filtering exploding volumes of real-time traffic are being embedded within emerging network monitoring applications supporting big data and analytics capabilities.
When National Science Foundation officials announced in February that three major providers of cloud computing were donating up to $9 million collectively for big data research, they already were looking for ways to broaden the effort to include a wider variety of topics, including cybersecurity. The expansion is intended to benefit both research and education initiatives and is necessary, in part, because the cloud providers now acquire cutting-edge hardware before it is made available to researchers.
Federal mandates and economic concerns are pushing businesses and government agencies to migrate their IT services to the cloud. As a result, decision makers must consider how to proceed in a way that meets compliance requirements in a timely, affordable and secure fashion.
Two data migration experts from seasoned commercial organizations recently offered their advice to organizations that are just setting out on the data migration trail or are well on their way but hitting a few bumps in the road.