If the pursuit of DNA-based data storage is a race, it is probably more of a long, arduous, challenge-laden Tough Mudder than a quick, straightforward 50-yard dash. Or it may be a tortoise and hare situation with data growing at an extraordinary pace while science moves steadily along in hopes of gaining the lead.
A deepfake is content produced with artificial intelligence-based technology that presents something that did not actually occur, whether text, audio, video or images. Deepfakes are a recent disruptive development in the technology world, and both the House and Senate are investigating the phenomenon and have introduced bills to combat these potentially dangerous creations.
In a dark, wet and rocky research coal mine in western Pennsylvania, teams from around the globe put their robotic systems to the test in the Defense Advanced Research Projects Agency's, or DARPA's, latest contest. The agency designed the Subterranean Challenge, also known as the SubT Challenge, to spur the advancement of technologies that work well underground, including autonomous and other robotic systems that could benefit first responders and the military, Timothy Chung, program manager in DARPA's Tactical Technology Office, explained to the media attending the event.
The modernization, proliferation and commoditization of electronics make contending with peer and near-peer adversaries more difficult, according to Chuck Hoppe, director of science, technology and engineering at the U.S. Army’s Combat Capability Development Command C5ISR Center. “For every good thing we bring out of technology, someone inevitably wants to use it for nefarious purposes. That has been the biggest change in the past 20 years, and it’s what made things significantly more deadly and lethal,” he says.
Want to be disruptive, I mean truly disruptive? Try delving into history while surrounded by software engineers and app developers. Watch how the presence of a book on Charles Babbage and Ada Lovelace in the 19th century raises eyebrows at your next scrum team meeting. Be passionate about the history of technology, and you will disrupt.
I recently completed a short course on the history of computer science. Accounts of generations of scientists and engineers stepping from one advance to the next through iterative problem solving provided rich detail about how computers progressed and about the thinking of those working to advance the broader field of study.
Government agencies face similar challenges when it comes to understanding, and gaining intelligence from, foreign-language content. They need to process, manage and gain insight from large volumes of content locked away in different formats, often across multiple languages, and they need to do all of this as quickly as possible. That is no mean feat when you consider the mind-boggling amount of content being generated: 90 percent of the world’s content was created over the past two years alone.
Blockchain has achieved enough recognition and use that it is no longer a fad, but neither is it a panacea. Companies and organizations are discovering limits to its usefulness as they embrace what they originally thought was the answer to all their concerns. While some of those hopes have gone unmet, the new cryptographic record-keeper is still evolving, and it ultimately may develop into a tool with utility far beyond current expectations.
Blockchain, the digital ledger technology, offers an immutable record of transactions based on a distributed consensus algorithm. The technology rose to prominence through bitcoin, the digital commodity, but experts say blockchain has moved well beyond that initial underpinning role. “Bitcoin is basically like the Model T of blockchain technology, because it was the first one,” says Lee McKnight, associate professor, School of Information Studies, Syracuse University, Syracuse, New York.
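To make the "immutable record" idea concrete, here is a minimal sketch of a hash-chained ledger in Python. It illustrates only the chaining that makes past records tamper-evident; the distributed consensus that real blockchains layer on top is omitted, and all names and structures are illustrative assumptions rather than any particular blockchain's design.

```python
# Minimal sketch of a hash-chained ledger: each block commits to the hash
# of the previous block, so altering any past entry breaks every later link.
# Illustrative only; real blockchains add distributed consensus on top.
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def add_block(chain: list, transactions: list) -> None:
    """Append a new block that commits to the previous block's hash."""
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({
        "index": len(chain),
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    })


def verify(chain: list) -> bool:
    """Recompute every link; tampering with any earlier block fails here."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )


ledger: list = []
add_block(ledger, [{"from": "A", "to": "B", "amount": 5}])
add_block(ledger, [{"from": "B", "to": "C", "amount": 2}])
print(verify(ledger))                           # True
ledger[0]["transactions"][0]["amount"] = 500    # tamper with history
print(verify(ledger))                           # False: the chain no longer validates
```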
Congressional leaders guiding the Congressional Blockchain Caucus are finding that part of their educational role is distinguishing between the infamous dark web uses of digital commodities and the groundbreaking capabilities a blockchain platform can offer as an advanced technology.
Blockchain, also described as a distributed cryptographic digital ledger, provides a verified record of transactions that is immutable, or unchangeable. Legislators contend that the powerful capability, which some say could transform the economy, can be applied well beyond digital commodities for use in such sectors as healthcare, defense, supply chain management and cybersecurity.
U.S. Army scientists are learning more about how the human brain functions so they can team its bearer with artificial intelligence (AI). The goal is for AI to understand a soldier’s moods and feelings and adjust its own actions accordingly.
Researchers aim for a future iteration of AI that would measure a soldier’s cognitive and physical state and trigger actions that would support, or even save, the individual in combat. These actions might direct the human on a different course, or ultimately initiate activities that complete the soldier’s mission or protect the individual in combat.
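No such system exists yet, but a purely hypothetical sketch can make the idea concrete: measure the soldier's state, then map it to a level of intervention. Every field name, threshold and action below is invented for illustration and does not reflect any actual Army research or fielded system.

```python
# Hypothetical sketch only: a rule-based "adaptive aid" that chooses its level
# of intervention from a soldier's measured state. All thresholds, field names
# and actions are invented for illustration.
from dataclasses import dataclass


@dataclass
class SoldierState:
    cognitive_load: float   # 0.0 (rested) .. 1.0 (overloaded), from physiological sensors
    heart_rate: int         # beats per minute
    time_on_task_min: int   # minutes since last rest


def choose_intervention(state: SoldierState) -> str:
    """Map the measured state to a level of AI assistance (illustrative rules)."""
    if state.cognitive_load > 0.85 or state.heart_rate > 180:
        return "take_over_task"           # AI completes the current task autonomously
    if state.cognitive_load > 0.6 or state.time_on_task_min > 90:
        return "suggest_alternate_route"  # redirect the human to a less demanding course
    return "monitor_only"                 # no intervention; keep sensing


print(choose_intervention(SoldierState(cognitive_load=0.9, heart_rate=150, time_on_task_min=30)))
# -> take_over_task
```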
Researchers at the Georgia Institute of Technology have created a new type of tiny 3D-printed robot that moves by harnessing the vibration from piezoelectric actuators, ultrasound sources or even tiny speakers.
The size of the world’s smallest ant, these “micro-bristle-bots” could sense changes in the environment and swarm together to move materials—or perhaps one day repair injuries inside the human body.
The National Science Foundation (NSF) is investing in a number of research institutes designed to advance quantum technologies in four broad areas: computation, communication, sensing and simulation. The institutes will foster multidisciplinary approaches to specific scientific, technological, educational and workforce development goals in quantum technology, which could revolutionize computer and information systems.
The fight to secure microelectronic chips is becoming as basic as the chip itself. With chips facing a myriad of threats throughout their life cycle, experts are incorporating security measures into the development of the chip from the foundry to assembly. Other approaches safeguard against threats that could appear as the chip moves through the supply chain. The bottom line for microelectronics security is that necessary measures cannot wait until the device is in the hands of the user.
Nanosized robots capable of crawling around on a person’s brain or underneath the skin may sound like a nightmare to some, but researchers suggest the mini machines could serve medical purposes such as gathering data on the brain or the spinal column.
The military and commercial industry rely heavily on semiconductor chips across the weapons, machines and systems that perform key defense and national security functions. And while the Defense Department and industry use secure chips, those chips are expensive and hard to design. To remedy that, the Defense Advanced Research Projects Agency, known as DARPA, is looking to build defense mechanisms into microchip designs automatically. The agency is creating tools to manage supply chain custody throughout the life cycle of a microchip and to improve the availability and affordability of secure microelectronics.
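The article does not describe DARPA's tools in detail, but a rough sketch can illustrate what a chain-of-custody record might look like in principle: each handler appends an event bound to everything recorded so far and signs it, so inserted or altered steps are detectable. The parties, keys and use of HMAC below are illustrative assumptions, not the agency's design; a real system would use asymmetric signatures and hardware roots of trust.

```python
# Illustrative sketch only: recording chain-of-custody for a chip as it moves
# from foundry to assembly to integrator. Each handler signs an event bound to
# the record so far. Party names and shared-secret keys are invented.
import hashlib
import hmac
import json

HANDLER_KEYS = {            # in practice: per-party private keys, not shared secrets
    "foundry": b"foundry-secret",
    "assembly": b"assembly-secret",
    "integrator": b"integrator-secret",
}


def sign_step(record: list, handler: str, event: str) -> None:
    """Append a custody event bound to everything recorded so far."""
    prior = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    entry = {"handler": handler, "event": event, "prior_digest": prior}
    tag = hmac.new(HANDLER_KEYS[handler],
                   json.dumps(entry, sort_keys=True).encode(),
                   hashlib.sha256).hexdigest()
    record.append({**entry, "tag": tag})


def verify_custody(record: list) -> bool:
    """Re-derive every digest and tag; any inserted or altered step fails."""
    for i, step in enumerate(record):
        prior = hashlib.sha256(json.dumps(record[:i], sort_keys=True).encode()).hexdigest()
        entry = {k: step[k] for k in ("handler", "event", "prior_digest")}
        expected = hmac.new(HANDLER_KEYS[step["handler"]],
                            json.dumps(entry, sort_keys=True).encode(),
                            hashlib.sha256).hexdigest()
        if step["prior_digest"] != prior or not hmac.compare_digest(step["tag"], expected):
            return False
    return True


custody: list = []
sign_step(custody, "foundry", "wafer fabricated, lot 42")
sign_step(custody, "assembly", "die packaged and tested")
print(verify_custody(custody))   # True
```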
As semiconductor manufacturers aim to produce devices at the 5-nanometer node, finding the tiny defects created inadvertently during fabrication becomes harder. In addition, there is a growing need to verify that a chip was built as specified and doesn't contain a malicious insertion. Harnessing optical methods for semiconductor wafer inspection is one way to effectively look for anomalies, says Lynford Goddard, professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign.
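One common, simplified way to frame such inspection is die-to-reference comparison: flag locations where a captured image deviates from a known-good reference beyond a threshold. The toy sketch below illustrates only that general idea, with small arrays standing in for microscope images; it is not the specific optical techniques developed by Goddard's group.

```python
# Toy sketch of reference-based wafer image inspection: compare a captured die
# image against a known-good "golden" reference and flag pixels whose intensity
# difference exceeds a threshold. Arrays stand in for real microscope images.
import numpy as np


def find_anomalies(image: np.ndarray, reference: np.ndarray,
                   threshold: float = 0.1) -> np.ndarray:
    """Return (row, col) coordinates where the image deviates from the reference."""
    diff = np.abs(image.astype(float) - reference.astype(float))
    return np.argwhere(diff > threshold)


# Simulated 8x8 "images": identical except for one defective pixel.
reference = np.zeros((8, 8))
captured = reference.copy()
captured[3, 5] = 0.5          # an unexpected bright spot, e.g. a particle or extra material

print(find_anomalies(captured, reference))   # [[3 5]]
```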
The discovery and taming of fire changed the way humans lived. Its broad range of uses came with both benefits and hazards. It could enable life in harsh environments, but it could also serve as an instrument of destruction. The same dichotomy holds true with social media today, but its ill effects cannot be easily extinguished.
With the explosion of artificial intelligence onto the computing scene again, the hype about the technology continues to grow. Even so, making sense of how to employ artificial intelligence (AI) and machine learning (ML) can still be difficult, experts reasoned Monday at the AI World Government conference, held in Washington, D.C., June 24-26.
Artificial intelligence (AI) research has enabled breakthroughs across almost every sector. The National Science Foundation (NSF), a leading funder of activities that support AI research and innovation, is joining other federal agency partners to announce the release of the 2019 update to the National Artificial Intelligence Research and Development (R&D) Strategic Plan.
The strategic plan was developed by the Select Committee on AI of the National Science and Technology Council (NSTC). The 2019 plan offers a national agenda on AI science and engineering.