Perception Guides the Future of Automatons

A cluster of robots races across the desert in this artist's concept from the Defense Advanced Research Projects Agency. The ability of robots to operate autonomously in diverse conditions will rely on advances in perception capabilities, according to a National Research Council report.

Seeing is believing if robots are to perform up to par.

The key to attaining the long-sought goal of fully autonomous unmanned ground vehicles may lie in their ability to recognize reality. Scientists pursuing the development of truly independent robotic vehicles are finding that perception is the key hurdle they must overcome. The development of these vehicles hinges on solving problems relating to perception and its data processing.

The goal of autonomous unmanned ground vehicles (UGVs) remains beyond the reach of existing technological capabilities. Separate system development has produced some gains in a few technologies, but progress has not been nearly enough to generate the advances necessary to achieve realistic autonomous vehicle operation. And, the discipline has yet to see broad-based system integration of these enabling technologies.

A partial road map of the future of robotics was laid out in a recent National Research Council (NRC) report that addressed U.S. Army requirements. Titled “Technology Development for Army Unmanned Ground Vehicles,” the report focuses on capabilities and possibilities in assessing the future of robotics. While its focus is specific to Army needs, the issues raised in the report apply to many other robot applications.

The report describes the development of perception technologies as the highest priority for autonomous mobility. Dr. Clinton W. Kelly III, senior vice president, advanced technology programs, SAIC, McLean, Virginia, worked on the study with several other robotics experts. He relates that insufficient funding and a lack of system focus have left the development of UGVs lagging unnecessarily.

Currently, autonomous ground vehicles can come close to human performance levels on well-delineated roads in dry weather during daytime. Vehicles can follow roads at up to 65 miles per hour under these conditions. However, at night or in adverse weather, the maximum speed drops significantly. On unimproved roads, UGVs can travel at top speeds approaching only 20 miles per hour. Robotic systems have difficulty dealing with existing tire tracks and other off-road conditions.

Perception tasks differ widely for on-road and off-road performance. For a UGV traveling on a paved road, the vehicle must find and follow the road; detect and avoid obstacles; detect and track other vehicles, particularly for a leader-follower operation; and detect and identify landmarks. This addresses the dual concerns of navigation and obstacle avoidance.

For off-road operation, the UGV must engage in a greater number of more complicated functions. It must follow a planned path subject to tactical constraints; find mobility corridors that enable the planned path or support replanning; detect and avoid obstacles; identify features that provide cover, concealment or vantage points; detect and identify landmarks; detect, identify and track other vehicles in formation; and detect, identify and track dismounted infantry.

In either setting, effective perception means object and terrain classification, which are combined to plot a route for safe passage by the vehicle. Identifying objects requires color sensors that can differentiate among spectral properties particular to objects such as grass, dirt or gravel, for example. If an autonomous vehicle cannot differentiate between rocks and clumps of grass, then it could find itself careening needlessly across a field as it strives to avoid “menacing” clumps of grass.

This necessary differentiation might be achieved by near-infrared sensing using bandwidths of 3 to 5 microns divided among several bands, Kelly offers. Object texture could be determined by assessing changes in contrast that occur in front of the vehicle. This works well for differentiating objects, Kelly relates, but it requires a large amount of computational power.
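
A rough sense of how contrast-based texture measurement works can be sketched in a few lines of Python. This is a minimal illustration, not the report's method; the patch size and threshold are assumed values chosen only for demonstration.

```python
import numpy as np

def local_contrast(image, window=8):
    """Per-patch standard deviation as a crude texture measure."""
    h, w = image.shape
    h, w = h - h % window, w - w % window            # crop to whole patches
    patches = image[:h, :w].reshape(h // window, window, w // window, window)
    return patches.std(axis=(1, 3))                  # one contrast value per patch

# Grass and rubble tend to score high; packed dirt or pavement low.
# The threshold is an illustrative assumption, not a tested value.
frame = np.random.default_rng(0).integers(0, 255, (64, 64)).astype(float)
texture_mask = local_contrast(frame) > 12.0
```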

The biggest challenge facing UGV perception is the relationship between stopping distance and viewing distance. As with any vehicle in motion, a faster speed requires a greater stopping distance. So, an autonomous vehicle moving along at a relatively fast speed must perceive obstacles and hazards at a greater distance than the same vehicle moving slowly. And, the faster vehicle must be able to adjust to the obstacles more quickly.

But rapid reaction is only part of the solution. A vehicle moving at about 5 miles per hour can make sudden course changes without putting itself at risk. However, the same vehicle moving at more than 30 miles per hour might risk a loss of control or even a rollover if it suddenly makes a sharp move.

Speed also affects the definition of an obstacle. A six-inch rock in the path of a slow-moving high mobility multipurpose wheeled vehicle (HMMWV) will do little to affect the vehicle’s progress. However, the same six-inch rock could cause damage or more significant problems to the same HMMWV if the vehicle is moving at 40 miles per hour.

An HMMWV traveling at 5 to 10 miles per hour would require 15 to 25 feet to stop and 15 to 22 feet to turn. The minimum size of an obstacle to avoid at this speed would be 12 inches in diameter. Accelerating to 20 miles per hour increases the stopping distance to 60 feet and the turning distance to 35 feet. And, that obstacle to be avoided may be as small as 6 inches in diameter. When the HMMWV is traveling at 40 miles per hour, it requires 110 feet to stop and 65 feet to turn. The minimum size of an obstacle to be avoided is down to as little as 4 inches.
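
Those figures fold in HMMWV-specific handling, but the underlying relationship is basic kinematics: distance covered during perception latency plus the braking distance v²/2a. The sketch below uses assumed reaction-time and deceleration values, so its outputs only roughly track the figures above.

```python
MPH_TO_FPS = 5280 / 3600        # miles per hour -> feet per second
G_FPS2 = 32.2                   # gravitational acceleration, ft/s^2

def stopping_distance_ft(speed_mph, reaction_s=0.5, decel_g=0.5):
    """Distance covered during perception/reaction latency plus the
    braking distance v^2 / (2a) from basic kinematics. The default
    reaction time and deceleration are illustrative assumptions."""
    v = speed_mph * MPH_TO_FPS
    return v * reaction_s + v ** 2 / (2 * decel_g * G_FPS2)

for mph in (10, 20, 40):
    print(f"{mph} mph -> about {stopping_distance_ft(mph):.0f} ft to stop")
```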

So, a faster vehicle not only must be able to perceive obstacles farther away, it also must be able to spot even smaller obstacles at those greater distances. The size of the obstacle that must be detected is inversely proportional to the speed of the vehicle. And, the closure rate is greater.

Detecting those smaller obstacles at greater distances will require more pixels on sensors, which in turn affects the sensors’ focal lengths and other attributes. Among the characteristics of long-distance sensing is a narrow field of view, which clashes directly with the wide field of view that is more desirable to a vehicle that may have to turn quickly. “If you want to maintain a fairly wide field of view, then you cannot look very far ahead; and if you cannot look very far ahead, then you must limit your speed,” Kelly states.
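
Kelly's tradeoff can be made concrete with small-angle geometry. The sketch below assumes a hypothetical sensor 1,024 pixels wide and a requirement of four pixels across an obstacle; it shows how quickly the allowable field of view narrows as the required detection range grows.

```python
import math

def max_fov_deg(obstacle_ft, range_ft, pixels_on_target=4, sensor_width_px=1024):
    """Widest horizontal field of view that still puts `pixels_on_target`
    pixels across an obstacle of the given size at the given range
    (small-angle approximation). Parameter values are assumptions."""
    ifov_rad = (obstacle_ft / range_ft) / pixels_on_target
    return math.degrees(ifov_rad * sensor_width_px)

# A 4-inch rock at a 110 ft stopping distance versus a 500 ft look-ahead:
for r in (110, 500):
    print(f"range {r} ft -> max FOV {max_fov_deg(4 / 12, r):.1f} degrees")
```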


Effective sensor systems may hold the key to unmanned ground vehicles (UGVs) taking over roles normally filled by Army battlefield personnel. One possible solution may be to place multiple sensors at different heights on robotic vehicles.

This problem is complicated if the obstacles are hidden in dense undergrowth such as weeds or tall grass. Large rocks or fallen trees could be concealed effectively by dense grass, and one of them could prove the undoing of a UGV. Even upright trees with relatively small diameters could confound a vehicle’s detection system until it is too late.

These ground obstacle detection challenges focus on physical obstacles such as rocks. Another problem facing UGVs is that of negative obstacles—holes in the ground. A ditch large enough to stop a vehicle in its tracks may not be visible until the vehicle is literally on top of it. A sensor positioned about two feet above the ground will not see a hole in time for the UGV to avoid it, and this problem applies to active as well as passive sensors. One way to improve the vehicle’s chances is to place the sensor at least five feet above the ground.
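
Flat-ground geometry shows why height helps: by similar triangles, the sight line over a ditch's near edge exposes a depth of roughly h × w / d of the far wall at range d, so detection range grows linearly with sensor height h. The ditch width and visibility threshold in this sketch are illustrative assumptions.

```python
def negative_obstacle_range_ft(sensor_height_ft, ditch_width_ft,
                               min_visible_depth_ft=0.5):
    """Farthest range at which the sight line over a ditch's near edge
    still exposes `min_visible_depth_ft` of its far wall. By similar
    triangles the exposed depth at range d is roughly h * w / d."""
    return sensor_height_ft * ditch_width_ft / min_visible_depth_ft

for h_ft in (2, 5):
    r = negative_obstacle_range_ft(h_ft, ditch_width_ft=3)
    print(f"sensor at {h_ft} ft sees a 3 ft wide ditch out to ~{r:.0f} ft")
```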

With all these hazards and technological limitations in mind, researchers have had to consider speed targets for UGVs. Adopting a high UGV speed as a design objective imposes other, possibly unattainable, requirements. One Army program aimed at a 40-mile-per-hour on-road travel speed in daytime and 20 miles per hour off-road. For nighttime, the UGV would travel at 10 miles per hour on- or off-road. However, infantry exercises showed that these speeds would not enable a UGV to keep up with the force, which may travel at upwards of 50 or 60 miles per hour. So, these Army objectives may not be high enough.

Passive stereovision has shown promise in detecting obstacles in roads. Researchers have produced positive results over long distances using telephoto lenses to produce a three-dimensional representation of terrain ahead of a vehicle. This works especially well with trees. However, this approach does not always provide identification of an object. Some objects might not be obstacles, but their detection as such could subject the vehicle to unnecessary maneuvers or even course changes.

The ultimate solution may be active vision. Wide-angle sensors would scan the near field of view, while others would look farther out, with less detail, to generate cues about items or conditions that bear closer examination. Meanwhile, a sensor on a pan-tilt mount would foveate on the high-interest areas in the same way that people optically pick items out of a scene.

This kind of perception system would best consist of multiple sensors rather than a single multispectral sensor, Kelly posits. Passive stereovision sensors do not provide the necessary resolution to be truly effective. Active sensors have that resolution, but they do not have the necessary range.

Extending perception range is a priority, Kelly maintains. A UGV that cannot look ahead and identify environmental features more than 50 meters away cannot move quickly. This task is compounded in a tactical environment, where that type of identification may be essential to the vehicle’s survival. That vital range may extend as far out as 1,000 meters.

The NRC report cites several areas where sensors could be improved. These include increased resolution; better spectral differentiation; a multiband forward-looking infrared (FLIR) system for nighttime running, preferably uncooled and without latency problems; and less noisy low-light-level television for stereovision.

The Defense Advanced Research Projects Agency’s (DARPA’s) PerceptOR program is focusing on perception in off-road vehicle use. Its final examinations require a vehicle to operate autonomously in unknown terrain. Kelly reports that the vehicle in use on that project features color stereovision, stereo FLIR and a variety of laser scanners. Engineers also tested a 2-gigahertz high-frequency radar for detecting rocks hidden in grass, Kelly relates.

However, he warns against proliferating the number of sensors. It can be "a logistics nightmare" to build a system rife with different kinds of sensors. Data fusion would be a better approach, he says. Symbolic fusion, in which data processing combines different feature vectors from the sensors, could yield significant dividends.
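
In rough outline, feature-level fusion might look like the sketch below, in which per-sensor feature vectors are concatenated and handed to a single classifier. The feature names, values and decision rule are hypothetical placeholders, not the report's design.

```python
import numpy as np

# Hypothetical per-sensor feature vectors for one terrain patch.
color = np.array([0.21, 0.48, 0.31])     # normalized band responses
texture = np.array([0.85])               # local-contrast score
geometry = np.array([0.12, 0.02])        # height mean/variance from a ladar scan

# Fuse at the feature level rather than the raw-pixel level, then hand the
# combined vector to a single classifier (a trained model in practice).
fused = np.concatenate([color, texture, geometry])

def classify(features):
    # Placeholder rule standing in for a trained classifier.
    return "vegetation" if features[3] > 0.5 and features[5] < 0.1 else "solid obstacle"

print(classify(fused))
```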

One approach to meeting this challenge may be to set up perception zones. For example, a vehicle would regard objects within 50 meters as within a reactive zone. The UGV would focus on obstacle avoidance in this range. From 50 meters to 500 meters would be the deliberative zone, where the vehicle focuses largely on navigation. From 500 meters to 1,000 meters out would be the tactical zone, where the UGV concentrates on terrain features that would be important for concealment or maximum traversability.
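
A minimal data structure for such zones, using the boundary figures above, might look like this sketch.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Zone:
    name: str
    near_m: float
    far_m: float
    focus: str

# Boundaries follow the article's illustrative figures.
ZONES = [
    Zone("reactive", 0, 50, "obstacle avoidance"),
    Zone("deliberative", 50, 500, "navigation"),
    Zone("tactical", 500, 1000, "concealment and traversability"),
]

def zone_for(range_m: float) -> Optional[Zone]:
    """Map a sensed feature's range to the zone whose logic handles it."""
    return next((z for z in ZONES if z.near_m <= range_m < z.far_m), None)

print(zone_for(120).focus)   # -> navigation
```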

The potential for different actions within each zone requires that the vehicle avoid contradictory actions. One arrangement, known as behavior arbitration, is a set of code built into the architecture. This set has rules for choosing among behaviors if the robot develops conflicting objectives.
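
One common arbitration scheme resolves conflicts by fixed priority, as in the sketch below; the behavior names are illustrative, and the actual rule sets in UGV architectures may be far richer.

```python
# Fixed-priority arbitration: when behaviors propose conflicting commands,
# the highest-priority behavior that spoke up wins. Names are illustrative.
PRIORITY = ("avoid_obstacle", "follow_path", "seek_concealment")

def arbitrate(proposals):
    """proposals maps behavior name -> steering command."""
    for behavior in PRIORITY:
        if behavior in proposals:
            return behavior, proposals[behavior]
    return "idle", None

print(arbitrate({"follow_path": "steer 0", "avoid_obstacle": "steer -20"}))
# -> ('avoid_obstacle', 'steer -20')
```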

Ultimately, the key to effective perception may be that the UGV learns as it proceeds. Current technology has permitted development of machines that can learn from their own performances, but not as much has been achieved in the area of perception, Kelly offers. He believes that this development may be the next breakthrough in the discipline. 

Another avenue may be embedded computing. Embedding a sensor with gigaflops of computing power could produce an image with texture for discerning different types of terrain. This would be especially useful in applying contextual data to obstacle avoidance—knowing the difference between a puddle of water, for example, and a ditch full of water.

Part of the problem is that the defense establishment has not spent nearly enough on robotics, Kelly charges. He describes expenditures as episodic, noting that funding would peak at about $5 million per year during a surge. However, the magnitude of the task requires substantially more.

“You really are trying in a sense to replicate a lot of the human visual system,” he observes. “That is a daunting task.”

Another problem with funding is that much of it has been spread out across many institutions. This may work to advance the state of research in diverse areas, but to produce a product—or even a prototype—this “progress at the canonical level” does not come together at the system level. Officials must increase the focus on the system level where robotic components are integrated, Kelly offers.

Adequate funding would not require huge amounts of money, Kelly maintains. Merely creating an organization dedicated to UGV development and then funding it with about $10 million annually over five years should generate military-useful robotic technologies. “For that kind of money, you could make astounding progress—if you had a dedicated group, kept the team together and just worked the problem,” he warrants.

The NRC report calls for the Army to give top priority to the development of perception technologies, including creation of a “Skunk Works” type of facility. The focus of this facility would be the development of perception technologies enabling autonomous A-to-B mobility that can be fielded with multiple UGV systems. Only a Skunk Works approach similar to that used by the Lockheed Martin facility will bring together the necessary resources, focus and leadership, the report emphasizes.

Additional information on the National Research Council’s report on U.S. Army unmanned ground vehicles is available on the World Wide Web at http://books.nap.edu/books/0309086205/html/116.html


Robots Face Lofty Army Goals

If the U.S. Army wants to use unmanned ground vehicles (UGVs) to reduce the size of the force, these vehicles must feature a considerable amount of autonomy. "If you take the people out of the vehicle, but don't take the people out of the force, then you haven't met the objective of substantially lightening the force by reducing the logistics burden," says SAIC's Dr. Clinton W. Kelly III of a National Research Council report on Army unmanned ground vehicles. "You reduce the logistics burden in large measure by reducing the amount of people."

The minimum degree of autonomy that is useful is A-to-B mobility, the report continues. The destination assigned to the vehicle is B, while the vehicle knows it is at location A. Given this information, the vehicle must plan a route between those two points and also detect and avoid any obstacles that arise during its travels. This was the approach taken for the Defense Advanced Research Projects Agency’s Grand Challenge (see page 54).
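
A toy version of the route-planning half of A-to-B mobility is a breadth-first search over an occupancy grid, as sketched below; a fielded planner would use far richer terrain cost maps and replan continuously as perception reports new obstacles.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid (0 = free, 1 = blocked).
    Returns a list of (row, col) cells from start to goal, or None."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:           # walk parents back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and step not in came_from):
                came_from[step] = cell
                frontier.append(step)
    return None

print(plan_route([[0, 0, 1], [1, 0, 0], [1, 1, 0]], (0, 0), (2, 2)))
# -> [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2)]
```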

Fulfilling this task on a paved, lined road is not too difficult. Simple navigation can allow the vehicle to proceed along a clearly marked path. Achieving autonomous operation across an unpaved field full of rocks and ditches is another matter. This brings sensors and processing algorithms into play. The degrees of autonomy that an existing UGV can attain vary according to terrain, weather conditions, time of day and the specific task asked of the vehicle.

“If you live in an environment where you know a lot, then perception is less important,” Kelly allows. “However, if you stick a vehicle in unknown terrain, then perception is the only way [to succeed].”

Kelly offers that, at the highest level, the goal is a robot that can accomplish its tasks merely by being given orders, regardless of conditions. At the other extreme is a vehicle that must be controlled by humans moment to moment.

Kelly draws the distinction between remote control and teleoperation by defining remote control as operation of a vehicle that is within sight of the operator. The operator guides the vehicle while watching it. Teleoperation takes place when the human operator cannot see the vehicle and instead must rely on imagery from the vehicle’s own sensors for control.

Most of the robotic vehicles used by the Army today—particularly those deployed for countermine operations—are teleoperated rather than autonomous. While autonomous vehicles are the goal, for the foreseeable future most Army UGVs will require human control on the battlefield, Kelly states. However, the operator also may be in a moving vehicle. Teleoperating a moving vehicle from another moving vehicle presents challenges of its own.

For the Army’s Future Combat Systems (FCS), the service will require UGVs that can operate to varying degrees of autonomy. Two of the FCS vehicles, the armed reconnaissance vehicle and the mule, will require the ability to conduct autonomous collaborative operations. The current state of the art is far from that, Kelly maintains.

The key to attaining autonomy is good perception, the NRC report states. And, it is in this arena that the state of the art needs the most improvement.