
Mine-Hunting Technology Learns to Fight Cancer

February 2012
By George I. Seffers, SIGNAL Magazine


One application of FARSIGHT is to map the structure of complex tissues, including the structurally intricate brain tissue of a rat. The software, which has been enhanced with Office of Naval Research algorithms for locating and identifying undersea mines, is used for a wide array of medical purposes.

Doctors recruit software designed to save lives at sea.

The U.S. Navy and the medical community share seemingly different but surprisingly similar problems—finding undersea mines and identifying certain cells, such as cancer cells. And they have discovered that software designed by the Navy to locate undersea mines also contributes to faster, more accurate diagnoses of diseases and can foster medical breakthroughs.

The Office of Naval Research (ONR), Arlington, Virginia, is developing active-learning software for identifying undersea mines. The goal is to make underwater mine-hunting robots smarter, eliminating the need for divers to risk their lives by closely inspecting potentially explosive objects. The software supports the mine-hunting mission by drawing attention to and asking questions about unidentified objects. Once the Navy user answers those questions, the software immediately learns from the experience so that it becomes better at spotting mines.

“Within the world of naval mine countermeasures, our overarching goals are to find mines faster and get the man out of the minefield. That’s what we try to do,” explains Jason Stack, the ONR program officer in charge of the effort. “The active learning algorithms work with humans to help identify mines.”

To illustrate how the software works, Stack paints the scenario of a sailor viewing sonar imagery in search of threats. The active-learning software spots an object it has never seen before, something that may or may not be a mine. A message box pops open, querying the user about the nature of the object, and if the user chooses to label the item as either a mine or something other than a mine, the software learns what it is and will not need to ask the next time. “It’s active. That means it’s really interactive, so the software can ask questions of the human. Typically, a question is: ‘Can you provide a label for this piece of data,’” Stack explains.
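The query-and-learn loop Stack describes can be sketched in a few lines. This is a hypothetical illustration, not ONR's actual algorithm: the `ActiveLearner` class, the nearest-example matching, and the distance threshold are all assumptions chosen to show the pattern of asking the human only for unfamiliar data and remembering the answer.

```python
import math

class ActiveLearner:
    """Minimal sketch of an active-learning classifier: it answers on its
    own when a new input resembles something already labeled, and otherwise
    queries the human operator and stores the reply."""

    def __init__(self, ask_fn, threshold=2.0):
        self.examples = []       # (feature_vector, label) pairs seen so far
        self.ask = ask_fn        # callback that poses the question to a human
        self.threshold = threshold

    def _nearest(self, x):
        # Find the closest labeled example, if any.
        best = None
        for feats, label in self.examples:
            d = math.dist(feats, x)
            if best is None or d < best[0]:
                best = (d, label)
        return best

    def classify(self, x):
        best = self._nearest(x)
        if best is not None and best[0] <= self.threshold:
            return best[1]                 # confident: reuse the earlier answer
        label = self.ask(x)                # novel object: ask the operator
        self.examples.append((x, label))   # learn, so it need not ask again
        return label

# Usage: the first unfamiliar sonar contact triggers a question;
# a similar contact seen later does not.
questions = []
def operator(x):
    questions.append(x)
    return "mine"

learner = ActiveLearner(operator)
print(learner.classify((1.0, 2.0)))   # asks the operator, returns "mine"
print(learner.classify((1.1, 2.1)))   # close to a known example, no question
print(len(questions))                 # 1
```

The key property is in the last line: after one human answer, the software classifies the similar follow-up contact by itself, which is the "will not need to ask the next time" behavior described above.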

Doctors, meanwhile, face a similar problem: identifying specific cells in human tissue. Physicians sometimes must view hundreds of microscopic images containing millions of cells. It can take weeks for a pathologist to manually pinpoint cells in 100 images.

To aid the cell identification process, doctors commonly use an open-source computer program known as Fluorescence Association Rules for Quantitative Insight (FARSIGHT). The program was developed with funding provided by the Defense Advanced Research Projects Agency and the National Institutes of Health. FARSIGHT identifies cells based on a subset of examples initially labeled by a physician.

Identifying cells is critical to a proper diagnosis and treatment. But diagnosing the specific type of cancer can be challenging, making treatment more difficult. “Cancer is not one disease. It is a whole family of diseases, and identifying the proper type of cancer is hugely important,” explains Badri Roysam, chairman of the Electrical and Computer Engineering Department at the University of Houston in Texas. Roysam is also the principal investigator for FARSIGHT.

Knowing the type of cancer allows doctors to use targeted therapy and to have an actual understanding of the prognosis, Roysam says. “But if you don’t know which subtype it is, if there is uncertainty in that, then you end up giving the patient a cocktail of multiple drugs. That is not only expensive but possibly harmful because it can have side effects.”

FARSIGHT’s results, however, can be erroneous because the computer applies tags based on a small sampling. In addition, the quality of training for FARSIGHT users can be hit-or-miss, so the results vary widely, Stack explains. “The fundamental problem is totally dependent upon the training set. Every doctor on the planet who uses FARSIGHT trains it in a different way, so the performance is all across the board,” he contends. “Sometimes it’s great. Sometimes it’s terrible. Sometimes it’s in between. And it totally depends on who is using the tool and how they were trained.”

FARSIGHT includes automatic target recognition software and a graphical user interface. Training includes providing a data set, such as a host of cell samples with corresponding labels indicating whether or not the cells are cancerous. FARSIGHT allows users to build their own training sets, but the program does not adapt—until now.

ONR officials recently announced that researchers have enhanced FARSIGHT by integrating their mine-hunting active-learning algorithms, dramatically improving FARSIGHT’s accuracy and consistency. The collaboration between ONR and FARSIGHT developers began in mid-2011, Roysam reports, and he says the enhanced system already is being used in some medical centers. Because FARSIGHT is an open-source program easily available on the Web, Roysam explains, the enhanced version likely will be used around the world someday.


A U.S. Navy explosive ordnance disposal diver attaches an inert charge to a training mine during exercises near the naval base in Guantanamo Bay, Cuba. The Office of Naval Research is developing active-learning software with the goal of removing sailors from the minefield.

Larry Carin, a professor of electrical engineering at Duke University, Durham, North Carolina, describes the results as “spectacular” and says in the Navy announcement, “This could be a game-changer for medical research.”

Stack agrees. “It’s huge. It’s absolutely huge,” he says, emphasizing the benefits to training users. “With active learning, you don’t have a doctor trying to guess how to teach FARSIGHT; and on the mine warfare side, you don’t have a scientist in a laboratory somewhere trying to guess how to teach a piece of software how to recognize mines. What you have is a program that can ask you to help it understand so that you get the absolute best performance every time.”

William Lee, an associate professor of medicine, hematology and oncology at the University of Pennsylvania in Philadelphia, explains that doctors have not been studying endothelial cells because doing so is simply too time-consuming. With active learning integrated into FARSIGHT, however, the process is now automated and “highly accurate.” The enhanced FARSIGHT can accomplish in a few hours what once would have taken days or weeks.

Stack explains that the active-learning software is equally successful in mine hunting and medicine because it can be used to identify objects in any category, including movies and music. “You can teach the algorithms to identify whatever you want them to. It’s a generic capability,” he says.

Roysam says the technology also is being used to benefit the Defense Department’s research into neuroprosthetic devices—robotic arms and legs connected directly to the brain for more natural control and movement. Currently, neural implants stop working after about six weeks because the brain grows cells around the implant, isolating it from the rest of the body. “We’re looking at the roles of specific brain cells in the tissue reactive response. We have been using the algorithm to analyze that. We’ve made some breakthroughs, but we are not ready to announce yet,” he reports.

The ONR’s active-learning effort is an applied research program, and the team is currently in the process of transitioning the software to the Program Executive Office (PEO) for Littoral Combat Ships. Like most software programs, it will be delivered to the customer in iterations. The PEO integrated the active-learning algorithm into a prototype system in 2010 and is currently working on the second version of that prototype. The third and final version is scheduled for completion by the end of this year. Once the final prototype is developed, the software is scheduled to transition into a program of record known as Network-Centric Sensor Analysis for Mine Warfare, which seeks to improve mine-hunting accuracy in part by reducing false alarm rates.

Right now, the state of the art in active-learning technology is software that asks simple questions requesting that the user label a piece of data, such as whether or not an image is a mine, Stack explains. One of the challenges is to develop a program that asks the right questions at the right time and in the right number. “If I have a piece of software that can ask questions, you can imagine there might be some usability issues there. You obviously don’t want it asking questions every five seconds. It needs to ask just enough to learn and to do a good job,” he says.
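The usability constraint Stack raises, asking just enough questions rather than interrupting constantly, is often handled by querying only the most ambiguous items under a fixed budget. The sketch below is an illustrative assumption, not the program's actual logic: the margin-based uncertainty rule, the `budget` cap, and all names are hypothetical.

```python
def select_queries(scores, margin=0.2, budget=3):
    """scores: list of (item_id, p_mine) pairs, where p_mine is the model's
    confidence in [0, 1] that the item is a mine. Return the ids worth
    asking the operator about: those closest to the 0.5 decision boundary,
    capped at `budget` questions."""
    uncertain = [(abs(p - 0.5), item) for item, p in scores
                 if abs(p - 0.5) <= margin]
    uncertain.sort()                       # most ambiguous contacts first
    return [item for _, item in uncertain[:budget]]

# Six sonar contacts: two are clear calls (c1, c3); the rest are ambiguous.
contacts = [("c1", 0.95), ("c2", 0.52), ("c3", 0.10),
            ("c4", 0.45), ("c5", 0.60), ("c6", 0.49)]
print(select_queries(contacts))   # → ['c6', 'c2', 'c4']
```

The confident contacts never generate a question, and even though four contacts fall inside the ambiguity margin, the budget holds the software to three interruptions, which is the "ask just enough to learn" behavior described above.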

The differences between the first prototype and the final version likely will be an expansion of the types of questions asked, Stack reveals. “We are trying to fundamentally extend the definition of active learning and to enable the algorithm and the human to have more of a conversation,” he says. He envisions software that not only can ask what an object is, but also can ask for additional feedback—for example, what is it about an item that makes it not a mine?

For his part, Roysam says FARSIGHT will continue to be enhanced as well. For example, he would like to see the system made even faster than it currently is. “It is developed enough to have value right now, but we are going to continue to work on improving it even further. That’s just our nature.”
