Advancements in Brain Research Illuminate Robotic Future
By studying fruit flies, scientists are gaining new insights into how neurons in the brain are connected and how they send signals to control movement. A closer understanding of these neural pathways may offer a foundation for developing more advanced, “bio-inspired” robotic behaviors.
For one scientist at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia, the confluence of microscope advancements, artificial intelligence (AI) and neurobiology has led to brain mapping and modeling on a scale not seen before.
“I’ve worked in developing this new field that we now call Connectomics, which is the science of mapping the neural network of the brain,” explained Srinivas Turaga, group leader, Turaga Lab at Janelia. “It is the nature of that neural network and exactly how it’s wired that we believe determines how our brains work. The challenge is that we can see individual neurons, but our brains are made up of a huge number of neurons, and if you want to look at every neuron and see all of its connections, it’s quite a [task].”
Connectomics combines advanced 3D electron microscopic technologies, AI and neurobiology to create high-resolution maps at the single cell or neuron level. And while researchers in the past have relied on magnetic resonance imaging (MRI) machines to figure out how regions of the brain are connected—the brain is composed of billions of neurons—the MRI methods provided only a “very coarse understanding” of neural pathways. With the lab’s microscope, “you can now see every wire as a neuron spreads out in the brain and makes connections,” Turaga explained.
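Conceptually, the map that connectomics produces is a directed graph: each neuron is a node, and each connection is an edge weighted by the number of synapses observed in the microscopy. A minimal sketch of that idea follows; the neuron names and synapse counts are invented for illustration and are not from the Janelia dataset.

```python
# A connectome as a directed, weighted graph: neuron -> {target: synapse_count}.
# Neuron names and counts below are invented for illustration.
connectome = {
    "T4a": {"LPi": 12, "CT1": 3},
    "LPi": {"T5b": 7},
    "T5b": {"CT1": 5},
    "CT1": {},
}

def downstream(neuron, min_synapses=1):
    """Neurons receiving at least `min_synapses` synapses from `neuron`."""
    return [t for t, n in connectome.get(neuron, {}).items() if n >= min_synapses]

print(downstream("T4a", min_synapses=5))  # → ['LPi']
```

Real connectome releases use essentially this structure at scale, with hundreds of thousands of nodes and millions of weighted edges.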
To understand neural pathways, researchers are using deceased, adult fruit flies, known as Drosophila melanogaster. “The fruit fly has a tiny brain the size of a poppy seed—it only has 100,000 neurons,” he noted.
One area of research is a collaborative effort with Turaga’s current and former colleagues at Janelia, and scientists at Princeton University and the University of Cambridge. Turaga is specializing in deriving simulations of the fruit fly brain based on Connectomics. “The microscopy was done at my institute and then it was reconstructed,” Turaga said. “So basically, we had this 3D, very high-resolution image and we needed to trace all of the wires, the neurons inside of it. That tracing was done at Princeton through a combination of AI and researchers checking for any mistakes. It was a huge amount of work. Our plan that was hatched about 15 years ago has now come to fruition.”
As a stand-in for humans, the fruit fly allows scientists to hypothesize on how changes to neural centers could impact actions, movements or behaviors. “We know from theory if you change the connectivity, that changes how the neural networks work,” Turaga said. “What we want to try to understand is how does a normal neural network in the brain work.”
“Fruit fly nervous systems very efficiently generate complex behaviors,” he added. “They’re incredibly efficient, both in terms of energy and the number of neurons that they use. They can navigate the world, fly around, walk around, look for mates and conduct an elaborate courtship ritual, look for food, escape predators, all with just 133,000 neurons. That’s extremely efficient computations that it can do.”

Since the human brain, however, has an estimated 86 billion neurons, the researchers can only dream of mapping that neural network. In the meantime, researchers at Janelia, Princeton, Harvard University, the Allen Institute and others are using deep learning and computer modeling to begin scaling up from a fruit fly’s brain to the whole mouse brain, which would be 1,000 times bigger than anything mapped before in terms of modeled neural activity. Here, Turaga’s experience with MIT Lincoln Laboratory, together with his AI and neuroscience research, has led to the development of many models. And by examining the fly’s visual system—which, since the fly is a highly visual animal, makes up two-thirds of its brain—the researchers can understand additional neural processes that may translate to the research in mice.
“We are modeling a significant part of the visual system of the fruit fly,” he noted. “It is basically the neural network that processes that visual information. Our model, using AI methods, can take all these measurements about how different neurons are connected with each other. And then because those measurements are made from a dead brain of a fly, we’re trying to make predictions of how the living brain works and what visual information is transformed and processed in the visual system of a fly. We haven’t had this level of understanding before, and the fruit fly is where we think we can make the most progress because it’s a small nervous system.”
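The core idea of a connectome-constrained model can be sketched in a few lines: the measured wiring diagram fixes which neurons connect to which, and simulating the resulting network predicts activity in the living brain. The sketch below is a toy firing-rate network, not the Janelia model; the network size, weights and stimulus are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy network: 50 neurons whose wiring (who connects to whom)
# is fixed by a hypothetical "connectome" adjacency matrix.
n_neurons = 50
connectome = rng.random((n_neurons, n_neurons)) < 0.1   # measured wiring (toy)
signs = np.where(rng.random(n_neurons) < 0.7, 1.0, -1.0)  # excitatory/inhibitory
weights = connectome * signs[np.newaxis, :] * 0.2       # strengths: free parameters

def step(rates, visual_input, dt=0.05, tau=1.0):
    """One Euler step of a simple firing-rate model constrained by the wiring."""
    drive = weights @ rates + visual_input
    return rates + dt / tau * (-rates + np.maximum(drive, 0.0))

# Simulate the response to a brief "visual" stimulus on a few input neurons.
rates = np.zeros(n_neurons)
stimulus = np.zeros(n_neurons)
stimulus[:5] = 1.0
for t in range(200):
    rates = step(rates, stimulus if t < 100 else np.zeros(n_neurons))
```

In a real connectome-constrained model, the connectivity matrix comes from the reconstructed wiring diagram, and the free parameters (connection strengths, time constants) are fit so the simulation reproduces measured responses.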
And while Turaga is not ready to draw a direct line between what scientists will learn about the human brain and what will be the most useful in robotics, he is hopeful about that connection and about the developing field of neuro AI.
“I think there is some crude similarity between the modern AI systems and an animal’s nervous system,” he said. “You can ask the question, how does this AI system that was created to solve this problem solve it and compare it to how the brain in a living organism solves it. And this approach is called neuro AI. It’s a new field. And it tries to build bridges between how AI systems that have been trained to solve a particular problem solve that problem and compare it to how an animal does.”
Turaga will continue helping to expand the field of Connectomics and brain mapping in the United States. He sees several key research projects in the community about to be published that will provide new insights into neural pathways. “I think this is a historic year for the field of Connectomics.”
Meanwhile, scientists at the Neuroengineering Laboratory of the Brain Mind Institute in Switzerland and Janelia are examining the connection between neurobiology, neuroengineering and robotics.
Published in the March issue of “Nature Neuroscience” (Volume 26, pages 682-695), the study, “Ascending Neurons Convey Behavioral State to Integrative Sensory and Action Selection Brain Regions,” by Chin-Lin Chen, Florian Aymanns, Ryo Minegishi and other researchers, emphasizes the key role of ascending neurons—neurons that carry signals from an animal’s spinal cord, or an insect’s ventral nerve cord, up to the brain. They looked at the properties of these neurons in the motor systems of fruit flies and how the flies select stable movements and appropriate physical actions.
By understanding the dynamic sensory cues that a fruit fly’s motor system projects to the brain, the scientists hope to understand how future behaviors are identified and selected, which could be applied to the neuroengineering of robots capable of generating adaptive behaviors.
“To generate adaptive behaviors, animals and robots must not only sense their environment but also be aware of their own ongoing behavioral state,” Chen, Aymanns, Minegishi and their colleagues said. “Knowing if one is at rest or in motion permits the accurate interpretation of whether sensory cues, such as visual motion during feature tracking or odor intensity fluctuations, result from exafference (the movement of objects in the world) or reafference (self-motion of the body through space with respect to stationary objects). Additionally, being aware of one’s current posture enables the selection of future behaviors that are not destabilizing or physically impossible.”

In the adult Drosophila melanogaster, the ascending neurons encode behavioral states, bringing information on the fly’s movements to its brain as well as to a higher-order neuron center known as the gnathal ganglia that selects locomotion. The scientists’ breakthrough was understanding what the ascending neurons communicated and where they conveyed the signals in hundreds of neurons in the fly, which was previously unknown.
“We reveal that ANs [ascending neurons] encode behavioral states, specifically conveying self-motion to the anterior ventrolateral protocerebrum, an integrative sensory hub, as well as discrete actions to the gnathal ganglia, a locus for action selection,” Chen, Aymanns, Minegishi and their colleagues said. “Thus, ascending populations are well poised to inform distinct brain hubs of self-motion and ongoing behaviors, and may provide an important substrate for computations that are required for adaptive behavior.”
These researchers also depended on AI and large-scale functional imaging advancements—in this case, a multi-camera array and 3D video processing—to understand a fly’s movements.
“We precisely quantified joint angles and limb kinematics using a multi-camera array that recorded behaviors during two-photon imaging,” the researchers stated. “We processed these videos using DeepFly3D, a deep-learning-based 3D pose estimation software. By combining these 3D joint positions with recorded spherical treadmill rotations (a proxy for locomotor velocities), we could classify behavioral time series to study the relationship between ongoing behavioral states and neural activity.”
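The classification step the researchers describe can be sketched simply: combine the treadmill’s rotation (a proxy for walking speed) with frame-to-frame limb motion from 3D pose estimation to label each video frame with a behavioral state. The sketch below is a toy illustration, not DeepFly3D or the study’s classifier, and the thresholds and behavior labels are invented for illustration.

```python
import numpy as np

def classify_frames(forward_vel, joint_angles, vel_thresh=0.5, motion_thresh=0.05):
    """Label each video frame with a coarse behavioral state.

    forward_vel: (T,) treadmill forward velocity, a proxy for walking speed.
    joint_angles: (T, J) per-frame joint angles from 3D pose estimation.
    Thresholds and labels are illustrative, not values from the study.
    """
    # Frame-to-frame limb motion: mean absolute change in joint angles.
    limb_motion = np.abs(
        np.diff(joint_angles, axis=0, prepend=joint_angles[:1])
    ).mean(axis=1)
    states = np.full(forward_vel.shape, "rest", dtype=object)
    states[limb_motion > motion_thresh] = "groom"  # limbs move, ball does not
    states[forward_vel > vel_thresh] = "walk"      # ball rotates forward
    return states

# Toy example: 6 frames, 3 joints; the fly walks in the middle frames.
vel = np.array([0.0, 0.0, 1.2, 1.5, 0.1, 0.0])
angles = np.array([[0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.3, 0.2, 0.1],
                   [0.6, 0.1, 0.4],
                   [0.6, 0.1, 0.4],
                   [0.6, 0.1, 0.4]])
print(classify_frames(vel, angles))  # → rest, rest, walk, walk, rest, rest
```

The resulting per-frame behavior labels can then be aligned with the simultaneously recorded neural activity, which is the relationship the study examines.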
By understanding a fly’s movements and where and how ascending neurons carry that information to the brain, the researchers provide a groundbreaking picture of how behaviors and actions occur. “Taken together, these data provide a first large-scale view of ascending signals to the brain, opening the door for a cellular-level understanding of how behavioral states are computed and how ascending motor signals allow the brain to contextualize sensory signals and select appropriate future behaviors,” the researchers noted.
“Our finding that ascending neurons encode behavioral states and convey these signals to integrative sensory and action selection centers in the brain may guide the study of such neurons in the mammalian spinal cord and also accelerate the development of more effective bio-inspired algorithms for robotic sensory contextualization and action selection.”