UC Santa Barbara Maps Fruit Fly's Visual Pathway

University of California - Santa Barbara

UC Santa Barbara neuroscientists have reconstructed the entire anterior visual pathway of a fruit fly, a complex series of connections between the insect's eyes and the navigation center of its brain. With the help of artificial intelligence and manual proofreading, systems biologist Sung Soo Kim's research group and collaborators worked out the relationships between more than 3,000 neurons with unprecedented detail.

These insights into the fruit fly's anterior visual pathway contribute to a suite of nine papers, published in the journal Nature, reporting the neuronal wiring of the entire fruit fly brain. Led by Princeton neuroscientists Mala Murthy and Sebastian Seung, this landmark achievement, an account of the largest and most complex brain to be mapped so thoroughly to date, brings us closer to understanding the intricacies of animal brains and is a stepping stone toward ultimately understanding how the human brain is wired.

An electron-microscopy image of the fruit fly's anterior visual pathway: all of the neurons involved in processing visual information and conveying it to the navigation center in the fly brain. Compass neurons are in the circular area at the center.

"In systems neuroscience, the question is how neurons interact and generate perception, cognition, motor commands and so on," said Kim, a co-author of two studies (one as a co-corresponding author) appearing in the journal Nature. "But the major problem here is that we don't know how the neurons are connected to each other. So it's difficult to understand what's really going on in the neural network."

Indeed, depending on context, a single stimulus can produce a wide array of responses as the information moves from the initial, sensory stage to the deeper, cognitive and motor stages of the brain. For instance, if you feel something pressing into your skin, your peripheral neurons will be the first to pick up the pressure, Kim explained. But as that touch information rapidly makes its way through the brain, it is modified by myriad other factors, including mood, ongoing activity and the source of the pressure. As a result, your reaction to that touch can vary wildly.

"There are so many different connections and feedback connections that the brain is processing, so that this single touch could have totally different representations in the brain," Kim said.

Such is the case with navigation, a fundamental, goal-oriented behavior that most animals engage in. Using a constant stream of sensory cues and feedback information, we make representations of our environments and decisions about how to get to where we want to go.

In fruit flies, approximately 50 "compass neurons" — neurons that tile together to form a ring within the donut-shaped "ellipsoid body" deep in their brains — are responsible for encoding a fly's sense of direction. This relatively simple structure makes their brains a good candidate for working out the neural circuitry between what they see with their eyes, and how that information travels to the deeper areas of their brains.
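To make the idea concrete, here is a minimal Python sketch of a heading represented as a "bump" of activity on a ring of roughly 50 neurons. It is purely illustrative and not the model from the papers: the Gaussian bump shape, its width and the population-vector read-out are all assumptions made for the example.

```python
import numpy as np

N_COMPASS = 50  # roughly the number of compass neurons in the fly's ellipsoid body

def compass_bump(heading_rad, width_rad=np.pi / 4):
    """Toy activity profile: a Gaussian bump of firing centered on the
    fly's current heading, wrapped around the ring of compass neurons.
    Real compass dynamics form a ring attractor; this is just a static
    caricature of the read-out."""
    prefs = np.linspace(0, 2 * np.pi, N_COMPASS, endpoint=False)  # preferred headings
    d = np.angle(np.exp(1j * (prefs - heading_rad)))              # wrapped angular distance
    return np.exp(-0.5 * (d / width_rad) ** 2)

def decode_heading(activity):
    """Population-vector decode: recover the heading from the bump."""
    prefs = np.linspace(0, 2 * np.pi, N_COMPASS, endpoint=False)
    return np.angle(np.sum(activity * np.exp(1j * prefs)))

if __name__ == "__main__":
    activity = compass_bump(np.deg2rad(120))
    print(np.rad2deg(decode_heading(activity)))  # approximately 120 degrees
```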

"It's a lot easier to look at these pathways in the fly's brain," said co-lead author Dustin Garner, of the Kim Lab. A few years ago, scientists in the Janelia Research Campus at Howard Hughes Medical Institute took 7,050 sections of a single fly's brain, took 21 million electron microscope images, and compiled them into a publicly available database. Groups at Princeton University took this data and trained an AI to recognize sections of individual neurons across these images, which then led to a 3D reconstruction of the entire neural network of that fly's brain. But it was not perfect and still needed human eyes to confirm. Garner's job was to proofread the AI's output with regard to the fly's anterior visual pathway.

"It was great to be able to see the individual neuron-by-neuron specifics," he said. "And we actually found multiple parallel pathways that had similar types of neurons, but were slightly different in both form and function." Garner's analysis included classifications of these different types of neurons, and predicted their functions from the connectivity.

Meanwhile, Kim Lab colleague and co-lead author Jennifer Lai confirmed some of these predictions experimentally, using the lab's virtual reality arena for flies, a highly controlled environment that projects light in the fly-visible spectrum (UV to amber), to apply stimuli to a tethered fruit fly and observe its brain activity. In particular, the researchers watched which neurons fired depending on what was presented to the fly's visual system, whether multiple small dots or vertically oriented objects.

"We had two major predictions," she said. "One was the shape of the visual area that each neuron responds to. Some of them respond to vertically elongated visual areas, like columns in a Greek temple, whereas others respond to smaller and more circular visual areas, which we presented in this paper." The other, she said, is the color sensitivity of the "ring neurons", which are the last relay in the anterior visual pathway before visual information is integrated by the compass neurons to generate a directional sense. That, she said, is still a work in progress.

This detailed connectivity data can be used to create computational models that may shed light on how animals navigate, and could even inform autonomous vehicle navigation that does not rely on GPS.
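A tiny example of GPS-free navigation in this spirit is path integration: keep a running position estimate by summing heading and speed over time. The sketch below assumes perfectly known heading and speed, which neither nervous systems nor robots actually have, so it is only a starting point for the kind of model that detailed connectivity data could constrain.

```python
import numpy as np

def path_integrate(headings_rad, speeds, dt=0.1):
    """Dead reckoning: integrate self-motion (heading and speed) over time
    to track position without any external positioning signal such as GPS."""
    pos = np.zeros(2)
    trajectory = [pos.copy()]
    for theta, v in zip(headings_rad, speeds):
        pos += v * dt * np.array([np.cos(theta), np.sin(theta)])
        trajectory.append(pos.copy())
    return np.array(trajectory)

# Example: head northeast for a while, then turn and head west.
headings = np.concatenate([np.full(50, np.pi / 4), np.full(50, np.pi)])
speeds = np.full(100, 1.0)
print(path_integrate(headings, speeds)[-1])  # final position relative to the start
```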
