How fruit flies see love: An AI model of the male’s visual neurons during courtship

Written by
Payel Chatterjee for the Princeton Neuroscience Institute
May 23, 2024

For fruit flies, love is not blind. During courtship, a male fruit fly relies on vision to pursue a female: if she is far away, he speeds up; if she is to his left, he turns left; if she is close, he serenades her with a complex acoustic signal generated by vibrating his wings. But how does the male guide this remarkable song and dance with a visual system that has orders of magnitude fewer neurons than that of a mouse or a human? How does such complex behavior arise from relatively simple circuitry?

To get at these questions, neuroscientists in the Murthy Lab at the Princeton Neuroscience Institute used advanced genetic tools to silence individual neuron types at the interface between the early visual system and central processing in the fly brain. If a neuron type kept track of, for example, the female’s apparent size, silencing it would impair the male’s estimate of his distance to the female and thereby compromise his courtship behavior. The team measured hundreds of courtship sessions from males in which each of roughly 23 different neuron types was silenced, one type at a time; in other words, they set these flies up on many potentially blind dates. Some males showed deficits in visually guided courtship, while others showed improvements; the phenotypes were highly varied. “The data was too complex to describe with our existing quantitative tools — instead we needed a computational model that could soak up all of this data and identify key differences in behavior across the different silenced visual neuron types,” said Mala Murthy, Professor and Director of PNI. The team paired up with members of the Pillow Lab, a computational neuroscience group, and their collaborative research appears in the current issue of the journal Nature.

“I was immediately intrigued by the biological problem — how the fruit fly brain transforms vision into decision. We needed to find the right sequence of step-by-step computations. However, as I kept working on the model, I realized the computational problem itself—how to silence a model’s internal variables in the same way as how the fruit fly’s visual neuron types were silenced genetically—was just as intriguing. It led us to develop a general-purpose algorithm to identify the function of components in a system without actually observing that component in action, only its effect on the output after removing it,” said Benjamin Cowley, lead author of the work, who was a CV Starr Postdoctoral Fellow at PNI and is now an Assistant Professor at Cold Spring Harbor Laboratory.

Like the fly’s visual system, the deep neural network model the team trained on the silenced-fly behavioral data included a ‘bottleneck’ layer of model visual neurons. To build a one-to-one map between real neurons and units in the artificial network, the researchers performed “knockout training,” a new paradigm in which model units are silenced to mirror the genetic inactivation of specific visual projection neurons, and the model is trained so that its behavioral output matches the behavior of the corresponding silenced flies. They showed that the knockout-trained network accurately predicted both the behavior of neuron-silenced males and the activity of the neurons themselves. The latter was surprising, as the model was never trained on neural activity, only on behavioral data.
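For readers curious about what knockout training might look like in code, here is a minimal, illustrative sketch in PyTorch. It is not the authors’ implementation: the architecture, layer sizes, the names VisualToBehaviorNet and toy_batches, and the synthetic data are all assumptions made for the example. What it demonstrates is the key idea of a bottleneck layer whose units are zeroed out, batch by batch, to mimic the genetic silencing of individual visual projection neuron types, while the network is trained only on the resulting behavior.

```python
# Illustrative sketch of "knockout training" (not the authors' code).
# A bottleneck layer stands in for the ~23 visual projection neuron types;
# zeroing one of its units mimics genetically silencing that neuron type.

import torch
import torch.nn as nn


class VisualToBehaviorNet(nn.Module):
    """Hypothetical network mapping visual features of the female to male behavior."""

    def __init__(self, n_stimulus_features=10, n_neuron_types=23, n_behaviors=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_stimulus_features, 64), nn.ReLU(),
            nn.Linear(64, n_neuron_types), nn.ReLU(),  # 'bottleneck' layer
        )
        self.decoder = nn.Sequential(
            nn.Linear(n_neuron_types, 64), nn.ReLU(),
            nn.Linear(64, n_behaviors),  # e.g., forward speed, turning, song
        )

    def forward(self, stimulus, knockout_mask):
        # knockout_mask has one entry per bottleneck unit; a 0 silences that
        # unit, analogous to genetic inactivation of one neuron type.
        bottleneck = self.encoder(stimulus) * knockout_mask
        return self.decoder(bottleneck)


def toy_batches(n_batches=200, batch_size=32, n_features=10, n_types=23, n_behaviors=3):
    """Synthetic stand-in for the real dataset: each sample pairs a stimulus with
    the behavior observed in one silencing experiment, encoded by a knockout mask."""
    for _ in range(n_batches):
        stimulus = torch.randn(batch_size, n_features)
        behavior = torch.randn(batch_size, n_behaviors)
        mask = torch.ones(batch_size, n_types)
        silenced_type = torch.randint(0, n_types, (batch_size,))
        mask[torch.arange(batch_size), silenced_type] = 0.0
        yield stimulus, behavior, mask


model = VisualToBehaviorNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train only on behavior: the model never sees neural activity, yet the
# silencing masks push each bottleneck unit toward the role of one neuron type.
for stimulus, behavior, mask in toy_batches():
    prediction = model(stimulus, mask)
    loss = loss_fn(prediction, behavior)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

After training a model of this kind, one could probe each bottleneck unit, for example by mapping how it responds to the female’s position and size, which is analogous in spirit to how the authors characterized their model’s units.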

The result of this work is a detailed mapping between stimulus features, visual neurons, and behavior. The authors used this mapping to investigate the functional properties of model neurons: both how they encode visual features of the female and how they coordinate to drive male courtship behaviors. Their results point to a complex combinatorial mapping, in which most of the visual projection neurons contribute to representing the female and driving behavior. Neuroscientists can now use this mapping to guide future experiments, and it will also help make sense of the recently released connectome of an entire fruit fly brain, generated by the Murthy and Seung Labs at Princeton.

Cowley, B.R., Calhoun, A.J., Rangarajan, N., et al. Mapping model units to visual neurons reveals population code for social behaviour. Nature (2024). https://doi.org/10.1038/s41586-024-07451-8

See also the accompanying News & Views article.