Child & Brain Development

Building a fly brain in a computer

By: Jon Farrow
October 25, 2018

Researchers from two CIFAR programs collaborate to show that fruit flies can do more with their vision than previously thought possible.

Despite the simplicity of their visual system, fruit flies are able to reliably distinguish between individuals based on sight alone. This is a task that even humans who spend their whole lives studying Drosophila melanogaster struggle with. Researchers have now built a neural network that mimics the fruit fly’s visual system and can distinguish and re-identify flies. This may allow the thousands of labs worldwide that use fruit flies as a model organism to do more longitudinal work, looking at how individual flies change over time. It is also more evidence that fascinating research happens at the intersection of scientific disciplines.

In a project funded by a CIFAR Catalyst grant, researchers at the University of Guelph and the University of Toronto Mississauga combined expertise in fruit fly biology with machine learning to build a biologically based algorithm that churns through low-resolution videos of fruit flies, testing whether a system operating under the fly’s physical constraints could accomplish such a difficult task.

Fruit flies have small compound eyes that take in a limited amount of visual information, an estimated 29 × 29 units (Fig. 1A). The traditional view has been that once a fruit fly processes an image, it can only distinguish very broad features (Fig. 1B). But a recent discovery that fruit flies boost their effective resolution with subtle biological tricks (Fig. 1C) has led researchers to believe that vision could contribute significantly to the social lives of flies. This, combined with the discovery that the structure of their visual system looks a lot like a Deep Convolutional Network (DCN), led the team to ask: “Can we model a fly brain that can identify individuals?”

Fig. 1: How blurry is a fly’s vision? A) Ideal fruit fly input B) Traditional view C) Updated view
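To make the idea concrete, here is a minimal sketch (in PyTorch, and not the authors’ published architecture) of a small convolutional network whose input is restricted to a single-channel 29 × 29 frame, roughly the fly’s estimated effective resolution; the layer sizes and the number of candidate fly identities are placeholders chosen for illustration.

```python
# Illustrative sketch only: a tiny convolutional network constrained to the
# fly's estimated 29 x 29 visual input. Layer widths and the number of
# identity classes (num_flies) are assumptions, not the published model.
import torch
import torch.nn as nn

class TinyFlyNet(nn.Module):
    def __init__(self, num_flies: int = 20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),  # single-channel, low-resolution input
            nn.ReLU(),
            nn.MaxPool2d(2),                            # 29 x 29 -> 14 x 14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                            # 14 x 14 -> 7 x 7
        )
        self.classifier = nn.Linear(16 * 7 * 7, num_flies)

    def forward(self, x):                               # x: (batch, 1, 29, 29)
        x = self.features(x)
        return self.classifier(x.flatten(1))            # scores over candidate identities

frame = torch.rand(1, 1, 29, 29)   # one stand-in low-resolution video frame
logits = TinyFlyNet()(frame)       # which fly does this frame look like?
```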

Their computer program has the same theoretical input and processing ability as a fruit fly and was trained on video of a fly over two days. It was then able to reliably identify the same fly on the third day with an F1 score (a measure that combines precision and recall) of 0.75. Impressively, this is only slightly worse than scores of 0.85 and 0.83 for algorithms without the constraints of fly-brain biology. For comparison, when given the easier task of matching the ‘mugshot’ of a fly to a field of 20 others, experienced human fly biologists only managed a score of 0.08. Random chance would score 0.05.
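For readers unfamiliar with the metric, F1 is the harmonic mean of precision and recall. The toy calculation below (with invented counts, not data from the study) shows how a score like 0.75 comes about.

```python
# Toy example of an F1 score: the harmonic mean of precision and recall.
# The counts are invented for illustration and do not come from the study.
def f1_score(true_positives: int, false_positives: int, false_negatives: int) -> float:
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return 2 * precision * recall / (precision + recall)

# 75 correct identifications, 25 false alarms, 25 misses -> F1 = 0.75
print(f1_score(true_positives=75, false_positives=25, false_negatives=25))
```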

Fig. 2: The machine and the fly. A) Modern DCN machine learning algorithm B) Machine learning algorithm based on fly biology C) Connections in the fruit fly visual system

Jon Schneider, a postdoctoral fellow who splits his time between Joel Levine’s and Graham Taylor’s labs and is the first author of the paper being published in PLOS ONE this week, says this study points to “the tantalizing possibility that rather than just being able to recognize broad categories, fruit flies are able to distinguish individuals. So when one lands next to another, it’s ‘Hi Bob. Hey Alice.’”

The collaboration was born out of a cross-program CIFAR meeting that included researchers from both the Learning in Machines & Brains and Child & Brain Development programs. It was there that Joel and Graham started talking about how machine learning might be harnessed to improve fly tracking, a notoriously difficult problem for biologists.

It wasn’t all smooth sailing, though. “I had this naive idea about what unsupervised learning means, which led me and Jon Schneider to brainstorm some really cool experiments based on what Graham had done. And it was only much later we found out that it was ridiculous, because our idea of what unsupervised learning was, was wrong. But that’s a beautiful thing. Because those are the growing pains you need to be able to get those different areas to speak to one another, and once they do, [amazing things] happen,” recalls Levine.

Graham Taylor, a machine learning specialist and CIFAR Azrieli Global Scholar in the Learning in Machines & Brains program, was excited by the prospect of beating humans at a visual task. “A lot of Deep Neural Network applications try to replicate and automate human abilities like facial recognition, natural language processing, or song identification. But rarely do they go beyond human capacity. So it’s exciting to find a problem where algorithms can outperform humans.”

The experiments took place in the University of Toronto Mississauga lab of Joel Levine, a senior fellow in the CIFAR Child & Brain Development program. He has high hopes for the future of research like this. “The approach of pairing deep learning models with nervous systems is incredibly rich. It can tell us about the models, about how neurons communicate with each other, and it can tell us about the whole animal. That’s sort of mind blowing. And it’s unexplored territory.”

All of the researchers involved in the study, including Nihal Murali, an undergraduate student on exchange from India, found working across biology and computer science exhilarating. Schneider summed up what it was like working between disciplines: “Projects like this are a perfect arena for neurobiologists and machine learning researchers to work together to uncover the fundamentals of how any system – biological or otherwise – learns and processes information.”
