CIFAR Pan-Canadian AI Strategy

AI researcher pioneers new subfield of AI

By: Krista Davidson
December 9, 2019

Canada CIFAR AI Chair Aishwarya Agrawal is pioneering Visual Question Answering (VQA), which will revolutionize how machines understand the content of images.

Aishwarya Agrawal is a leader in the development of VQA, a complex and challenging task that requires machines to combine vision, language, knowledge and common-sense reasoning. VQA helps artificial intelligence (AI) systems connect visual perception with natural language communication.

VQA enables machines to answer complex questions about images in a way that’s accessible to humans, such as “What is the man in the blue shirt holding?” To answer the question, a machine must first identify the region of the image that contains a person in a blue shirt (a step known as language grounding). Second, it must understand the meaning of the word “holding”: it has to look at the person’s hands, even though “hands” aren’t mentioned in the question. Finally, it must name the object in the person’s hands.
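
To make that pipeline concrete, here is a minimal inference sketch in Python. It uses the open-source Hugging Face transformers library and a publicly released ViLT checkpoint fine-tuned for VQA (“dandelin/vilt-b32-finetuned-vqa”); it is an illustration of the task only, not Agrawal’s model, and the image filename is a placeholder.

```python
# Minimal VQA inference sketch (illustrative only; not the article's model).
# Assumes: pip install transformers torch pillow, and a local image file.
from PIL import Image
from transformers import ViltProcessor, ViltForQuestionAnswering

image = Image.open("street_scene.jpg")  # placeholder: any RGB photo
question = "What is the man in the blue shirt holding?"

processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-finetuned-vqa")
model = ViltForQuestionAnswering.from_pretrained("dandelin/vilt-b32-finetuned-vqa")

# Jointly encode the image and the question, then score the model's
# fixed vocabulary of candidate answers and keep the top one.
inputs = processor(image, question, return_tensors="pt")
logits = model(**inputs).logits
best = logits.argmax(-1).item()
print("Predicted answer:", model.config.id2label[best])
```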

Agrawal is excited about VQA because its applications could improve the quality of life of people who are visually impaired, support children’s educational development and enhance the user experience of virtual assistants such as Siri and Alexa.

“There is something to solving the question of intelligence itself and what this technology could mean for our future,” she says.

Building the world’s first and largest open-ended dataset for VQA

VQA first emerged on the AI scene in 2014 and has since gained significant interest, in large part due to the work of Agrawal and her colleagues following the publication of their paper at the International Conference on Computer Vision (ICCV) the following year.

The team, which included researchers from Virginia Tech and Microsoft Research, collected and publicly released the first and largest free-form, open-ended VQA dataset. They also launched an annual VQA challenge to push the state of the art in machine performance on the task.

Each year the competition presents a set of images and natural language questions, such as “What kind of cheese is on the pizza?” and “Does this person have 20/20 vision?”, inviting researchers and students from around the world to provide natural language answers. To date, the dataset contains about 250,000 images, 760,000 questions and 10 million answers. In the span of four years, the work has earned Agrawal and her team over 1,300 citations, more than 800 downloads of the dataset and a best poster award at the Workshop on Object Understanding for Interaction at ICCV 2015.

Canada has one of the ‘best AI environments in the world’

Agrawal will join Mila and the Université de Montréal’s Department of Computer Science and Operations Research as an assistant professor in 2020. She completed her PhD at the Georgia Institute of Technology. Agrawal says she chose to pursue her research in Canada because of its vibrant and collaborative research environment.

“Right now, climate change poses a big problem. And for me particularly, coming from a small town in India where we face many problems with primary education and healthcare, solving the AI challenge could help us address the problems of climate change, education and healthcare,” she says.

“I believe the environment around you can play an important role in shaping the kind of research you do. And to me, Montréal, and Canada in general, seem to have one of the best AI environments in the world. There are brilliant AI researchers here who are pursuing important research directions, and there is a lot of support from institutions and government for long-term research. What is less common is that Canada has a very healthy ecosystem in which industry and universities collaborate.”

Eliminating bias

Agrawal will use her time as a Canada CIFAR AI Chair to further develop VQA. “Training models on large datasets can potentially result in biased AI systems. For example, if a system recognizes that most of the people in its training images are holding briefcases, it may assume that every question asking what a person is holding should be answered with ‘briefcases’,” she says.

“It’s very challenging to train models to overcome dataset biases and to answer purely based on the evidence presented by the image,” says Agrawal. It is a challenge she is determined to address during her term as a Canada CIFAR AI Chair.
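
As a toy illustration of the language-prior bias Agrawal describes, the short sketch below (with hypothetical data and names, not drawn from her work) shows a “blind” baseline that never looks at the image and simply returns the most frequent training answer for each question. Such a baseline can look deceptively accurate on a biased dataset, which is exactly why answering purely from image evidence is hard to enforce.

```python
# Toy sketch of the language-prior problem (hypothetical data, illustrative only):
# a "blind" baseline that ignores the image and answers from dataset statistics.
from collections import Counter, defaultdict

train = [  # (question, answer) pairs; invented for illustration
    ("what is the man holding", "briefcase"),
    ("what is the man holding", "briefcase"),
    ("what is the man holding", "phone"),
    ("what color is the grass", "green"),
]

# Learn the most common answer per question without ever seeing an image.
counts = defaultdict(Counter)
for question, answer in train:
    counts[question][answer] += 1
prior = {q: c.most_common(1)[0][0] for q, c in counts.items()}

def blind_answer(question, image=None):
    """Answer purely from training-set statistics, ignoring the image."""
    return prior.get(question, "yes")  # a common fallback answer in VQA data

# Predicts "briefcase" no matter what the image actually shows.
print(blind_answer("what is the man holding"))
```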


The Canada CIFAR AI Chairs Program is the cornerstone program of the CIFAR Pan-Canadian AI Strategy. A total of $86.5 million over five years has been earmarked for this program to attract and retain world-leading AI researchers in Canada. The Canada CIFAR AI Chairs announced to date are conducting research in a range of fields, including machine learning for health, autonomous vehicles, artificial neural networks and climate change.
