Animesh Garg

Appointment

  • Canada CIFAR AI Chair
  • Pan-Canadian AI Strategy

Connect

Personal Page

Google Scholar

About

Animesh Garg is a Canada CIFAR AI Chair at the Vector Institute and an assistant professor in the Department of Computer Science at the University of Toronto, where he leads the Toronto People, AI, and Robotics (PAIR) research group. Garg is also a research scientist at NVIDIA Research, working on machine learning for robotics.

Garg’s research focuses on machine learning algorithms for perception and control in robotics. He aims to enable generalizable autonomy through efficient robot learning for long-horizon sequential decision making. His principal technical focus is on representations and algorithms that make learning for interaction in autonomous agents simple and general. He also actively works on applications of robot manipulation in industrial and healthcare robotics.

Awards

  • Best Paper, IEEE International Conference on Robotics and Automation (ICRA), 2019
  • Best Cognitive Paper Finalist, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019
  • Best Paper Award, Robot Learning Workshop at NeurIPS, 2019
  • Best Medical Robotics Finalist, IEEE International Conference on Robotics and Automation (ICRA), 2015

Relevant Publications

  • Fang, K., Zhu, Y., Garg, A., Kurenkov, A., Mehta, V., Fei-Fei, L., & Savarese, S. (2020). Learning task-oriented grasping for tool manipulation from simulated self-supervision. The International Journal of Robotics Research, 39(2-3), 202-216.

  • Lee, M. A., Zhu, Y., Srinivasan, K., Shah, P., Savarese, S., Fei-Fei, L., … & Bohg, J. (2019). Making sense of vision and touch: Self-supervised learning of multimodal representations for contact-rich tasks. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 8943-8950). IEEE.

  • Huang, D. A., Nair, S., Xu, D., Zhu, Y., Garg, A., Fei-Fei, L., … & Niebles, J. C. (2019). Neural task graphs: Generalizing to unseen tasks from a single video demonstration. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 8565-8574).

  • Murali, A., Sen, S., Kehoe, B., Garg, A., McFarland, S., Patil, S., … & Goldberg, K. (2015). Learning by observation for surgical subtasks: Multilateral cutting of 3D viscoelastic and 2D orthotropic tissue phantoms. In 2015 IEEE International Conference on Robotics and Automation (ICRA) (pp. 1202-1209). IEEE.

Institution

  • NVIDIA Research
  • University of Toronto
  • Vector Institute

Department

Computer Science

Education

  • PhD (Operations Research), University of California, Berkeley
  • MSc (Computer Science), University of California, Berkeley

Country

  • Canada

© Copyright 2022 CIFAR. All Rights Reserved.