Golnoosh Farnadi

Appointment

Canada CIFAR AI Chair

Pan-Canadian AI Strategy

About

Golnoosh Farnadi is a Canada CIFAR AI Chair at Mila, an assistant professor in the Department of Decision Sciences at HEC Montréal, and an adjunct professor at Université de Montréal.

The increasing use of algorithmic decision making in domains that affect people’s lives, such as employment, education, policing and loan approval, has raised concerns about the biases and discrimination such systems may introduce. These concerns have motivated the development of fairness-aware mechanisms in the machine learning (ML) and operations research (OR) communities, largely independently of each other. Fairness-aware ML typically focuses on ensuring that the predictions made by a learned model are fair, yet in practice fairness must be guaranteed for the decisions made using those predictions. Existing methods in fairness-aware optimization address the decisions directly, but they are often deterministic and fall short in exploiting the knowledge available in data.

Farnadi’s research combines the complementary strengths of fairness methods in ML and OR to address these shortcomings and build fair, data-driven decision-making systems.
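
To make the prediction-versus-decision distinction concrete, the following is a minimal, hypothetical Python sketch, not drawn from Farnadi’s own methods: even when a model’s predicted scores are taken as given, an unconstrained top-k decision can end up selecting from a single group, whereas a simple demographic-parity quota constrains the decision itself. The applicant data and the helper functions select_top_k and select_top_k_with_parity are illustrative assumptions.

    # Toy illustration of fair predictions vs. fair decisions.
    # All applicants, groups and scores below are hypothetical.

    def select_top_k(candidates, k):
        # Unconstrained decision: take the k highest predicted scores.
        return sorted(candidates, key=lambda c: c["score"], reverse=True)[:k]

    def select_top_k_with_parity(candidates, k):
        # Fairness-aware decision: allocate selections per group in proportion
        # to group size, so selection *rates* are roughly equal across groups.
        by_group = {}
        for c in candidates:
            by_group.setdefault(c["group"], []).append(c)
        selected = []
        for members in by_group.values():
            members.sort(key=lambda c: c["score"], reverse=True)
            quota = round(k * len(members) / len(candidates))
            selected.extend(members[:quota])
        return selected

    if __name__ == "__main__":
        # Hypothetical loan applicants with model-predicted repayment scores.
        applicants = [
            {"id": 1, "group": "A", "score": 0.92},
            {"id": 2, "group": "A", "score": 0.88},
            {"id": 3, "group": "A", "score": 0.85},
            {"id": 4, "group": "B", "score": 0.80},
            {"id": 5, "group": "B", "score": 0.76},
            {"id": 6, "group": "A", "score": 0.74},
        ]
        # Even unbiased scores can yield a one-sided decision under plain top-k.
        print("Unconstrained:", [a["id"] for a in select_top_k(applicants, 3)])            # all group A
        print("Parity-aware: ", [a["id"] for a in select_top_k_with_parity(applicants, 3)])  # 2 from A, 1 from B

The proportional quota here is a deliberately naive stand-in for the constraints that fairness-aware optimization methods would encode directly in the decision model rather than apply as a post-hoc filter.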

Awards

  • Postdoctoral fellowship, IVADO, 2018–2021
  • Best paper award, Beyond Online Data workshop at ICWSM, 2018
  • Best student paper award, ILP, 2015

Relevant Publications

  • Sivaraman, A., Farnadi, G., Millstein, T., & Van den Broeck, G. (2020). Counterexample-Guided Learning of Monotonic Neural Networks. Advances in Neural Information Processing Systems, 33.

  • Farnadi, G., Babaki, B., & Gendreau, M. (2020). A Unifying Framework for Fairness-Aware Influence Maximization. In Companion Proceedings of the Web Conference 2020 (pp. 714-722).

  • Choi, Y., Farnadi, G., Babaki, B., & Van den Broeck, G. (2020). Learning fair naive bayes classifiers by discovering and eliminating discrimination patterns. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 34, No. 06, pp. 10077-10084).

  • Farnadi, G., Babaki, B., & Getoor, L. (2018). Fairness in relational domains. In Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society (pp. 108-114).

Institution

HEC Montréal

Mila

Université de Montréal

Department

Decision Sciences

Education

  • PhD (Computer Science), joint degree, KU Leuven and Ghent University, Belgium
  • MSc (Computer Science), Delft University of Technology (TU Delft), The Netherlands
  • Bachelor (Computer Science), Shahid Beheshti University, Iran

Country

Canada
