AI and Society

A Culture of Ethical AI: Workshop report

By: CIFAR
August 3, 2022

What steps can organizers of AI conferences take to encourage reflection on the societal impacts of AI research?  

Against the backdrop of increasing use of artificial intelligence (AI) technologies in everyday life and growing private investment in the area, more researchers are entering the field of AI than ever before. The growing relevance of AI has also brought wider awareness of its potential harmful real-world impacts, including on the environment, marginalized communities, and society at large.

How can the AI research community better anticipate the downstream consequences of AI research? And how can AI researchers mitigate potential negative impacts of their work such as inappropriate applications, unintended and malicious use, accidents, and societal harms?

In early 2022, CIFAR, Partnership on AI, and the Ada Lovelace Institute brought together recent machine learning (ML) conference organizers and AI ethics experts to consider how conference organizers can encourage submitting authors to make a habit of reflecting on the potential downstream impacts of AI research.

“AI has amazing potential for doing a lot of good in our world. But it also carries tremendous potential for harm, if not conducted responsibly,” says Elissa Strome, Executive Director of the Pan-Canadian AI Strategy at CIFAR. “In an academic environment of ‘publish or perish’ and ‘fast science,’ the AI research community must systematize the practice of pausing to meaningfully consider the ethical implications of research prior to implementation and spread. As the central hubs of academic knowledge-sharing, conferences are a really smart place to start. Alongside our international collaborators Partnership on AI and the Ada Lovelace Institute, CIFAR is pleased to be sharing the fantastic conversations and tools developed through our workshop, which we hope conference organizers worldwide can adapt in their own activities to help spread the practice of responsible AI.”

Read the full report

 

For more information, contact:
Gagan Gill
Program Manager, AI & Society, CIFAR

 
