Megan Peters

Appointment

Fellow

CIFAR Azrieli Global Scholar 2019–2021

Brain, Mind & Consciousness

About

Megan Peters draws on insights and approaches from cognitive science, psychology, computational neuroscience, and philosophy to understand how brains process and evaluate sensory information.

She is particularly interested in adaptive behaviour and learning. Using computational modeling and neuroimaging techniques, she asks questions such as: How is noisy, ambiguous information represented in neuronal activity and neural connections? How does a brain know about or metacognitively evaluate its own noise, or “feel” confident that it has interpreted incoming information correctly? How does it learn what to expect based on past experience, and when to update those expectations? What can we learn from human and animal neural processing that will benefit the development of artificial systems? Which of the brain areas and computations underlying these abilities also give rise to our phenomenological, subjective experiences? That is, why and how is there something that it’s like to be conscious of our world and ourselves?

Awards

  • Scialog Fellow in the Molecular Basis of Cognition, Research Corporation for Science Advancement, 2022
  • Graduate Research Fellowship, National Science Foundation, 2010
  • Behavioral Neuroscience Training Fellowship, National Institutes of Health, 2009
  • Chancellor's Prize, University of California, Los Angeles, 2009

Relevant Publications

  • Peters, M.A.K. (2022). Towards characterizing the canonical computations generating phenomenal experience. Neuroscience & Biobehavioral Reviews, 142, 104903. DOI: 10.1016/j.neubiorev.2022.104903.
  • Peters, M.A.K.*, Thesen, T.*, Ko, Y.D.*, Maniscalco, B., Carlson, C., Davidson, M., Doyle, W., Kuzniecky, R., Devinsky, O., Halgren, E., & Lau, H. (2017). Perceptual confidence neglects decision-incongruent evidence in the brain. Nature Human Behaviour. DOI: 10.1038/s41562-017-0139.
  • Peters, M.A.K., & Lau, H. (2015). Human observers have optimal introspective access to perceptual processes even for visually masked stimuli. eLife. DOI: 10.7554/eLife.09651.

Institution

University of California, Irvine

Department

Cognitive Sciences

Education

  • PhD (Psychology: Computational & Cognitive Neuroscience), University of California, Los Angeles
  • MA (Psychology: Cognitive Neuroscience), University of California, Los Angeles
  • BA (Cognitive Science), Brown University

Country

United States
