David Duvenaud is a Canada CIFAR AI Chair at the Vector Institute and an assistant professor in the departments of Computer Science and Statistical Sciences at the University of Toronto. He is also a founding member of the Vector Institute and a co-founder of Invenia, an energy forecasting and trading company.
Duvenaud’s research focuses on constructing deep probabilistic models to predict, explain and design things. He has developed continuous-depth neural networks that adapt the amount of computation they perform to the difficulty of the task. He also builds models that propose new molecules with specified properties, and is working to automate parts of the data-collection pipeline for behavioural experiments.
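The continuous-depth idea can be sketched in a few lines: instead of stacking a fixed number of layers, a hidden state is evolved through time by an ODE solver, and an adaptive solver spends more or fewer function evaluations depending on how hard the dynamics are. The sketch below assumes only NumPy and SciPy; the dynamics function and weights are illustrative toy choices, not the architecture from the published work.

```python
# A minimal sketch of a continuous-depth ("neural ODE") layer.
# Assumptions: a single tanh layer with random weights W stands in
# for the learned dynamics network; RK45 is one possible adaptive solver.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))  # toy weights for the dynamics

def dynamics(t, h):
    # dh/dt = f(h, t): the continuous analogue of a residual block.
    return np.tanh(W @ h)

def neural_ode_layer(h0, t1=1.0):
    # The adaptive solver chooses its own step sizes, so the effective
    # "depth" (sol.nfev, the number of function evaluations) varies
    # with the difficulty of the integration.
    sol = solve_ivp(dynamics, (0.0, t1), h0, method="RK45", rtol=1e-5)
    return sol.y[:, -1], sol.nfev

h1, nfev = neural_ode_layer(np.ones(4))
```

Here `nfev` makes the adaptive-computation point concrete: the solver, not the architecture, decides how much work the forward pass costs.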
- Best Paper Award, Conference on Neural Information Processing Systems (NeurIPS), 2018
Li, X., Chen, R. T. Q., Wong, T.-K. L., & Duvenaud, D. (2020). Scalable gradients for stochastic differential equations. In International Conference on Artificial Intelligence and Statistics.
Chang, C.-H., Creager, E., Goldenberg, A., & Duvenaud, D. (2019). Explaining image classifiers by adaptive dropout and generative in-filling. In International Conference on Learning Representations.
Chen, R. T. Q., Rubanova, Y., Bettencourt, J., & Duvenaud, D. (2018). Neural ordinary differential equations. In Advances in Neural Information Processing Systems (pp. 6571–6583).
Gómez-Bombarelli, R., Wei, J. N., Duvenaud, D., Hernández-Lobato, J. M., Sánchez-Lengeling, B., Sheberla, D., … & Aspuru-Guzik, A. (2018). Automatic chemical design using a data-driven continuous representation of molecules. ACS Central Science, 4(2), 268–276.
Grathwohl, W., Choi, D., Wu, Y., Roeder, G., & Duvenaud, D. (2017). Backpropagation through the void: Optimizing control variates for black-box gradient estimation. arXiv preprint.
CIFAR is a registered charitable organization supported by the governments of Canada, Alberta and Quebec, as well as foundations, individuals, corporations and Canadian and international partner organizations.