David Duvenaud’s research focuses on constructing deep probabilistic models to predict, explain, and design things. He has developed continuous-depth neural networks that can adapt the amount of computation they perform to the difficulty of the task. He also builds models that propose new molecules with specified properties, and is working on automating parts of the data-collection pipeline for behaviour experiments.
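As a rough illustration of the continuous-depth idea behind the Neural ODE paper listed below, here is a minimal sketch assuming PyTorch and the torchdiffeq package released alongside Chen et al. (2018); the layer sizes, tolerances, and names are illustrative assumptions, not details from the paper.

```python
# A minimal continuous-depth ("neural ODE") block sketch.
# Assumes PyTorch and the torchdiffeq package; sizes and tolerances are illustrative.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # adaptive-step ODE solvers with autograd support


class ODEFunc(nn.Module):
    """Dynamics dz/dt = f(z, t) that stand in for a stack of discrete layers."""

    def __init__(self, dim: int = 8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, dim))

    def forward(self, t, z):
        return self.net(z)


func = ODEFunc()
z0 = torch.randn(4, 8)          # batch of initial hidden states
t = torch.tensor([0.0, 1.0])    # integrate the hidden state from t=0 to t=1
# The default adaptive solver decides how many function evaluations to spend,
# which is how the model adapts its amount of computation per input.
z1 = odeint(func, z0, t, rtol=1e-3, atol=1e-3)[-1]
print(z1.shape)                 # torch.Size([4, 8])
```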
- Best Paper Award, Neural Information Processing Systems Conference, 2018.
- Li, X., Chen, R. T. Q., Wong, T.-K. L., & Duvenaud, D. (2020). Scalable gradients for stochastic differential equations. In Artificial Intelligence and Statistics.
- Chen, R. T. Q., Rubanova, Y., Bettencourt, J., & Duvenaud, D. (2018). Neural ordinary differential equations. In Advances in Neural Information Processing Systems (pp. 6571-6583).
- Chang, C.-H., Creager, E., Goldenberg, A., & Duvenaud, D. (2019). Explaining image classifiers by adaptive dropout and generative in-filling. In International Conference on Learning Representations.
- Grathwohl, W., Choi, D., Wu, Y., Roeder, G., & Duvenaud, D. (2017). Backpropagation through the void: Optimizing control variates for black-box gradient estimation. arXiv preprint arXiv:1711.00123.
- Gómez-Bombarelli, R., Wei, J. N., Duvenaud, D., Hernández-Lobato, J. M., Sánchez-Lengeling, B., Sheberla, D., ... & Aspuru-Guzik, A. (2018). Automatic chemical design using a data-driven continuous representation of molecules. ACS Central Science, 4(2), 268-276.
CIFAR is a registered charitable organization supported by the governments of Canada, Alberta, Ontario, and Quebec as well as foundations, individuals, corporations, and international partner organizations.