Hello! I'm Jacob R. Gardner

jacobrg@seas.upenn.edu

I am an assistant professor at the University of Pennsylvania in the Computer and Information Science department. Our lab does research spanning both the practice and theory of probabilistic machine learning. I'm particularly interested in how we can use techniques like generative modeling and Bayesian optimization to solve challenging design and optimization problems in the natural sciences, such as discovering new and more efficient antibiotics, vaccines, antibodies, and materials.

Before joining Penn, I was a research scientist at Uber AI Labs, and before that a postdoctoral associate in Operations Research and Information Engineering at Cornell University. I received my Ph.D. in Computer Science from Cornell University, where I was advised by Kilian Weinberger.

My Group

Haydn Jones
(PhD Student)

Yimeng Zeng
(PhD Student)

Selected Recent Publications

See my Google Scholar page for a complete list.

Local Latent Space Bayesian Optimization over Structured Inputs [Paper]
Natalie Maus, Haydn T. Jones, Juston S. Moore, Matt J. Kusner, John Bradshaw, Jacob R. Gardner
Neural Information Processing Systems (NeurIPS 2022).

Discovering Many Diverse Solutions with Bayesian Optimization [Paper]
Natalie Maus, Kaiwen Wu, David Eriksson, Jacob R. Gardner
International Conference on Artificial Intelligence and Statistics (AISTATS 2023). Notable paper.

The Behavior and Convergence of Local Bayesian Optimization [Paper]
Kaiwen Wu, Kyurae Kim, Roman Garnett, Jacob R. Gardner
Neural Information Processing Systems (NeurIPS 2023). Spotlight.

On the Convergence of Black-Box Variational Inference [Paper]
Kyurae Kim, Jisu Oh, Kaiwen Wu, Yian Ma, Jacob R. Gardner
Neural Information Processing Systems (NeurIPS 2023).

Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference [Paper]
Kyurae Kim, Kaiwen Wu, Jisu Oh, Jacob R. Gardner
International Conference on Machine Learning (ICML 2023). Oral.

Local Bayesian Optimization via Maximizing Probability of Descent [Paper]
Quan Nguyen, Kaiwen Wu, Jacob R. Gardner, Roman Garnett
Neural Information Processing Systems (NeurIPS 2022). Oral.

Preconditioning for Scalable Gaussian Process Hyperparameter Optimization [Paper]
Jonathan Wenger, Geoff Pleiss, Philipp Hennig, John P. Cunningham, Jacob R. Gardner
International Conference on Machine Learning (ICML 2022). Long talk.

Simple Blackbox Adversarial Attacks [Paper]
Chuan Guo, Jacob R. Gardner, Yurong You, Andrew G. Wilson, Kilian Q. Weinberger
International Conference on Machine Learning (ICML 2019).

Scalable Global Optimization via Local Bayesian Optimization [Paper]
David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek
Neural Information Processing Systems (NeurIPS 2019). Spotlight.

GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration [Paper]
Jacob R. Gardner, Geoff Pleiss, Kilian Q. Weinberger, David Bindel, Andrew G. Wilson
Neural Information Processing Systems (NeurIPS 2018). Spotlight.

Bayesian Optimization with Inequality Constraints [Paper]
Jacob R. Gardner, Matt J. Kusner, Zhixiang Xu, Kilian Q. Weinberger, John P. Cunningham
International Conference on Machine Learning (ICML 2014).

Software

Geoff and I founded the GPyTorch project, which aims to implement Gaussian processes in a modular package with strong GPU acceleration. It is deeply embedded in the PyTorch ecosystem and makes complicated models like deep kernel learning both easy to implement and highly efficient. It departs significantly from existing Gaussian process libraries in that it performs the fundamental operations required for inference using modern numerical linear algebra techniques like linear conjugate gradients, rather than standard Cholesky-based approaches.
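As a small illustration, here is a minimal sketch of exact GP regression with GPyTorch (the toy data, model class, and training loop below are illustrative only, not library code):

import math
import torch
import gpytorch

# A standard exact GP regression model: constant mean, scaled RBF kernel.
class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

# Toy data: noisy samples from a sine function.
train_x = torch.linspace(0, 1, 100)
train_y = torch.sin(2 * math.pi * train_x) + 0.1 * torch.randn(train_x.size(0))

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)

# Fit hyperparameters by maximizing the exact marginal log likelihood.
# For large problems, GPyTorch evaluates this with iterative methods
# (conjugate gradients) rather than a dense Cholesky factorization.
model.train(); likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(50):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    optimizer.step()

# Posterior prediction at new inputs.
model.eval(); likelihood.eval()
test_x = torch.linspace(0, 1, 51)
with torch.no_grad(), gpytorch.settings.fast_pred_var():
    pred = likelihood(model(test_x))
    lower, upper = pred.confidence_region()

Because the GP hyperparameters are ordinary PyTorch parameters trained with torch.optim, models like this compose naturally with neural network feature extractors, which is what makes deep kernel learning straightforward to express.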