Petros Dellaportas Seminar
AUEB STATISTICS SEMINAR SERIES OCTOBER 2020
Petros Dellaportas, Professor in Statistical Science, Department of Statistical Science, University College London and Professor of Statistics, Department of Statistics, Athens University of Economics and Business
Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction
We provide a linear-time inferential framework for Gaussian processes that supports automatic feature extraction through deep neural networks and low-rank kernel approximations. Importantly, we derive approximation guarantees bounding the Kullback-Leibler divergence between the idealized Gaussian process and the one resulting from a low-rank approximation to its kernel, under two types of approximation. These yield two instantiations of our framework: Deep Fourier Gaussian Processes, based on random Fourier feature low-rank approximations, and Deep Mercer Gaussian Processes, based on truncating the Mercer expansion of the kernel. We conduct an extensive experimental evaluation of these two instantiations on a broad collection of real-world datasets, providing strong evidence that they outperform a broad range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
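To illustrate the first type of low-rank approximation mentioned in the abstract, the sketch below shows the standard random Fourier feature construction for the RBF kernel (Rahimi and Recht's technique): the inner product of the random features approximates the exact kernel, which is what enables linear-time inference. This is a minimal, self-contained illustration of the general idea, not the speakers' implementation; the function names and the choice of lengthscale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, num_features, lengthscale=1.0):
    """Map inputs to random Fourier features whose inner product
    approximates the RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2)).
    (Illustrative sketch; names and defaults are assumptions.)"""
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density (here Gaussian)
    W = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    # Random phases, uniform on [0, 2*pi)
    b = rng.uniform(0.0, 2 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

# Compare the exact RBF kernel with its low-rank approximation
X = rng.normal(size=(5, 3))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)

Phi = rff_features(X, num_features=2000)
K_approx = Phi @ Phi.T          # rank at most num_features
max_err = np.abs(K_exact - K_approx).max()
print(max_err)                  # shrinks as num_features grows
```

Because the Gram matrix becomes `Phi @ Phi.T`, downstream Gaussian-process computations can exploit the low rank (e.g. via the Woodbury identity) instead of inverting a full n-by-n kernel matrix, which is the source of the linear-time scaling in n.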
(Presentation slides can be found here)