Title: Scalable Gaussian Processes with Markovian Covariances
Speaker: Xiaowei Zhang (HKUST)
Date: December 1, 2017
Location: Shriram 108
Gaussian processes (GPs) are used in a wide variety of areas, including stochastic simulation, geostatistics, and machine learning. At the core of the associated computation is the inversion of a covariance matrix, which takes O(n^3) time in general and becomes computationally prohibitive for large n, where n is the data size. In addition, the covariance matrix is often poorly conditioned, so the inversion is prone to numerical instability, resulting in inaccurate parameter estimation and prediction. These two numerical issues preclude the use of GPs at a large scale. In this talk we provide a novel perspective for addressing them. We construct a class of covariance functions with two properties: (i) the covariance matrices they induce can be inverted analytically, and (ii) the inverse matrices are sparse. The inversion-related computational time can then be reduced to O(n^2), without resorting to any approximation schemes. Further, if the observation noise is negligible, numerical inversion is unnecessary and the computational time can be reduced to O(n). The key to our approach is an explicit connection between covariance functions that induce sparse precision matrices and certain ordinary differential equations. This connection yields a wide class of covariance functions that substantially improve both the speed and the numerical stability of GPs, thereby permitting their use in large-scale problems.
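A minimal numerical sketch of the kind of sparsity the abstract describes (an assumed illustrative example, not the speaker's exact construction): the exponential covariance k(s, t) = exp(-|s - t|) corresponds to a Markov (Ornstein-Uhlenbeck) process, and its precision matrix is tridiagonal, which is what allows GP computations to scale.

```python
import numpy as np

# Observation points on a grid (hypothetical example data).
n = 200
t = np.linspace(0.0, 10.0, n)

# Covariance matrix of the exponential (Markovian) kernel.
K = np.exp(-np.abs(t[:, None] - t[None, :]))

# Its inverse, the precision matrix, should be tridiagonal.
P = np.linalg.inv(K)

# Keep only the tridiagonal band; the remainder should vanish
# up to floating-point round-off.
band = sum(np.diag(np.diag(P, k), k) for k in (-1, 0, 1))
residual = np.max(np.abs(P - band))
print(residual)  # numerically zero: the precision matrix is tridiagonal
```

Once this banded structure is known, linear solves with K can use a tridiagonal solver, which costs O(n) rather than the O(n^3) of dense inversion.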
Xiaowei Zhang joined the Department of Industrial Engineering and Logistics Management at the Hong Kong University of Science and Technology as an Assistant Professor in 2011. He received a Ph.D. in Operations Research and an M.S. in Financial Mathematics from Stanford University (2011, 2010) and a B.S. in Mathematics from Nankai University (2006).
This seminar series is supported through the generosity of Adriana Diener-Veinott and Infanger Investment Technologies.