Scalable Transformed Additive Signal Decomposition By Non-Conjugate Gaussian Process Inference
Adam Vincent, Gatsby Unit, University College London
James Hensman, Lancaster University
Maneesh Sahani, Gatsby Unit, University College London
Many functions and signals of interest are formed by the addition of multiple underlying components, often non-linearly transformed and corrupted by noise. Examples may be found in the literature on generalized additive models, underdetermined source separation, and other mode-decomposition techniques. Recovery of the underlying component processes often depends on finding and exploiting statistical regularities within them. Gaussian Processes (GPs) have become the dominant way to model statistical expectations over functions, and recent advances make inference of the GP posterior efficient for large-scale datasets and arbitrary likelihoods. Here we extend these methods to the additive GP case, achieving scalable marginal posterior inference over each latent function in settings such as those above.
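To make the class of generative process described above concrete, the following is a minimal synthetic-data sketch: latent component functions drawn from independent GP priors are combined additively, passed through a non-linear transformation, and observed through a non-conjugate noise model. The kernel choices, the softplus link, and the Poisson likelihood here are illustrative assumptions of ours, not the paper's specific model.

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance=1.0):
    # Squared-exponential covariance between all pairs of 1-D inputs.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
jitter = 1e-8 * np.eye(len(x))  # numerical stabilizer for sampling

# Two latent components with different characteristic lengthscales,
# each drawn from a zero-mean GP prior (lengthscales are arbitrary).
f_slow = rng.multivariate_normal(np.zeros(len(x)), rbf_kernel(x, 0.5) + jitter)
f_fast = rng.multivariate_normal(np.zeros(len(x)), rbf_kernel(x, 0.05) + jitter)

# Additive combination, pushed through a non-linear (softplus) transform
# to give a positive rate, then observed under Poisson noise -- a
# non-conjugate likelihood of the kind the abstract refers to.
rate = np.log1p(np.exp(f_slow + f_fast))
y = rng.poisson(rate)
```

The inferential problem the abstract addresses is the reverse of this sketch: given only `y`, recover marginal posteriors over each latent component (`f_slow` and `f_fast`) separately.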