Robust Kernel-Based Regression Using Orthogonal Matching Pursuit
George Papageorgiou, Pantelis Bouboulis, Sergios Theodoridis

Kernel methods are widely used to approximate non-linear functions in classical regression problems, where standard techniques such as Least Squares denoise the data samples under white Gaussian noise. However, the approximation degrades severely when impulse noise, i.e., outliers, contaminates the data. We present a robust kernel-based method that exploits greedy selection techniques, in particular Orthogonal Matching Pursuit (OMP), in order to recover the sparse support of the outlier vector; at the same time, it approximates the non-linear function via a mapping to a Reproducing Kernel Hilbert Space (RKHS).
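As a rough illustration of the idea (a simplified sketch, not the authors' exact algorithm), the following Python/NumPy code combines kernel ridge regression with an OMP-style greedy step: at each iteration the sample with the largest unexplained residual is added to the estimated outlier support, and the RKHS fit is recomputed while discounting the supported samples. The Gaussian kernel width `sigma`, the regularizer `lam`, and the assumption that the number of outliers is known in advance are all choices of this sketch.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Gaussian (RBF) kernel matrix for 1-D inputs.
    return np.exp(-((X[:, None] - Y[None, :]) ** 2) / (2.0 * sigma ** 2))

def robust_krr_omp(x, y, n_outliers, lam=1e-3, sigma=0.2):
    """Greedy (OMP-flavoured) outlier-support recovery wrapped around
    kernel ridge regression -- a sketch of the idea, not the paper's
    exact procedure."""
    n = len(y)
    K = gaussian_kernel(x, x, sigma)
    S = []                                  # estimated outlier support
    for _ in range(n_outliers + 1):
        # Joint least-squares fit for the current support S: rows
        # indexed by S are masked out, which is equivalent to letting
        # free outlier amplitudes u_S absorb those residuals.
        mask = np.ones(n)
        mask[S] = 0.0
        A = mask[:, None] * K + lam * np.eye(n)
        alpha = np.linalg.solve(A, mask * y)
        r = y - K @ alpha                   # residual of the RKHS fit
        if len(S) >= n_outliers:
            break
        # OMP-style greedy step: the largest residual outside the
        # current support marks the next outlier candidate.
        r_free = np.abs(r)
        r_free[S] = 0.0
        S.append(int(np.argmax(r_free)))
    u = np.zeros(n)
    u[S] = r[S]                             # recovered outlier amplitudes
    return alpha, u, sorted(S)
```

For example, fitting a noiseless sine wave in which two samples have been corrupted by large impulses, the greedy steps should flag exactly those two indices, while the kernel expansion `K @ alpha` recovers the underlying smooth function from the remaining samples.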