Sampling-based Nyström Approximation and Kernel Quadrature

Satoshi Hayakawa, Harald Oberhauser, Terry Lyons

We analyze the Nyström approximation of a positive definite kernel associated with a probability measure. We first prove an improved error bound for the conventional Nyström approximation with i.i.d. sampling and singular-value decomposition in the continuous regime; the proof techniques are borrowed from statistical learning theory. We further introduce a refined selection of subspaces in Nyström approximation with theoretical guarantees that is applicable to non-i.i.d. landmark points. Finally, we discuss their application to convex kernel quadrature and give novel theoretical guarantees as well as numerical observations.
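As background for the abstract, the conventional Nyström approximation it refers to can be sketched as follows: given i.i.d. landmark points, one forms the kernel matrix on the landmarks, truncates it via its eigendecomposition (the symmetric form of the SVD), and uses the resulting low-rank factor to approximate the full kernel matrix. This is a minimal illustrative sketch, not the paper's refined method; the Gaussian kernel, the data, and all function names here are assumptions chosen for the example.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y (assumed kernel choice)
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma**2))

def nystrom_approx(X, landmarks, rank, sigma=1.0):
    # Conventional Nystrom approximation: K ~ K_nm W_r^+ K_nm^T,
    # where W is the kernel matrix on the landmarks, truncated to `rank`
    # via its eigendecomposition (SVD of a symmetric PSD matrix).
    K_nm = gaussian_kernel(X, landmarks, sigma)
    W = gaussian_kernel(landmarks, landmarks, sigma)
    eigvals, eigvecs = np.linalg.eigh(W)
    idx = np.argsort(eigvals)[::-1][:rank]      # keep the `rank` largest eigenvalues
    vals, vecs = eigvals[idx], eigvecs[:, idx]
    W_r_pinv = (vecs / vals) @ vecs.T           # rank-r pseudoinverse of W
    return K_nm @ W_r_pinv @ K_nm.T

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                   # samples standing in for the measure
# i.i.d. landmark points drawn from the same distribution as the data
landmarks = X[rng.choice(len(X), size=30, replace=False)]
K = gaussian_kernel(X, X)
K_hat = nystrom_approx(X, landmarks, rank=10)
rel_err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

The paper's contribution concerns the error of such approximations in the continuous (population) regime, and a refined subspace selection that does not require the landmarks to be i.i.d.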