Paper: Faster Kernel Matrix Algebra via Density Estimation
Consider an n × n Gaussian kernel matrix corresponding to n input points in d dimensions. We show that one can compute a relative error approximation to the sum of the entries of this matrix in just O(dn^{2/3}) time, which is significantly sublinear in n^2, the number of entries in the matrix. Our algorithm combines a novel analysis of entrywise sampling with fast kernel density estimation methods based on locality sensitive hashing. We extend our results to other popular kernels, and build on this basic result to give further fast linear algebraic primitives for kernel matrices. For example, we give the first subquadratic time algorithms for approximating the top eigenvector and top eigenvalue of a Gaussian kernel matrix.
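To make the connection between the kernel sum and kernel density estimation concrete, here is a minimal Python sketch. It is not the paper's algorithm: it only illustrates that the total kernel sum can be estimated from a small number of row sums, each of which equals n times the kernel density at the sampled point. The paper's method would answer those row-sum queries approximately with an LSH-based KDE data structure (rather than the exact O(nd) loop below) and uses a sharper entrywise sampling analysis to get relative error in subquadratic time. All function names and parameters here are illustrative.

```python
# Toy sketch (not the paper's algorithm): estimate the sum of all entries of a
# Gaussian kernel matrix K, with K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2)),
# by uniformly sampling rows. Each sampled row sum equals n * KDE(x_i); the
# paper replaces this exact computation with fast LSH-based density estimates.

import numpy as np


def gaussian_row_sum(X, i, sigma):
    """Exact sum of row i of the Gaussian kernel matrix, i.e. n * KDE(x_i)."""
    d2 = np.sum((X - X[i]) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2.0 * sigma ** 2)))


def estimate_kernel_sum(X, sigma=1.0, num_samples=200, seed=None):
    """Unbiased Monte Carlo estimate of sum_{i,j} K[i, j] from sampled rows."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    rows = rng.integers(0, n, size=num_samples)  # uniform rows, with replacement
    row_sums = np.array([gaussian_row_sum(X, i, sigma) for i in rows])
    return n * row_sums.mean()  # scale the average row sum up to all n rows


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    est = estimate_kernel_sum(X, sigma=1.0, num_samples=200, seed=1)
    # Exact value for comparison (O(n^2 d); only feasible for small n).
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    exact = np.exp(-D2 / 2.0).sum()
    print(f"estimate = {est:.3e}, exact = {exact:.3e}")
```

Uniform row sampling alone does not give the paper's relative error guarantee in O(dn^{2/3}) time; it is shown only to convey why fast kernel density estimation is the natural primitive for this problem.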