Sketching and learning: from clustering to interpolation
We discuss two problems that at first glance appear quite different: geometric clustering and function interpolation. In geometric clustering, k-means is an expectation–maximization-type algorithm used ubiquitously despite only being guaranteed to converge to local minimizers of the underlying nonconvex clustering objective. In learning theory, the standard approach is to fit a least-squares polynomial of fixed maximal degree to a collection of pointwise measurements. In both situations — clustering and interpolation — the method of choice can be interpreted as a spectral approximation to an underlying combinatorial optimization problem. In each case, a convex relaxation of the combinatorial optimization problem can be derived, and it is provably more powerful and robust. We discuss concrete results in these directions, as well as the tools from probability and approximation theory that are developed along the way.
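To make the two baseline methods concrete, the following is a minimal sketch (not the authors' implementations) of both: Lloyd's iteration for k-means, which alternates nearest-center assignment with centroid updates and converges only to a local minimizer of the clustering objective, and a fixed-degree least-squares polynomial fit to pointwise samples. The function name `lloyd_kmeans` and all parameter choices (data, degree, iteration count) are illustrative assumptions.

```python
import numpy as np

def lloyd_kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's algorithm (illustrative sketch): alternate
    nearest-center assignment and centroid updates. Only guaranteed
    to reach a local minimizer of the nonconvex k-means objective."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest current center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        # recompute each center as the mean of its cluster (skip empty clusters)
        centers = np.array(
            [X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
             for j in range(k)]
        )
    return labels, centers

# two well-separated planar clusters (synthetic example data)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
labels, centers = lloyd_kmeans(X, k=2)

# least-squares polynomial of fixed maximal degree fit to pointwise samples
t = np.linspace(-1.0, 1.0, 30)
y = np.cos(np.pi * t)
coeffs = np.polynomial.polynomial.polyfit(t, y, deg=6)
```

On easy instances like the one above both baselines behave well; the point of the convex relaxations discussed here is that they come with guarantees even when the spectral/local methods can fail.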