Uniform Convergence Rates for Lipschitz Learning on Graphs
Lipschitz learning is a graph-based semi-supervised learning method in which one
extends labels from a labeled data set to an unlabeled one by solving the infinity
Laplace equation on a weighted graph. In this work we prove uniform convergence
rates for solutions of the graph infinity Laplace equation as the number of
vertices grows to infinity. The continuum limits of these solutions are absolutely
minimizing Lipschitz extensions with respect to the geodesic metric of the domain
from which the graph vertices are sampled. We work under very general assumptions on
the graph weights, the set of labeled vertices, and the continuum domain. Our
main contribution is that we obtain quantitative convergence rates even for
very sparsely connected graphs, which typically appear in applications such as
semi-supervised learning. In particular, our framework allows for graph
bandwidths down to the connectivity radius. To prove this, we first show
quantitatively that graph distance functions converge to geodesic distance
functions in the continuum. Using the "comparison with distance functions"
principle, we then transfer these convergence statements to infinity
harmonic functions and absolutely minimizing Lipschitz extensions.
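
For orientation, the discrete problem can be stated explicitly; the following is a standard formulation from the Lipschitz learning literature, with illustrative notation ($w_{xy}$ for the graph weights, $O$ for the labeled set, $g$ for the labels) that need not match the paper's own. The graph infinity Laplacian of a function $u$ at a vertex $x$ is
\[
\Delta^G_\infty u(x) \;=\; \max_{y}\, w_{xy}\bigl(u(y)-u(x)\bigr) \;+\; \min_{y}\, w_{xy}\bigl(u(y)-u(x)\bigr),
\]
and Lipschitz learning extends the labels by solving $\Delta^G_\infty u(x) = 0$ for all unlabeled vertices $x \notin O$, subject to the constraint $u = g$ on $O$.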
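
The distance comparison underlying the proof strategy admits a similarly concise sketch, again with assumed notation ($\Omega$ for the continuum domain, $x \sim y$ for adjacency in the graph $G$). The graph distance between two vertices,
\[
d_G(x,y) \;=\; \min\Bigl\{\, \sum_{i=1}^{m} |x_i - x_{i-1}| \;:\; x_0 = x,\; x_m = y,\; x_{i-1} \sim x_i \text{ for all } i \Bigr\},
\]
is shown to converge, with a quantitative rate, to the geodesic distance
\[
d_\Omega(x,y) \;=\; \inf\bigl\{\, \operatorname{len}(\gamma) \;:\; \gamma \text{ a path in } \Omega \text{ from } x \text{ to } y \bigr\},
\]
and this rate is then carried over to infinity harmonic functions via comparison with distance functions.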