
# 📈 LipschitzLearningRates

The code in this repository reproduces the experimental results on convergence rates for k-nearest neighbor graph infinity Laplacians from our paper:

> Bungert, Calder, and Roith. *Uniform Convergence Rates for Lipschitz Learning on Graphs.* IMA Journal of Numerical Analysis, 2022.

Feel free to use the code, and please cite our paper when doing so:

```bibtex
@article{bungert2022uniform,
    author = {Bungert, Leon and Calder, Jeff and Roith, Tim},
    title = {Uniform convergence rates for Lipschitz learning on graphs},
    journal = {IMA Journal of Numerical Analysis},
    year = {2022},
    month = {09},
    doi = {10.1093/imanum/drac048}
}
```

## 🔧 Usage

The Python package GraphLearning is required to run the experiments. Install it with

```bash
pip install graphlearning
```

The script `aronsson_experiment.py` runs the convergence rate experiments. To see its usage, run

```bash
python aronsson_experiment.py -h
```

The bash script `all_experiments.sh` contains the commands for running all experiments from our paper. The results of the experiments are saved as `.csv` files in the `results` folder. To generate the plots and figures from the paper, run

```bash
python generate_plots.py
```

All figures are saved in the `figures` folder.
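
Putting the steps together, an end-to-end reproduction (using only the commands above, assuming `all_experiments.sh` is run from the repository root) looks like:

```bash
pip install graphlearning    # install the required dependency
bash all_experiments.sh      # run all paper experiments; results land in results/*.csv
python generate_plots.py     # turn the .csv results into the figures in figures/
```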

## 💡 Mathematical Models

Our code verifies our convergence proofs: solutions of the graph infinity Laplacian equation

$$
\begin{cases}
\Delta^n_\infty u = 0 & \text{in } \Omega_n \setminus \mathcal{O}_n, \\
u = g & \text{on } \mathcal{O}_n,
\end{cases}
$$

on a point cloud $\Omega_n$ with constraint set $\mathcal{O}_n \subset \Omega_n$ converge uniformly to an Absolutely Minimizing Lipschitz Extension on the continuum domain $\Omega$ with constraint set $\mathcal{O} \subset \Omega$, i.e., a solution of

$$
\begin{cases}
\operatorname{Lip}_V(u) = \operatorname{Lip}_{\partial V}(u) & \text{for all open } V \subset \Omega \setminus \mathcal{O}, \\
u = g & \text{on } \mathcal{O},
\end{cases}
$$

where $\operatorname{Lip}$ denotes the geodesic Lipschitz constant.
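
For intuition, here is a minimal, self-contained sketch of the discrete problem (not the repository's implementation): it builds a symmetrized k-nearest neighbor graph on a random point cloud and solves the unweighted graph infinity Laplacian equation with the classical fixed-point iteration $u(x) \leftarrow \tfrac{1}{2}\bigl(\min_{y \sim x} u(y) + \max_{y \sim x} u(y)\bigr)$. The point cloud, constraint set, and tolerance are all illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
n, k = 500, 10

# point cloud Omega_n in the unit square (illustrative)
X = rng.random((n, 2))

# constraint set O_n: two illustrative points carrying the values g
is_constrained = np.zeros(n, dtype=bool)
is_constrained[[0, 1]] = True
g = np.array([0.0, 1.0])

# symmetrized k-nearest neighbor graph
_, nn = cKDTree(X).query(X, k=k + 1)      # column 0 is the point itself
nbrs = [set(nn[i, 1:]) for i in range(n)]
for i in range(n):
    for j in nn[i, 1:]:
        nbrs[j].add(i)                    # make the adjacency symmetric
nbrs = [np.fromiter(s, dtype=int) for s in nbrs]

# Jacobi iteration: u is infinity harmonic on the graph iff
# u(x) = (min over neighbors + max over neighbors) / 2 at every free node
u = np.zeros(n)
u[is_constrained] = g
free = np.flatnonzero(~is_constrained)
for sweep in range(20000):
    u_new = u.copy()
    for i in free:
        vals = u[nbrs[i]]
        u_new[i] = 0.5 * (vals.min() + vals.max())
    done = np.max(np.abs(u_new - u)) < 1e-6
    u = u_new
    if done:
        break

# by the maximum principle the solution stays within the constraint values
print(f"stopped after {sweep + 1} sweeps, u in [{u.min():.3f}, {u.max():.3f}]")
```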

The relative scaling of the graph bandwidth $h$ to the resolution $\delta$ of the graph, defined as the Hausdorff distance between $\Omega_n$ and $\Omega$, can be set with the `-b` option in `all_experiments.sh`.
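
As a rough illustration of these quantities, the sketch below estimates $\delta$ for a random cloud in $\Omega = [0,1]^2$ by sampling $\Omega$ on a fine grid (since $\Omega_n \subset \Omega$, the Hausdorff distance reduces to $\sup_{x \in \Omega} \operatorname{dist}(x, \Omega_n)$). The power law `h = delta**b` at the end is only a hypothetical reading of what `-b` controls, not necessarily the scaling used by the scripts.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
X = rng.random((500, 2))               # point cloud Omega_n inside Omega = [0,1]^2

# approximate delta = sup_{x in Omega} dist(x, Omega_n) on a fine grid over Omega
s = np.linspace(0.0, 1.0, 200)
grid = np.stack(np.meshgrid(s, s), axis=-1).reshape(-1, 2)
delta = cKDTree(X).query(grid)[0].max()

b = 0.5                                # hypothetical scaling exponent
h = delta**b                           # hypothetical bandwidth scaling h = delta^b
print(f"delta ~ {delta:.4f}, h = delta**{b} ~ {h:.4f}")
```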