TorchDR is an open-source dimensionality reduction (DR) library built on PyTorch. Its goal is to accelerate the development of new DR methods by providing a common, simplified framework.
DR aims to construct a low-dimensional representation (or embedding) of an input dataset that best preserves its geometry, encoded via a pairwise affinity matrix. To this end, DR methods optimize the embedding such that its associated pairwise affinity matches the input affinity. TorchDR provides a general framework for solving problems of this form. Defining a DR algorithm then only requires choosing or implementing an Affinity object for the input and the embedding, as well as an objective function.
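Schematically, given an input affinity matrix P computed from the data and an affinity matrix Q(Z) computed from the embedding Z, the embedding is obtained by solving

    \min_{Z \in \mathbb{R}^{n \times d}} \; \sum_{i,j} L\big(P_{ij}, Q_{ij}(Z)\big),

where L is a pairwise loss, for instance a Kullback-Leibler divergence in SNE-like methods or a binary cross-entropy in UMAP-like methods.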
Benefits of TorchDR include:
Modularity: Written entirely in Python in a highly modular way, making it easy to create or transform components.
Speed: Supports GPU acceleration, sparsity, and batching strategies with contrastive learning techniques.
Memory efficiency: Relies on pykeops[19] symbolic tensors to avoid memory overflows (see the sketch below).
Compatibility: Implemented methods are fully compatible with the sklearn[21] API and the torch[20] ecosystem.
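As an illustration of the memory-efficiency point, here is a minimal PyKeOps sketch, independent of TorchDR's own Affinity classes, that reduces a Gaussian kernel over all pairs of points without ever materializing the full n x n matrix:

    import torch
    from pykeops.torch import LazyTensor

    x = torch.randn(10_000, 50)            # 10k points in 50 dimensions
    x_i = LazyTensor(x[:, None, :])        # symbolic view of shape (n, 1, d)
    x_j = LazyTensor(x[None, :, :])        # symbolic view of shape (1, n, d)

    D_ij = ((x_i - x_j) ** 2).sum(-1)      # symbolic squared distances
    K_ij = (-D_ij).exp()                   # symbolic Gaussian kernel

    # The (n, n) kernel is never stored: the reduction is computed on the fly.
    row_sums = K_ij.sum(dim=1)             # dense result of shape (n, 1)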
This library is a community-driven project and welcomes contributions of all forms.
Getting Started
TorchDR offers a user-friendly API similar to scikit-learn, where dimensionality reduction modules can be called with the fit_transform method. It seamlessly accepts both NumPy arrays and PyTorch tensors as input, and ensures that the output matches the type and backend of the input.
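A minimal usage sketch is given below; the TSNE estimator and its n_components argument are assumed here based on the methods listed further down, and exact class names and signatures should be checked against the library:

    import numpy as np
    from torchdr import TSNE  # assumed import path; see the methods listed below

    X = np.random.randn(1_000, 50).astype(np.float32)  # toy data

    # scikit-learn style call: fit the embedding and return it in one step.
    Z = TSNE(n_components=2).fit_transform(X)

    print(Z.shape)  # (1000, 2); a NumPy input yields a NumPy output

The same call accepts a torch.Tensor, in which case the returned embedding is a tensor matching the backend of the input.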
MNIST example.
Here is a comparison of various neighbor embedding methods on the MNIST digits dataset.
The code to generate this figure is available here.
Single cell example.
Here is an example of single cell embeddings using TorchDR, where the embeddings are colored by cell type and the number of cells is indicated in each title.
TorchDR features a wide range of affinities, which can then be used as building blocks for DR algorithms. It includes:
Usual affinities such as the scalar product, Gaussian and Student kernels (see the sketch after this list).
Affinities based on k-NN normalizations such as Self-tuning affinities [22] and MAGIC [23].
Doubly stochastic affinities with entropic [5][6][7][16] and quadratic [10] projections.
Adaptive affinities with entropy control [1][4] and their symmetric version [3].
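For reference, here is what the basic kernels mentioned above compute, sketched in plain PyTorch; this is only an illustration of the formulas, not the library's Affinity API:

    import torch

    X = torch.randn(500, 10)                   # toy data
    D = torch.cdist(X, X) ** 2                 # squared pairwise distances
    sigma = 1.0                                # illustrative bandwidth

    scalar_product = X @ X.T                   # scalar-product affinity
    gaussian = torch.exp(-D / (2 * sigma**2))  # Gaussian kernel
    student = 1.0 / (1.0 + D)                  # Student (Cauchy) kernel, as in t-SNE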
Dimensionality Reduction Algorithms
Spectral. TorchDR provides spectral embeddings[11] computed via eigenvalue decomposition of the affinity matrix or its Laplacian.
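The principle can be sketched in a few lines of plain PyTorch (Laplacian eigenmaps variant; TorchDR's own implementation and options may differ):

    import torch

    X = torch.randn(200, 5)
    W = torch.exp(-torch.cdist(X, X) ** 2)   # symmetric, nonnegative affinity

    deg = W.sum(dim=1)
    L = torch.diag(deg) - W                  # unnormalized graph Laplacian

    # Keep the eigenvectors of the smallest nonzero eigenvalues
    # (the first eigenvector is constant and is discarded).
    eigvals, eigvecs = torch.linalg.eigh(L)
    Z = eigvecs[:, 1:3]                      # 2-dimensional spectral embedding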
Neighbor Embedding. TorchDR includes various neighbor embedding methods such as SNE[1], t-SNE[2], t-SNEkhorn[3], UMAP[8], LargeVis[13] and InfoTSNE[15].
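Since all of these estimators share the same fit_transform interface, comparing them (as in the MNIST figure above) amounts to swapping classes; the class names below are assumed from the list above and should be checked against the library:

    import torch
    from torchdr import TSNE, UMAP  # assumed import paths

    X = torch.randn(1_000, 50)      # toy data

    # The same call works for every neighbor-embedding estimator.
    embeddings = {
        "t-SNE": TSNE(n_components=2).fit_transform(X),
        "UMAP": UMAP(n_components=2).fit_transform(X),
    }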
Installation
The library is not yet available on PyPI; you can install it from source.
Sebastian Damrich, Jan Niklas Böhm, Fred Hamprecht, Dmitry Kobak (2023). From t-SNE to UMAP with contrastive learning. International Conference on Learning Representations (ICLR).