Deep learning of an embedding mapping using t-SNE as a loss function on top of a 3-hidden-layer neural network, implemented in PyTorch.
Learn a DNN from a pre-computed t-SNE embedding:
https://nbviewer.jupyter.org/github/HanchenXiong/deepembedding/blob/master/deepebedding-with-pre-tSNE.ipynb
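A minimal sketch of this first variant: compute an ordinary t-SNE embedding once, then fit the network to those fixed 2-D coordinates with a mean-squared-error loss. Random data, the layer sizes, and all hyperparameters here are illustrative assumptions, not values taken from the notebook.

```python
import torch
import torch.nn as nn
from sklearn.manifold import TSNE

torch.manual_seed(0)
X = torch.randn(500, 50)                        # toy high-dimensional data

# Step 1: compute the 2-D target embedding once with ordinary t-SNE.
Y = torch.tensor(
    TSNE(n_components=2, init="random", perplexity=30).fit_transform(X.numpy()),
    dtype=torch.float32,
)

# Step 2: regress a 3-hidden-layer MLP onto the fixed targets.
net = nn.Sequential(
    nn.Linear(50, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X), Y)
    loss.backward()
    opt.step()
```

After training, `net` is a parametric mapping, so unseen points can be projected into the embedding space without rerunning t-SNE.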
Learn a direct deep embedding using a DNN with the t-SNE objective as the loss function:
https://nbviewer.jupyter.org/github/HanchenXiong/deepembedding/blob/master/deepebedding-with-tSNE-wholeloss.ipynb
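A sketch of the second variant: the global affinity matrix P is computed once from the input data, the Student-t affinities Q are recomputed from the network outputs at every step, and the KL divergence KL(P||Q) is minimized by backpropagation. One simplifying assumption: a fixed Gaussian bandwidth replaces t-SNE's usual per-point perplexity search.

```python
import torch
import torch.nn as nn

def p_matrix(X, sigma=1.0):
    """Symmetrized high-dimensional affinities P (fixed bandwidth)."""
    logits = -torch.cdist(X, X).pow(2) / (2 * sigma ** 2)
    logits.fill_diagonal_(float("-inf"))        # force p_ii = 0
    cond = torch.softmax(logits, dim=1)         # conditional p_{j|i}
    return (cond + cond.T) / (2 * X.shape[0])   # symmetrize, sums to 1

def q_matrix(Y):
    """Student-t affinities Q in the low-dimensional embedding."""
    num = 1.0 / (1.0 + torch.cdist(Y, Y).pow(2))
    num = num - torch.diag(torch.diag(num))     # force q_ii = 0
    return num / num.sum()

def kl_loss(P, Q, eps=1e-12):
    return (P * torch.log((P + eps) / (Q + eps))).sum()

torch.manual_seed(0)
X = torch.randn(300, 50)
P = p_matrix(X)                                 # global P, computed once

net = nn.Sequential(
    nn.Linear(50, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(100):
    opt.zero_grad()
    loss = kl_loss(P, q_matrix(net(X)))
    loss.backward()
    opt.step()
```

The O(N²) cost of the global P is exactly what motivates the batch variant below in the notebook list.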
Learn a direct deep embedding using a DNN with the t-SNE objective as the loss function (batch version of computing P):
https://nbviewer.jupyter.org/github/HanchenXiong/deepembedding/blob/master/deepebedding-with-tSNE-batchloss.ipynb
Instead of computing a global P, a local P is computed for each mini-batch. This works and is promising for scaling up, since the per-step cost depends on the batch size rather than the full dataset size.
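A self-contained sketch of this mini-batch variant: P is recomputed from only the points inside each batch, so each step costs O(batch²) instead of O(N²). As before, a fixed Gaussian bandwidth stands in for the perplexity search, and the data and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def batch_p(Xb, sigma=1.0):
    """Symmetrized affinities over one mini-batch only."""
    logits = -torch.cdist(Xb, Xb).pow(2) / (2 * sigma ** 2)
    logits.fill_diagonal_(float("-inf"))
    cond = torch.softmax(logits, dim=1)
    return (cond + cond.T) / (2 * Xb.shape[0])

def batch_q(Yb):
    """Student-t affinities over the embedded mini-batch."""
    num = 1.0 / (1.0 + torch.cdist(Yb, Yb).pow(2))
    num = num - torch.diag(torch.diag(num))
    return num / num.sum()

torch.manual_seed(0)
X = torch.randn(1000, 50)
loader = DataLoader(TensorDataset(X), batch_size=100, shuffle=True)

net = nn.Sequential(
    nn.Linear(50, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
eps = 1e-12
for epoch in range(20):
    for (Xb,) in loader:
        P = batch_p(Xb)                 # local P: O(batch²), not O(N²)
        Q = batch_q(net(Xb))
        loss = (P * torch.log((P + eps) / (Q + eps))).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
```

The trade-off is that each local P only sees within-batch neighbor structure, which is why shuffling between epochs matters here.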