todo: get an autoencoder demo running
Read up about autoencoders and get a demo running.
For example, https://www.tensorflow.org/tutorials/generative/autoencoder.
The tutorial uses TensorFlow and trains an autoencoder on images of clothing (Fashion MNIST).
The first step is to get this demo working in your own computing environment.
cc @Templar129
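For reference, here is a minimal sketch of the kind of model that tutorial builds, assuming the Fashion MNIST dataset it uses; the latent size and layer widths here are illustrative rather than an exact copy of the tutorial code:

```python
# Minimal dense autoencoder in the spirit of the TensorFlow tutorial.
# latent_dim and layer sizes are illustrative choices.
import tensorflow as tf
from tensorflow.keras import layers

# Load Fashion MNIST (images of clothing) and scale pixels to [0, 1].
(x_train, _), (x_test, _) = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0

latent_dim = 64

class Autoencoder(tf.keras.Model):
    def __init__(self, latent_dim):
        super().__init__()
        # Encoder: compress each 28x28 image into a latent vector.
        self.encoder = tf.keras.Sequential([
            layers.Flatten(),
            layers.Dense(latent_dim, activation="relu"),
        ])
        # Decoder: reconstruct the image from the latent vector.
        self.decoder = tf.keras.Sequential([
            layers.Dense(28 * 28, activation="sigmoid"),
            layers.Reshape((28, 28)),
        ])

    def call(self, x):
        return self.decoder(self.encoder(x))

model = Autoencoder(latent_dim)
model.compile(optimizer="adam", loss="mse")
# Train the model to reproduce its own input.
model.fit(x_train, x_train, epochs=10, validation_data=(x_test, x_test))
```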
The autoencoder example .ipynb file has been downloaded. It took some time to configure the local environment for the code to run; in Google Colab the notebook runs smoothly right away.
But to align with the other data we use, it has to work in a local Jupyter notebook editor, like the VS Code setup I am using. After installing TensorFlow, the notebook now runs in my local IDE.
The next step is to look closely at the example, figure out how it works, and see where it is similar to the TG data we have, so we can be sure we are able to train similar models.
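For anyone repeating the local setup, a quick sanity check I used before opening the notebook (assuming a standard pip install of TensorFlow):

```python
# Confirm the local TensorFlow install works: print the version and run a tiny op.
import tensorflow as tf

print(tf.__version__)                  # installed TensorFlow version
print(tf.reduce_sum(tf.ones((2, 2))))  # should print tf.Tensor(4.0, ...)
```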
Day 2:
The example autoencoder now runs in my local environment. It takes images as input and is trained to reproduce the desired outputs with a low loss.
The next step is to look at the format of the data it uses and wrangle the TG data into that format so it can be fed to the model; a rough sketch of that step is below.
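As a starting point for that wrangling, here is a rough sketch of the kind of reshaping and scaling the model expects. The function name, the fixed window length, and the assumption that the TG data is a 1-D numeric series are hypothetical placeholders until we inspect the real format:

```python
# Sketch only, NOT actual TG handling code: the window length and the
# assumption of a 1-D numeric series are placeholders.
import numpy as np

def prepare_tg_windows(values, window=128):
    """Cut a 1-D numeric series into fixed-length windows scaled to [0, 1],
    matching the float32 [0, 1] input format the image autoencoder uses."""
    values = np.asarray(values, dtype="float32")
    n = (len(values) // window) * window            # drop the ragged tail
    windows = values[:n].reshape(-1, window)
    lo, hi = windows.min(), windows.max()
    return (windows - lo) / (hi - lo + 1e-8)        # min-max scale to [0, 1]

# Example with fake data standing in for a TG series:
fake_series = np.sin(np.linspace(0, 50, 4096))
x = prepare_tg_windows(fake_series)
print(x.shape, x.dtype, x.min(), x.max())
```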
Great! Let's close this one then and move on to #15 🎉