This code repository contains implementations of the AutoEncoder (AE), Denoising AutoEncoder (DAE), Contractive AutoEncoder (CAE), and Contractive Higher-Order (2nd-order) AutoEncoder (CAE+H), written in Python. The code draws extensively on the lecture notes and base code infrastructure of CS231n Stanford (http://cs231n.stanford.edu/) and CENG 783 METU (http://www.kovan.ceng.metu.edu.tr/~sinan/DL/).
---- Some of the features you can find in the code ----
- "Euclidean" and "Cross Entropy" loss functions are selectable in the code.
- Use of bias terms in the computations is selectable.
- Tied (shared) weights are used for both the input-to-hidden and hidden-to-output mappings.
- test.py includes small examples running all four autoencoder types on the MNIST dataset.
- Required packages such as "numpy" must be installed beforehand.
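To illustrate the tied-weight design and the two loss options, here is a minimal sketch; it is not the repository's actual API, and names such as `TiedAutoencoder` are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TiedAutoencoder:
    """Minimal tied-weight autoencoder sketch (hypothetical, not this repo's API)."""

    def __init__(self, n_visible, n_hidden, use_bias=True, loss="euclidean", seed=0):
        rng = np.random.default_rng(seed)
        # A single weight matrix shared by the encoder and the decoder
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.use_bias = use_bias
        self.b_h = np.zeros(n_hidden)
        self.b_v = np.zeros(n_visible)
        self.loss = loss

    def forward(self, x):
        # Encoder: input -> hidden, using W
        h = sigmoid(x @ self.W + (self.b_h if self.use_bias else 0.0))
        # Decoder: hidden -> output, using the transpose of the SAME W (tied weights)
        y = sigmoid(h @ self.W.T + (self.b_v if self.use_bias else 0.0))
        return h, y

    def reconstruction_loss(self, x, y):
        if self.loss == "euclidean":
            return 0.5 * np.mean(np.sum((y - x) ** 2, axis=1))
        # Cross-entropy assumes inputs in [0, 1], e.g. normalized MNIST pixels
        eps = 1e-12
        return -np.mean(np.sum(x * np.log(y + eps)
                               + (1 - x) * np.log(1 - y + eps), axis=1))
```

Usage would look like `ae = TiedAutoencoder(784, 100); h, y = ae.forward(batch)`; tying the decoder to `W.T` halves the parameter count and acts as a mild regularizer.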
In case of any failure, or for recommendations, please don't hesitate to contact the author (Savas Ozkan / savasozkan.com).
Some results obtained with 2000 iterations and a learning rate of 0.1:
AutoEncoder (euclidean loss=6.6937)
Denoising AutoEncoder (euclidean loss=6.8654)
Contractive AutoEncoder (euclidean loss=6.0982)
Contractive Higher-Order AutoEncoder (euclidean loss=6.0628)
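For reference, the penalty that distinguishes the contractive variants from the plain AE is the squared Frobenius norm of the Jacobian of the hidden representation with respect to the input. For a sigmoid hidden layer this has a cheap closed form; the sketch below illustrates the idea and is not the repository's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def contractive_penalty(x, W, b):
    """Squared Frobenius norm of dh/dx for h = sigmoid(x @ W + b),
    averaged over the batch.

    For a sigmoid unit, dh_j/dx_i = h_j * (1 - h_j) * W_ij, so
    ||J||_F^2 = sum_j (h_j * (1 - h_j))^2 * sum_i W_ij^2.
    """
    h = sigmoid(x @ W + b)              # shape (batch, hidden)
    col_norms = np.sum(W ** 2, axis=0)  # shape (hidden,): sum_i W_ij^2
    return np.mean(np.sum((h * (1 - h)) ** 2 * col_norms, axis=1))
```

Adding this term (scaled by a hyperparameter) to the reconstruction loss encourages hidden units to be locally insensitive to input perturbations; the higher-order (CAE+H) variant additionally penalizes the Hessian, typically approximated stochastically.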