Repository for Practical Deep Learning Final Project presentation
-
We have devised a dropout policy that varies the dropout rate at each epoch based on the training and validation losses of previous epochs (see the sketch below the next heading).
-
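The exact update rule is implemented in each notebook; a minimal sketch of the general idea, assuming a PyTorch model and an illustrative heuristic (raise dropout when the validation loss diverges from the training loss, lower it otherwise; the step size and bounds are assumptions, not the project's tuned values), might look like this:

```python
import torch.nn as nn

def next_dropout_rate(current_rate, train_losses, val_losses,
                      step=0.05, min_rate=0.0, max_rate=0.6):
    """Propose a dropout rate for the next epoch from the loss history."""
    if len(train_losses) < 2 or len(val_losses) < 2:
        return current_rate  # not enough history yet
    # Overfitting signal: validation loss rising while training loss keeps falling.
    overfitting = (val_losses[-1] > val_losses[-2]
                   and train_losses[-1] < train_losses[-2])
    if overfitting:
        return min(max_rate, current_rate + step)  # regularise harder
    return max(min_rate, current_rate - step)      # relax regularisation

def set_dropout_rate(model, rate):
    """Apply the new rate to every nn.Dropout module in the model."""
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.p = rate
```

At the end of each epoch the training loop would call something like `rate = next_dropout_rate(rate, train_hist, val_hist)` followed by `set_dropout_rate(model, rate)` before the next epoch starts.
-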
The policy has been evaluated on the following datasets:
- Fashion-MNIST (Image)
- CIFAR-10 (Image)
- IMDB Reviews (Text)
-
Models used were:
- LeNet-5
- BERT sentiment classifier (derived from: https://www.kaggle.com/atulanandjha/bert-testing-on-imdb-dataset-extensive-tutorial)
-
Instructions:
- Each dataset has its own folder containing the notebook with the code to run and the results.
- The training and testing data are in a '.txt' file in each folder.
- It is recommended to run the notebooks on a GPU.
- Kaggle link for the BERT notebook (with the dataset): https://www.kaggle.com/viswajitvinodnair/bert-dynamic?scriptVersionId=82629256
-
Results: