PyTorch implementation of the following paper: PR-DAD: Phase Retrieval Using Deep Auto-Decoders, joint work with Prof. Shai Dekel, School of Mathematical Sciences, Tel-Aviv University.
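For context, phase retrieval recovers an image from the magnitudes of its (oversampled) Fourier transform. A minimal NumPy sketch of that forward measurement model (illustrative only; `fourier_magnitudes` and the padding scheme are assumptions, not the repository's code):

```python
import numpy as np

def fourier_magnitudes(image: np.ndarray, pad_factor: int = 2) -> np.ndarray:
    """Oversampled Fourier magnitude measurements |F(x)| of an image.

    Zero-padding by `pad_factor` models the oversampling commonly used in
    phase-retrieval setups; this is an illustrative forward model, not the
    repository's exact code.
    """
    h, w = image.shape
    padded = np.zeros((pad_factor * h, pad_factor * w))
    padded[:h, :w] = image
    return np.abs(np.fft.fft2(padded))

rng = np.random.default_rng(0)
x = rng.random((28, 28))   # an MNIST-sized test image
m = fourier_magnitudes(x)
print(m.shape)             # (56, 56)
```

The reconstruction task solved by PR-DAD is the inverse problem: recover `x` given only `m`.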
- Python >= 3.8
- PyTorch >= 1.9
- CUDA >= 11
- Hardware required: Ubuntu, NVIDIA Tesla V100 16 GiB, and 8x Intel Xeon E5-2686 v4; recommended AWS EC2 instance type: p3.2xlarge
- Required packages: see requirements.txt
For each dataset and each type of features, we provide the trained model and a JSON config with its hyperparameters in the table below.
- Run the trainer (the trainer uses the ClearML logger):

python training/phase_retrieval_trainer.py --experiment_name my-experiment --config_path url_path/config-trainer.json **kwargs

experiment_name
- ClearML experiment name
config_path
- path (local or S3) to a JSON file with training hyperparameters
**kwargs
(optional) - override specific parameters
Example:

python training/phase_retrieval_trainer.py \
  --experiment_name my-experiment \
  --config_path s3://url_path/config-trainer.json \
  --path_pretrained s3://model_url/model.pt \
  --batch_size 16 \
  --ae_type wavelet-net \
  --wavelet_type haar
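The **kwargs overrides above follow the common pattern of merging command-line values on top of the JSON config. A minimal sketch of that pattern (the helper name and config keys are hypothetical, not the repository's actual schema):

```python
import json

def load_config_with_overrides(config_json: str, **overrides) -> dict:
    """Parse a JSON config string and apply keyword overrides on top.

    Hypothetical helper for illustration; the repository's trainer may
    merge CLI overrides differently.
    """
    config = json.loads(config_json)
    config.update(overrides)  # CLI **kwargs win over file values
    return config

# Keys mirror the CLI flags in the example above.
raw = '{"batch_size": 32, "ae_type": "conv-net", "wavelet_type": "haar"}'
cfg = load_config_with_overrides(raw, batch_size=16, ae_type="wavelet-net")
print(cfg)  # {'batch_size': 16, 'ae_type': 'wavelet-net', 'wavelet_type': 'haar'}
```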
- Run evaluation:
python training/phase_retrival_evaluator.py --model_type path_to_model/model.pt --config url_path/config-trainer.json
| Dataset | Haar Features | ConvNet Features |
|---|---|---|
| MNIST | Model, Config-Trainer | Model, Config-Trainer |
| EMNIST | Model, Config-Trainer | Model, Config-Trainer |
| KMNIST | Model, Config-Trainer | Model, Config-Trainer |
| Fashion-MNIST | Model, Config-Trainer | Model, Config-Trainer |
| CelebA | Model, Config-Trainer | Model, Config-Trainer |