# UNIT/Style_Transfer Models

## Example

## Model name
- Project GitHub URL: xxx.xxx.xxx
- Local project path: xxx/xxx/xxx
- Output path: xxx/xxx/xxx
- Local training dataset path: xxx/xxx/xxx
- Training time: training_time
- FID score: xxx.xxx
- Local path of the model training summary (so that others can reproduce the experiment): xxx/xxx/xxx


## UNIT

- Project GitHub URL: https://github.com/mingyuliutw/UNIT
- Local project path: /home/sway007/UNIT-local
- Local training dataset paths:
  - /home/sway007/UNIT-local/udacity_datasets
  - /home/sway007/img-sources
- Training time: todo
- FID score: todo
- Local path of the model training summary: todo

## MUNIT

- Project GitHub URL: https://github.com/NVlabs/MUNIT
- Local project path: /home/undergrats/tzLuo/MUNIT
- Output path: /home/undergrats/tzLuo/MUNIT/transfer
- Local training dataset path: /home/undergrats/tzLuo/MUNIT/datasets
- Training time: roughly 100k~200k training iterations per 24 hours
- FID score: 217.82906651645953 (uda2rain, 200k iterations)
- Local path of the model training summary: todo

## CycleGAN

- Project GitHub URL: https://github.com/hardikbansal/CycleGAN
- Local project path: /home/undergrats/ywCHENG/CycleGAN
- Local training dataset path: /home/undergrats/ywCHENG/Oxford_*.zip
- Training time: about 24 h (the best images appear partway through training, so the epoch count will later be set to whichever epoch produced the best results)
- FID score: 275.04948, computed on the test results in /home/undergrats/ywCHENG/CycleGAN/output/imgs/test
- Local path of the model training summary: /home/undergrats/ywCHENG/Report.md; all output produced during Cheng Yiwei's work is in /home/undergrats/ywCHENG/CycleGAN/output/imgs
- Path of images generated for the autonomous-driving system: /home/undergrats/ywCHENG/CycleGAN/output/imgs/prediction.zip
- Steering-angle data generated by the autonomous-driving system: /home/undergrats/ywCHENG/CycleGAN/output.csv

## deep-photo-styletransfer

## AdaIN-style

- Project GitHub URL:
- Local project path: /home/sway007/git-repos/style-transfer/AdaIN-style
- Local training dataset paths:
  - content_dir: /home/sway007/datasets/unamed_dataset
  - style_dir: /home/sway007/datasets/Images_Oxford_Sun/snow_1
- Output path: /home/sway007/git-repos/style-transfer/AdaIN-style/custom_output
- Training time (real/user/sys as reported by `time`):

      real    2349m46.212s
      user    3782m18.764s
      sys     554m26.476s

- FID score: TODO
- Local path of the model training summary (so that others can reproduce the experiment): exp-reports/AdaIN-style.md

## FastPhotoStyle

Tips: This model uses CSAILVision segmentation internally, so there is no need to generate additional segmentation images beforehand.

Also, resize the two inputs to the same width and height to avoid a possible ValueError, as shown in the snippet below.
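
A minimal sketch of that resizing step, assuming Pillow is installed; the file names are placeholders rather than files from this repo:

```python
from PIL import Image

# Placeholder file names; substitute the actual content and style images.
content = Image.open("content.png").convert("RGB")
style = Image.open("style.png").convert("RGB")

# Resize the style image to the content image's (width, height) so that
# both inputs fed to FastPhotoStyle have identical dimensions.
style = style.resize(content.size)

content.save("content_resized.png")
style.save("style_resized.png")
```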

- Project GitHub URL: https://github.com/NVIDIA/FastPhotoStyle
- Local project path: /home/undergrats/ywCHENG/FastPhotoStyle
- Local training dataset path: none (the pre-trained model is used)
- Output path: /home/undergrats/ywCHENG/FastPhotoStyle/results
- FID score: 77.285
- Local path of the model training summary: /home/undergrats/ywCHENG/FastPhotoStyle/Guidance_from_CHENG.md
- Path of images generated for the autonomous-driving system: /home/undergrats/ywCHENG/FastPhotoStyle/predic/prediction.zip
- Steering-angle data generated by the autonomous-driving system: /home/undergrats/ywCHENG/FastPhotoStyle/output.csv

## Fast-neural-style

- Project GitHub URL: https://github.com/jcjohnson/fast-neural-style
- Local project path: /home/undergrats/Zeke/fast-neural-style
- Local content images path: /home/sway007/datasets/udacity_day
- Output images path: /home/undergrats/Zeke/fast-neural-style/output
- Style images path: /home/undergrats/Zeke/fast-neural-style/style
- Training time:
  - Training the model: about 10 minutes per 2000 iterations
  - Stylizing images: about 2 images (40~60k) per second
- FID score: 139.49777604999912 (style.t7, 60k iterations)
- Local path of the model training summary: /home/undergrats/Zeke/fast-neural-style/Fast-neural-style.md

## Arbitrary-Style-Transfer

Tips: This framework is based on the same paper as AdaIN-style, so if you have already tried AdaIN, you can skip this model. :)

- Project GitHub URL: https://github.com/elleryqueenhomels/arbitrary_style_transfer
- Local project path: /home/undergrats/Zeke/arbitrary_style_transfer
- Local content images path: /home/sway007/datasets/udacity_day
- Output images path: /home/undergrats/Zeke/arbitrary_style_transfer/output
- Style images path: /home/undergrats/Zeke/arbitrary_style_transfer/images/style
- Training time:
  - Stylizing images: about 1 image (40~60k) every 5 seconds
- FID score: 196.11790769724644 (pre-trained model)
- Local path of the model training summary: /home/undergrats/Zeke/arbitrary_style_transfer/Arbitrary-Style-Transfer.pdf

## Metrics

### Steering Angle Difference
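
No write-up exists for this metric yet. Below is a minimal sketch, assuming the steering-angle predictions for the original and the style-transferred images are stored in CSV files such as the output.csv files listed in the CycleGAN and FastPhotoStyle sections above; the column names `frame_id` and `steering_angle` are hypothetical and must be adapted to the actual file format.

```python
import pandas as pd

def mean_abs_angle_diff(original_csv: str, generated_csv: str) -> float:
    """Mean absolute difference between steering angles predicted on the
    original images and on the style-transferred images.

    The column names below (frame_id, steering_angle) are assumptions;
    adjust them to whatever the driving model actually writes out.
    """
    orig = pd.read_csv(original_csv)
    gen = pd.read_csv(generated_csv)
    # Align the two prediction tables on a shared frame identifier.
    merged = orig.merge(gen, on="frame_id", suffixes=("_orig", "_gen"))
    return (merged["steering_angle_orig"] - merged["steering_angle_gen"]).abs().mean()

# Hypothetical usage:
# print(mean_abs_angle_diff("baseline/output.csv",
#                           "/home/undergrats/ywCHENG/CycleGAN/output.csv"))
```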

### Fréchet Inception Distance

- Experiment code URL: github
- Local code path: /home/sway007/git-repos/pytorch_inception_score
- Local path of the experiment summary: todo (see the formula sketch below for now)
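
For context, FID compares the Gaussian statistics (mean and covariance) of Inception-v3 activations for real and generated images: FID = ||mu_r - mu_g||^2 + Tr(Sigma_r + Sigma_g - 2 (Sigma_r Sigma_g)^(1/2)). The snippet below is a minimal numpy/scipy sketch of that formula, not code from the pytorch_inception_score repo; it assumes the activation matrices have already been extracted.

```python
import numpy as np
from scipy import linalg

def frechet_inception_distance(feats_real: np.ndarray, feats_gen: np.ndarray) -> float:
    """FID between two sets of Inception activations, each of shape (N, D)."""
    mu_r, mu_g = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    sigma_r = np.cov(feats_real, rowvar=False)
    sigma_g = np.cov(feats_gen, rowvar=False)

    # Matrix square root of the covariance product; small imaginary parts
    # caused by numerical noise are discarded.
    covmean, _ = linalg.sqrtm(sigma_r @ sigma_g, disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real

    diff = mu_r - mu_g
    return float(diff @ diff + np.trace(sigma_r + sigma_g - 2.0 * covmean))
```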

## UNIT/Style_Transfer Model Training Times

## TODOs