
Self-Supervised Learning for Wafer Bin Map Classification

This repository contains the original PyTorch implementation of the paper 'Self-Supervised Representation Learning for Wafer Bin Map Defect Pattern Classification'.

A. Requirements

conda update -n base conda  # use 4.8.3 or higher
conda create -n wbm python=3.6
conda activate wbm
conda install anaconda
conda install opencv -c conda-forge
conda install pytorch=1.6.0 cudatoolkit=10.2 -c pytorch
pip install pytorch_lightning
pip install albumentations

B. Dataset

  1. Download from the following link: WM-811K
  2. Place the LSWMD.pkl file under ./data/wm811k/.
  3. Run the following script from the working directory:
python process_wm811k.py
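One step the preprocessing script has to perform is converting the free-text failureType annotations in LSWMD.pkl into integer class labels. The sketch below is a hypothetical illustration using the nine WM-811K defect classes; the actual column handling and label order in process_wm811k.py may differ:

```python
# Hypothetical label-encoding step; the repository's script may order or
# normalize classes differently.
WM811K_CLASSES = [
    "center", "donut", "edge-loc", "edge-ring",
    "loc", "near-full", "random", "scratch", "none",
]
LABEL_TO_INDEX = {name: i for i, name in enumerate(WM811K_CLASSES)}

def encode_label(failure_type: str) -> int:
    """Normalize a failureType string and return its integer class index."""
    key = failure_type.strip().lower().replace("_", "-")
    return LABEL_TO_INDEX[key]
```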

C. WaPIRL Pre-training

C-1. From the command line (with default options)

python run_wapirl.py \
    --input_size 96 \
    --augmentation crop \
    --backbone_type resnet \
    --backbone_config 18 \
    --decouple_input \
    --epochs 100 \
    --batch_size 256 \
    --num_workers 4 \
    --gpus 0 \
    --optimizer sgd \
    --learning_rate 1e-2 \
    --weight_decay 1e-3 \
    --momentum 0.9 \
    --scheduler cosine \
    --warmup_steps 0 \
    --checkpoint_root ./checkpoints \
    --write_summary \
    --save_every 10 \
    --projector_type linear \
    --projector_size 128 \
    --temperature 0.07
  • Run python run_wapirl.py --help for more information on arguments.
  • If running on a Windows machine, set num_workers to 0, as DataLoader multiprocessing is unreliable on Windows.
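The scheduler arguments above (--scheduler cosine, --warmup_steps, --learning_rate) describe a cosine learning-rate schedule with optional linear warmup. The following is a sketch of that schedule in isolation; the repository's implementation (e.g. per-step vs. per-epoch updates) may differ:

```python
import math

def lr_at_step(step, total_steps, base_lr=1e-2, warmup_steps=0):
    """Cosine learning-rate decay with optional linear warmup.

    Illustrative only; defaults mirror --learning_rate 1e-2 and
    --warmup_steps 0 from the command above.
    """
    if warmup_steps > 0 and step < warmup_steps:
        # Linear ramp from base_lr / warmup_steps up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))
```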

C-2. From a configuration file

python run_wapirl.py @experiments/pretrain_wapirl.txt
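The @-prefix syntax is argparse's file-arguments feature: when a parser is built with fromfile_prefix_chars="@", any @file token on the command line is replaced by the file's contents, read one token per line. Assuming run_wapirl.py builds its parser this way, a minimal sketch looks like:

```python
import argparse

# Sketch of the @file mechanism; the real run_wapirl.py defines many more
# arguments than the two shown here.
parser = argparse.ArgumentParser(fromfile_prefix_chars="@")
parser.add_argument("--input_size", type=int)
parser.add_argument("--augmentation")

# A config file such as experiments/pretrain_wapirl.txt would hold one token
# per line ("--input_size", "96", ...); passing the tokens directly is
# equivalent:
args = parser.parse_args(["--input_size", "96", "--augmentation", "crop"])
print(args.input_size, args.augmentation)
```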

D. Fine-tuning

D-1. From the command line (with default options)

python run_classification.py \
    --input_size 96 \
    --augmentation crop \
    --backbone_type resnet \
    --backbone_config 18 \
    --decouple_input \
    --epochs 100 \
    --batch_size 256 \
    --num_workers 4 \
    --gpus 0 \
    --optimizer sgd \
    --learning_rate 1e-2 \
    --weight_decay 1e-3 \
    --momentum 0.9 \
    --scheduler cosine \
    --warmup_steps 0 \
    --checkpoint_root ./checkpoints \
    --write_summary \
    --pretrained_model_file /path/to/file \
    --pretrained_model_type wapirl \
    --label_proportion 1.00 \
    --label_smoothing 0.1 \
    --dropout 0.5
  • IMPORTANT: Provide the correct path for the pretrained_model_file argument.
  • Run python run_classification.py --help for more information on arguments.
  • If running on a Windows machine, set num_workers to 0, as DataLoader multiprocessing is unreliable on Windows.
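The --label_smoothing 0.1 flag mixes the one-hot target with a uniform distribution, penalizing over-confident predictions. A minimal sketch of one common formulation (spreading the smoothing mass over the non-target classes); the repository's loss may use a different variant:

```python
import math

def smoothed_cross_entropy(probs, target, num_classes, smoothing=0.1):
    """Cross-entropy against a label-smoothed target distribution.

    Illustrative sketch: the target class receives weight 1 - smoothing,
    and each other class receives smoothing / (num_classes - 1).
    """
    on = 1.0 - smoothing
    off = smoothing / (num_classes - 1)
    return -sum(
        (on if k == target else off) * math.log(probs[k])
        for k in range(num_classes)
    )
```

With smoothing = 0 this reduces to ordinary cross-entropy; with smoothing > 0, confident predictions incur extra loss from the off-target terms.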

D-2. From a configuration file

python run_classification.py @experiments/finetune_wapirl.txt