U-Net model for white matter segmentation
The dataset is the one used during the WMH Segmentation Challenge. In our repository, it should be placed under a datasets/ folder.
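For reference, the training data released for the WMH Segmentation Challenge is typically organised per scanner site (Utrecht, Singapore, GE3T), with one folder per subject holding the pre-processed scans and the wmh.nii.gz ground truth; check generate_data.py for the exact layout it expects. A rough sketch, with <subject_id> standing in for the numbered case folders:

    datasets/
        Utrecht/
            <subject_id>/
                pre/FLAIR.nii.gz
                pre/T1.nii.gz
                wmh.nii.gz
        Singapore/
            ...
        GE3T/
            ...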
Before training the model, we need to generate the data that will be used. To do this, launch:
- python generate_data.py <path/to/datasets> <save/dir>
Note that <save/dir> is the directory where the train.pickle file will be stored (see the example below).
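For example, if the challenge data is unpacked under datasets/ and you want the pickle written to data/ (both paths are illustrative, matching the training examples below):
- python generate_data.py datasets data

A quick way to check that the pickle was written correctly is to load it back. The exact structure stored inside is defined by generate_data.py, so this sketch only inspects the top-level object:

    import pickle

    # Illustrative path: whatever <save/dir> you passed, plus train.pickle
    with open("data/train.pickle", "rb") as f:
        data = pickle.load(f)

    # The internal layout is defined by generate_data.py, so just peek at the object
    print(type(data))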
To train the model:
cd models/clean
python3 main.py <path/to/dataset.pickle> preprocess:0|1 <3d:2|3>
Please look at the file main.py for more information regarding the parameters.
Examples of training usage:
- python3 main.py data/train.pickle 0 2
- python3 main.py data/train.pickle 1 2
- python3 main.py data/train.pickle 0 3
- python3 main.py data/train.pickle 1 3
The same command is used to run the model on the test data:
python3 main.py <path/to/dataset.pickle> preprocess:0|1 <3d:2|3>
Here, <path/to/dataset.pickle> should be something like */test.pickle.
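For example, presumably with the same preprocess and <3d> values that were used for training (data/test.pickle is an illustrative path, mirroring the first training example above):
- python3 main.py data/test.pickle 0 2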
Please look at the file main.py for more information regarding the parameters.
After generating the weights and the result file, you can test the model:
python3 test_model.py <path/to/dataset> <path/to/test_pickle> <result.pickle>
With:
- <path/to/dataset>: the root folder of the dataset
- <path/to/test_pickle>: the same pickle file that was passed as <path/to/dataset.pickle> above
- <result.pickle>: the result file generated by the "python3 main.py ..." command above
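A complete example, assuming the dataset sits under datasets/, the test pickle under data/, and main.py wrote its output to result.pickle (all three names are illustrative):
- python3 test_model.py datasets data/test.pickle result.pickle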