This repo contains the training and evaluation code for the project DeepSea: An efficient deep learning model for automated cell segmentation and tracking.
This work presents a versatile and trainable deep-learning-based software, termed DeepSea, that allows for both segmentation and tracking of single cells in sequences of phase-contrast live microscopy images.
To download our datasets, go to https://deepseas.org/datasets/ or use the link below:
- Link to Original annotated dataset
Our pre-trained DeepSea models are saved in the folder "trained_models".
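If you want to sanity-check a downloaded checkpoint before running the scripts below, it can be opened with PyTorch. This is only an illustrative sketch; it assumes the checkpoints are standard PyTorch .pth files, and the test scripts below load them for you anyway:

import torch

# Load the segmentation checkpoint on the CPU just to confirm the file is readable.
state = torch.load("trained_models/segmentation.pth", map_location="cpu")
# A .pth checkpoint is typically a state_dict (a dict of parameter tensors) or a wrapper around one.
print(type(state), len(state) if isinstance(state, dict) else "")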
- [Optional] Create a conda or Python virtual environment.
- Install the required packages using the requirements.txt file:
pip install -r requirements.txt
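The scripts below assume a working PyTorch installation (the checkpoints are .pth files). Assuming PyTorch is among the packages listed in requirements.txt, a quick way to confirm the environment and check whether a GPU is visible is:

import torch

print(torch.__version__)          # confirm PyTorch is importable
print(torch.cuda.is_available())  # True if a CUDA GPU is visible for training/inference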
Run train_segmentation.py with the train set of the segmentation dataset.
Example:
python train_segmentation.py --train_dir segmentation_dataset/train/ --lr 0.001 --max_epoch 200 --batch_size 32 --output_dir tmp/
Run train_tracker.py with the train set of the tracking dataset.
Example:
python train_tracker.py --train_dir tracking_dataset/train/ --lr 0.001 --max_epoch 200 --batch_size 32 --output_dir tmp/
Run test_segmentation.py with the test set of the segmentation dataset and the trained segmentation model.
Example:
python test_segmentation.py --test_dir segmentation_dataset/test/ --ckpt_dir trained_models/segmentation.pth --output_dir tmp/
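For context, single-cell segmentation quality is commonly summarized with overlap metrics such as the Dice coefficient. The sketch below shows that standard formula only as a reference point; it is not necessarily the exact metric reported by test_segmentation.py:

import numpy as np

def dice_coefficient(pred_mask, gt_mask):
    # pred_mask and gt_mask are boolean arrays of the same shape (True = cell pixel).
    intersection = np.logical_and(pred_mask, gt_mask).sum()
    return 2.0 * intersection / (pred_mask.sum() + gt_mask.sum())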
Run test_tracker.py with the test set of the tracking dataset and the trained tracker model.
Example:
python test_tracker.py --test_dir tracking_dataset/test --ckpt_dir trained_models/tracker.pth --output_dir tmp/
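The tracker's job is to associate detected cells across consecutive frames. Purely as an illustration of that association problem, and not of DeepSea's learned tracker, frame-to-frame linking is often posed as an assignment over an IoU cost matrix, for example with SciPy's Hungarian solver:

import numpy as np
from scipy.optimize import linear_sum_assignment

def link_by_iou(iou_matrix, min_iou=0.3):
    # iou_matrix[i, j] = IoU between cell i in frame t and cell j in frame t+1.
    rows, cols = linear_sum_assignment(-iou_matrix)  # maximize total IoU
    return [(r, c) for r, c in zip(rows, cols) if iou_matrix[r, c] >= min_iou]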
Run measure_MOTA.py with a time-lapse microscopy image set and both the segmentation and tracker models.
Example:
python measure_MOTA.py --test_dir tracking_dataset/test/set_9_MC2C12/ --seg_ckpt_dir trained_models/segmentation.pth --tracker_ckpt_dir trained_models/tracker.pth --output_dir tmp/
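For reference, MOTA (Multiple Object Tracking Accuracy) is conventionally defined as one minus the sum of false negatives, false positives, and identity switches, normalized by the number of ground-truth objects. The sketch below encodes that standard definition; it is not necessarily the exact implementation inside measure_MOTA.py:

def mota(false_negatives, false_positives, id_switches, num_gt_objects):
    # Errors are summed over all frames and normalized by the total
    # number of ground-truth objects across the sequence.
    return 1.0 - (false_negatives + false_positives + id_switches) / num_gt_objects

# Hypothetical counts, for illustration only:
print(mota(false_negatives=12, false_positives=8, id_switches=3, num_gt_objects=500))  # 0.954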
Our DeepSea software is available at https://deepseas.org/software/ with examples and instructions. DeepSea is a user-friendly software tool designed to enable researchers to 1) load and explore their phase-contrast cell images in a high-contrast display, 2) detect and localize cell bodies, 3) track and label cell lineages across the frame sequences, 4) manually correct the DeepSea models' outputs, 5) train a new model with a new cell type dataset, and 6) save the results and reports on the local system. It employs our latest trained DeepSea models for the segmentation and tracking processes.
If you have any questions, contact us at abzargar@ucsc.edu.
This work was supported by the NIGMS/NIH through a Pathway to Independence Award K99GM126027 (S.A.S.) and a start-up package from the University of California, Santa Cruz.