A web API for Cellsparse implemented with FastAPI.
qupath-extension-cellsparse implements a client for QuPath.
This is part of the following paper. Please cite it when you use this project.
- Sugawara, K. Training deep learning models for cell image segmentation with sparse annotations. bioRxiv 2023. doi:10.1101/2023.06.13.544786
```bash
conda create -n cellsparse-api -y python=3.11
conda activate cellsparse-api
python -m pip install -U pip
python -m pip install "cellsparse-api[tensorflow-macos] @ git+https://github.com/ksugar/cellsparse-api.git"
```
Microsoft Visual C++ 14.0 or greater is required. Get it with the [Microsoft C++ Build Tools](https://visualstudio.microsoft.com/visual-cpp-build-tools/).
```bat
conda create -n cellsparse-api -y python=3.10
conda activate cellsparse-api
python -m pip install -U pip
conda install -y -c conda-forge cudatoolkit=11.3 cudnn=8.1.0
python -m pip install "tensorflow<2.11"
python -m pip install git+https://github.com/ksugar/stardist-sparse.git
:: Enable UTF-8 mode during installation to avoid encoding errors on Windows.
set PYTHONUTF8=1
python -m pip install git+https://github.com/ksugar/cellsparse-api.git
set PYTHONUTF8=0
:: Replace the CPU build of PyTorch with the CUDA 11.3 build.
python -m pip uninstall -y torch torchvision
python -m pip install --no-deps torch torchvision --index-url https://download.pytorch.org/whl/cu113
```
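As an optional check after installation, you can verify that PyTorch detects the GPU:

```python
# Optional check: prints True when the CUDA build of PyTorch sees a GPU.
import torch

print(torch.cuda.is_available())
```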
Please note that training on a CPU is very slow.
On Windows Native, Microsoft Visual C++ 14.0 or greater is required. Get it with the [Microsoft C++ Build Tools](https://visualstudio.microsoft.com/visual-cpp-build-tools/).
```bash
conda create -n cellsparse-api -y python=3.11
conda activate cellsparse-api
python -m pip install -U pip
python -m pip install "cellsparse-api[tensorflow] @ git+https://github.com/ksugar/cellsparse-api.git"
```
```bash
conda create -n cellsparse-api -y python=3.11
conda activate cellsparse-api
python -m pip install -U pip
conda install -y -c conda-forge cudatoolkit=11.8
python -m pip install "cellsparse-api[tensorflow] @ git+https://github.com/ksugar/cellsparse-api.git"
```
The following steps are required only if you are using Linux or WSL2 with a CUDA-compatible GPU. If that is not the case, you can skip to the Usage section.
```bash
# The single-quoted commands are written verbatim into env_vars.sh, so they
# are evaluated each time the environment is activated.
mkdir -p $CONDA_PREFIX/etc/conda/activate.d
echo 'CUDNN_PATH=$(dirname $(python -c "import nvidia.cudnn;print(nvidia.cudnn.__file__)"))' > $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
echo 'export OLD_LD_LIBRARY_PATH=$LD_LIBRARY_PATH' >> $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
echo 'export LD_LIBRARY_PATH=$CONDA_PREFIX/lib/:$CUDNN_PATH/lib:$LD_LIBRARY_PATH' >> $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
# Restore the original LD_LIBRARY_PATH on deactivation.
mkdir -p $CONDA_PREFIX/etc/conda/deactivate.d
echo 'export LD_LIBRARY_PATH=${OLD_LD_LIBRARY_PATH}' > $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
echo 'unset OLD_LD_LIBRARY_PATH' >> $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
echo 'unset CUDNN_PATH' >> $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
# Preserve the current LD_LIBRARY_PATH for this session as well.
export OLD_LD_LIBRARY_PATH=$LD_LIBRARY_PATH
```
If you are using WSL2, `LD_LIBRARY_PATH` needs to be updated as follows.
```bash
export LD_LIBRARY_PATH=/usr/lib/wsl/lib:$LD_LIBRARY_PATH
```
```bash
python -m pip install --no-deps nvidia-cudnn-cu11==8.6.0.163
```
See the TensorFlow installation guide for details.
```bash
# Copy libdevice to the location expected by XLA.
mkdir -p $CONDA_PREFIX/lib/nvvm/libdevice
cp $CONDA_PREFIX/lib/libdevice.10.bc $CONDA_PREFIX/lib/nvvm/libdevice/
echo 'export XLA_FLAGS=--xla_gpu_cuda_data_dir=$CONDA_PREFIX/lib' >> $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
echo 'unset XLA_FLAGS' >> $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
conda install -y -c nvidia cuda-nvcc=11.8
# Re-activate the environment so the new variables take effect.
conda deactivate
conda activate cellsparse-api
```
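As an optional check, TensorFlow should now be able to see the GPU:

```python
# Optional check: should list at least one GPU device after re-activation.
import tensorflow as tf

print(tf.config.list_physical_devices("GPU"))
```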
```bash
uvicorn cellsparse_api.main:app
```
The command above will launch a server at http://localhost:8000.
```
INFO:     Started server process [21258]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
```
For more information, see the uvicorn documentation.
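If you need a different host or port, uvicorn can also be launched programmatically. The following is a minimal sketch; binding to `0.0.0.0` exposes the server to other machines, so use it with care.

```python
# A minimal sketch, equivalent to:
#   uvicorn cellsparse_api.main:app --host 0.0.0.0 --port 8000
import uvicorn

uvicorn.run("cellsparse_api.main:app", host="0.0.0.0", port=8000)
```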
```python
from typing import Optional

from pydantic import BaseModel


class CellsparseBody(BaseModel):
    modelname: str
    b64img: str
    b64lbl: Optional[str] = None
    train: bool = False
    eval: bool = False
    epochs: int = 1
    trainpatch: int = 224
    batchsize: int = 8
    steps: int = 200
    lr: float = 0.001
    minarea: float = 10.0
    simplify_tol: Optional[float] = None
```
| key | description |
| --- | --- |
| `modelname` | Name of the model to train or run inference with |
| `b64img` | Base64-encoded image data |
| `b64lbl` | Base64-encoded label data, required for training |
| `train` | Set to `true` to perform training |
| `eval` | Set to `true` to perform evaluation/inference |
| `epochs` | Number of training epochs |
| `trainpatch` | Training patch size (not supported by Cellpose) |
| `batchsize` | Training batch size |
| `steps` | Training steps per epoch |
| `lr` | Learning rate for training |
| `minarea` | Objects smaller than this value are removed in post-processing |
| `simplify_tol` | Tolerance for simplifying the output polygons; no simplification is applied if `None` |
The response body contains a list of GeoJSON Feature objects. Support for other formats is future work.
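For illustration, here is a minimal client sketch in Python. The endpoint path (`/stardist/`), the model name, and the use of a Base64-encoded image file for `b64img` are assumptions, not confirmed by this README; check the server code (`cellsparse_api/main.py`) for the exact routes and expected encoding.

```python
# A minimal client sketch (hypothetical endpoint and encoding; verify against
# the server code before use).
import base64

import requests

with open("image.png", "rb") as f:
    b64img = base64.b64encode(f.read()).decode("ascii")

body = {
    "modelname": "my-model",  # hypothetical model name
    "b64img": b64img,
    "train": False,
    "eval": True,  # run inference with an existing model
}
res = requests.post("http://localhost:8000/stardist/", json=body)
features = res.json()  # a list of GeoJSON Feature objects
print(f"{len(features)} objects returned")
```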
Please cite my paper on bioRxiv.
```bibtex
@article{Sugawara2023.06.13.544786,
  author = {Ko Sugawara},
  title = {Training deep learning models for cell image segmentation with sparse annotations},
  elocation-id = {2023.06.13.544786},
  year = {2023},
  doi = {10.1101/2023.06.13.544786},
  publisher = {Cold Spring Harbor Laboratory},
  abstract = {Deep learning is becoming more prominent in cell image analysis. However, collecting the annotated data required to train efficient deep-learning models remains a major obstacle. I demonstrate that functional performance can be achieved even with sparsely annotated data. Furthermore, I show that the selection of sparse cell annotations significantly impacts performance. I modified Cellpose and StarDist to enable training with sparsely annotated data and evaluated them in conjunction with ELEPHANT, a cell tracking algorithm that internally uses U-Net based cell segmentation. These results illustrate that sparse annotation is a generally effective strategy in deep learning-based cell image segmentation. Finally, I demonstrate that with the help of the Segment Anything Model (SAM), it is feasible to build an effective deep learning model of cell image segmentation from scratch just in a few minutes. Competing Interest Statement: KS is employed part-time by LPIXEL Inc.},
  URL = {https://www.biorxiv.org/content/early/2023/06/13/2023.06.13.544786},
  eprint = {https://www.biorxiv.org/content/early/2023/06/13/2023.06.13.544786.full.pdf},
  journal = {bioRxiv}
}
```