Tools for shedding light on your black-box deep models:
- cohort analysis based on a target error signal
- error analysis based on sample indicators and single-data exploration
- embedding analysis for classification models and for the encoders of segmentation/object-detection models
- visualization of wrong cases
- exploring the model's behaviour by applying different explainers
- comparing two models trained for the same task
Prepare your model as a TensorFlow SavedModel and the model's meta-data as a pickle file, then run the following command:
kym hhd --model-meta "/path/to/model-meta.pkl" --saved-model "/path/to/savedmodel" --data-registry "/path/to/data-registry"
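For reference, below is a minimal sketch of exporting a trained tf.keras model to the SavedModel format expected by --saved-model; the architecture and paths are placeholders, not part of this project.

import tensorflow as tf

# placeholder standing in for your actual trained tf.keras model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(512, 512, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# export as a TensorFlow SavedModel (newer Keras versions also provide model.export)
tf.saved_model.save(model, "/path/to/savedmodel")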
- a tool for visualizing the error distribution in different cohorts based on a specified error column
- a tool for detecting and visualizing edge data-points and mis-predictions based on specified error columns
- a generator that visualizes the wrong cases (i.e. FPs and FNs) based on a specified error column
- a 3D embedding visualizer that visualizes the encoder's embeddings and colors the samples based on a specified column (see the sketch after this list)
- a generator tool for visually comparing the main model to other models on the main model's errors
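To illustrate what the embedding visualizer does conceptually, here is a hedged sketch (plain numpy/pandas/scikit-learn, not this tool's API; all names and values are made up) that projects encoder embeddings to 3D and attaches a column to color the samples by:

import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

# made-up embeddings standing in for the encoder's output (100 samples, 128-d)
embeddings = np.random.rand(100, 128)
meta = pd.DataFrame({"DataSource": np.random.choice(["Emam Behshahr", "jahanbakhshi-p3"], size=100)})

# project to 3D so each sample can be plotted and colored by a meta-data column
meta[["x", "y", "z"]] = PCA(n_components=3).fit_transform(embeddings)
print(meta.head())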
- the models are TensorFlow SavedModels (tensorflow-saved_model format)
- the meta-data is a pandas.DataFrame stored as a .pkl file, in which each row represents a single data-point; to be able to visualize the ground-truth and prediction of the model, the following columns must be present in the data-frame: DataSource, SeriesInstanceUID, SliceIndex, LabelingJob, MaskName (see the sketch after the directory tree below)
- the data-registry folder on your local system must contain data in this structure:
├── data_registry
│   ├── datasources
│   │   ├── Emam Behshahr
│   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1618913694.791745
│   │   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1618913700.457684.dcm
│   │   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1618913702.409793.dcm
│   │   │   │   ...
│   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1625898125.674447
│   │   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1625898125.675577.dcm
│   │   │   │   ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1625898127.39131.dcm
│   │   │   │   ...
│   │   │   └── 1.2.392.200036.9116.2.6.1.3268.2054350876.1660862509.547649
│   │   │       ├── 1.2.392.200036.9116.2.6.1.3268.2054350876.1660862509.548634.dcm
│   │   │       └── 1.2.392.200036.9116.2.6.1.3268.2054350876.1660862544.321533.dcm
│   │   └── jahanbakhshi-p3
│   │       ├── 1.2.392.200036.9116.2.6.1.48.1211476691.1462896892.227967
│   │       │   ├── IM00001.dcm
│   │       │   └── IM00016.dcm
│   │       ├── 1.2.392.200036.9116.2.6.1.48.1211476691.1463414583.241581
│   │       │   ├── IM00001.dcm
│   │       │   └── IM00016.dcm
│   │       └── 1.2.392.200036.9116.2.6.1.48.1211476691.1463586617.223557
│   │           ├── IM00001.dcm
│   │           └── IM00016.dcm
│   └── tasks
│       └── HHD
│           ├── masks-HHD-CTB-P6-Reviewer1
│           └── masks-HHD-CTB-P7-Reviewer1
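A minimal sketch of building the meta-data pickle passed to --model-meta, assuming only the required columns listed above; the values here (and any extra error/prediction columns your analysis refers to) are made-up placeholders:

import pandas as pd

# one row per data-point; the column names are the required ones, the values are placeholders
model_meta = pd.DataFrame([
    {
        "DataSource": "Emam Behshahr",
        "SeriesInstanceUID": "1.2.392.200036.9116.2.6.1.3268.2054350876.1618913694.791745",
        "SliceIndex": 0,
        "LabelingJob": "HHD",
        "MaskName": "masks-HHD-CTB-P6-Reviewer1",
    },
])

# store it as the pickle file passed to --model-meta
model_meta.to_pickle("/path/to/model-meta.pkl")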
Main functionalities:
- error distribution in different cohorts based on a specified error column
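To illustrate the idea (plain pandas, not this tool's API; the column names and values below are hypothetical), the per-cohort error distribution boils down to grouping a chosen error column by a cohort column:

import pandas as pd

# hypothetical per-sample table with a cohort column and an error column
df = pd.DataFrame({
    "DataSource": ["Emam Behshahr", "Emam Behshahr", "jahanbakhshi-p3", "jahanbakhshi-p3"],
    "dice_error": [0.10, 0.35, 0.05, 0.60],
})

# error distribution in different cohorts based on the specified error column
print(df.groupby("DataSource")["dice_error"].describe())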
After cloning the repo and changing the working directory to the repo's root:
- install poetry (see the official Poetry installation docs if it is not already installed)
- update poetry
poetry self update
- add aimedic's PyPI server as a private source:
poetry source add internal https://pypi.aimedic.tech --local
poetry config repositories.internal https://pypi.aimedic.tech
export PYPI_USERNAME=<username>
export PYPI_PASSWORD=<password>
poetry config http-basic.internal $PYPI_USERNAME $PYPI_PASSWORD --local
- disable the experimental new installer (this solves the hash problem when installing packages from a private repository):
poetry config experimental.new-installer false --local
- set your virtual environment folder to be created in the repository's root:
poetry config virtualenvs.in-project true --local
- install requirements:
poetry install
- develop
- refactor
- write tests
- push and create a merge request
Note
The project is installed in development (editable) mode in the Poetry environment.
Note
Don't publish to the private PyPI server manually; this is done automatically at the end of the CI/CD pipeline.
Note
If you are developing inside a Docker container, you don't need a virtual environment, so just install the dependencies in the default Python environment:
RUN poetry config virtualenvs.create false
RUN poetry install --no-root --no-dev --no-interaction --no-ansi
- add (install) dependency packages through poetry (e.g. scikit-learn):
poetry add scikit-learn
- add (install) a dependency from a private PyPI server:
poetry add --source internal aimedic-utils
- add (install) a dependency package as a development dependency (e.g. pytest):
poetry add pytest --group dev
Launch the Jupyter notebook inside the project's environment:
poetry run jupyter notebook
and select Python 3 as the kernel.
Note
If you are using a globally installed Jupyter, create a kernel before launching Jupyter:
poetry run ipython kernel install --user --name=<KERNEL_NAME>
jupyter notebook
and then select the created kernel in “Kernel” -> “Change kernel”.