
Implementation of "Learning Multiscale Convolutional Dictionaries for Image Reconstruction", IEEE Transactions on Computational Imaging, 2022.


Learning Multiscale Convolutional Dictionaries for Image Reconstruction

Authors: Tianlin Liu, Anadi Chaman, David Belius, and Ivan Dokmanić

We present a novel multiscale dictionary model that is both simple and mathematically tractable. When trained using a traditional sparse-coding approach, our model performs comparably to the highly regarded U-Net. The figure below illustrates the U-Net (panel a), our multiscale dictionary model (panel b), and the forward pass of our model, computed via unrolled task-driven sparse coding (panel c).
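As background for panel (c): unrolled sparse coding alternates a gradient step on the data-fidelity term with soft-thresholding (ISTA). Below is a minimal single-scale, fixed-dictionary NumPy sketch of this idea; the actual MUSC model uses a learned multiscale convolutional dictionary trained end-to-end, so everything here (filter, signal, hyperparameters) is purely illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrinks values toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unrolled_ista(y, d, lam=0.01, eta=0.5, n_iters=200):
    """Unrolled ISTA for y ~ conv(d, z) with an l1 penalty on the code z.

    d is a fixed 1-D filter here; in MUSC the dictionary is multiscale
    and learned, and each unrolled iteration becomes a network layer.
    """
    z = np.zeros_like(y)
    for _ in range(n_iters):
        residual = y - np.convolve(z, d, mode="same")
        # Gradient step uses correlation, i.e. convolution with the flipped filter.
        grad_step = z + eta * np.convolve(residual, d[::-1], mode="same")
        z = soft_threshold(grad_step, eta * lam)
    return z

# Toy example: a sparse code convolved with a small filter, then recovered.
d = np.array([0.25, 0.5, 0.25])
z_true = np.zeros(64)
z_true[[10, 30, 50]] = [1.0, -2.0, 1.5]
y = np.convolve(z_true, d, mode="same")
z_hat = unrolled_ista(y, d)
print(np.linalg.norm(np.convolve(z_hat, d, mode="same") - y))
```

The step size `eta` must stay below the inverse of the dictionary operator's squared spectral norm for the iterations to converge; for the low-pass filter above that norm is 1, so `eta = 0.5` is safe.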


Google Colab

If you're interested in experimenting with MUSC, we've prepared two Google Colab notebooks that are very easy to use. There's no need to install anything, and you can run them directly in your browser:

  1. Evaluate and visualize a trained MUSC model on the LoDoPaB-CT dataset (Leuschner et al., 2021)
    Open In Colab

  2. Train a small MUSC model from scratch on the ellipses dataset (Jin et al., 2017)
    Open In Colab


Installation

If you want to train the model on your own machine, you can use the provided conda environment.

Create the environment from the environment.yml file

conda env create -f environment.yml
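Then activate the environment before running any scripts. The environment name is defined inside environment.yml; `musc` below is an assumption, so check the file's `name:` field:

```shell
# Activate the newly created environment.
# NOTE: "musc" is a guess; use the name from environment.yml.
conda activate musc
```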

Training

To train the MUSC model, use the Python scripts in the train_src folder.


Evaluation & checkpoints

The checkpoints for a MUSC model trained on the LoDoPaB-CT dataset are available for download on Google Drive.

The notebooks/evaluate_musc_ct.ipynb notebook can be used to evaluate the model on the test set.
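Reconstruction quality on CT benchmarks such as LoDoPaB-CT is commonly reported as PSNR. Below is a minimal NumPy helper for that standard metric — a sketch, not necessarily the exact implementation used in the notebook:

```python
import numpy as np

def psnr(reference, reconstruction, data_range=None):
    """Peak signal-to-noise ratio in dB between two images."""
    reference = np.asarray(reference, dtype=np.float64)
    reconstruction = np.asarray(reconstruction, dtype=np.float64)
    if data_range is None:
        # Default to the dynamic range of the reference image.
        data_range = reference.max() - reference.min()
    mse = np.mean((reference - reconstruction) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

# Example: a reconstruction corrupted by mild noise scores a finite PSNR.
img = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)
noisy = img + 0.01 * np.random.default_rng(0).standard_normal(img.shape)
print(round(psnr(img, noisy), 1))
```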


Citation

@ARTICLE{Liu2022learning,
  author={Liu, Tianlin and Chaman, Anadi and Belius, David and Dokmanić, Ivan},
  journal={IEEE Transactions on Computational Imaging},
  title={Learning Multiscale Convolutional Dictionaries for Image Reconstruction},
  year={2022},
  volume={8},
  pages={425--437}}