MODNet: Trimap-Free Portrait Matting in Real Time

MODNet: Real-Time Trimap-Free Portrait Matting via Objective Decomposition (AAAI 2022)

MODNet is a model for real-time portrait matting that requires only an RGB image as input.

Online Application | Research Demo | AAAI 2022 Paper | Supplementary Video

Community | Code | PPM Benchmark | License | Acknowledgement | Citation | Contact


Online Application

A single model! Only 7MB! Processes 2K-resolution images at high speed on common PCs and mobile devices! Better than the research demos!
Please try online portrait image matting via this website or on my personal homepage!

Research Demo

All the models behind the following demos are trained on the datasets mentioned in our paper.

Portrait Image Matting

We provide an online Colab demo for portrait image matting.
It allows you to upload portrait images and predict/visualize/download the alpha mattes.
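
For reference, below is a minimal sketch of what this inference looks like in PyTorch, assuming the MODNet class from this repository and a downloaded pre-trained checkpoint (the module path src.models.modnet and the checkpoint filename are assumptions and may differ from your setup).

# Minimal image-matting inference sketch (module path and checkpoint name are assumptions).
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

from src.models.modnet import MODNet  # assumed module path

# Load the pre-trained model.
modnet = torch.nn.DataParallel(MODNet(backbone_pretrained=False))
modnet.load_state_dict(torch.load('pretrained/modnet_photographic_portrait_matting.ckpt',
                                  map_location='cpu'))
modnet.eval()

# Prepare the input: RGB image, normalized to [-1, 1], side lengths divisible by 32.
to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])
image = Image.open('portrait.jpg').convert('RGB')
im = to_tensor(image).unsqueeze(0)
h, w = im.shape[2:]
im = F.interpolate(im, size=(h // 32 * 32, w // 32 * 32), mode='area')

# Predict the alpha matte and resize it back to the original resolution.
with torch.no_grad():
    _, _, matte = modnet(im, True)
matte = F.interpolate(matte, size=(h, w), mode='area')
Image.fromarray((matte[0, 0].numpy() * 255).astype('uint8')).save('matte.png')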

Portrait Video Matting

We provide two real-time portrait video matting demos based on a WebCam. When using the demos, you can move the WebCam around at will. If you have an Ubuntu system, we recommend trying the offline demo for a higher fps. Otherwise, you can use the online Colab demo.
We also provide an offline demo that allows you to process custom videos.
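
As a rough illustration, the following OpenCV loop shows one way such a WebCam demo can be structured (the module path, checkpoint name, and frame size are assumptions; the actual demo scripts may be organized differently).

# Minimal WebCam matting loop sketch (module path and checkpoint name are assumptions).
import cv2
import numpy as np
import torch

from src.models.modnet import MODNet  # assumed module path

modnet = torch.nn.DataParallel(MODNet(backbone_pretrained=False))
modnet.load_state_dict(torch.load('pretrained/modnet_webcam_portrait_matting.ckpt',
                                  map_location='cpu'))
modnet.eval()

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # BGR -> RGB, resize to a multiple of 32, normalize to [-1, 1].
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    rgb = cv2.resize(rgb, (672, 512))
    im = torch.from_numpy(rgb).permute(2, 0, 1).float().unsqueeze(0)
    im = (im / 255.0 - 0.5) / 0.5

    with torch.no_grad():
        _, _, matte = modnet(im, True)

    # Composite the frame over a white background using the predicted matte.
    alpha = matte[0, 0].numpy()[..., None]
    comp = rgb * alpha + 255.0 * (1.0 - alpha)
    cv2.imshow('MODNet WebCam Matting', cv2.cvtColor(comp.astype(np.uint8), cv2.COLOR_RGB2BGR))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()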

Community

We share some cool applications/extensions of MODNet built by the community.

  • WebGUI for Portrait Image Matting
    You can try this WebGUI (hosted on Gradio) for portrait image matting from your browser without code!

  • Colab Demo of Bokeh (Blur Background)
    You can try this Colab demo (built by @eyaler) to blur the background based on MODNet!

  • ONNX Version of MODNet
    You can convert the pre-trained MODNet to an ONNX model by using this code (provided by @manthan3C273); a rough export sketch is shown after this list. You can also try this Colab demo for MODNet image matting (ONNX version).

  • TorchScript Version of MODNet
    You can convert the pre-trained MODNet to a TorchScript model by using this code (provided by @yarkable).

  • TensorRT Version of MODNet
    You can access this GitHub repository to try the TensorRT version of MODNet (provided by @jkjung-avt).

  • Docker Container for MODNet
    You can access this GitHub repository for a containerized version of MODNet with the Docker environment (provided by @nahidalam).
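
As referenced in the ONNX item above, here is a hedged sketch of how the pre-trained model might be exported with torch.onnx.export (the module path, checkpoint name, forward signature, and input size are assumptions; the community conversion code may be structured differently).

# Hedged ONNX export sketch; names and the forward signature are assumptions.
import torch
from src.models.modnet import MODNet  # assumed module path


class MatteOnly(torch.nn.Module):
    """Wrapper that exposes only the predicted matte as the ONNX output."""

    def __init__(self, modnet):
        super().__init__()
        self.modnet = modnet

    def forward(self, image):
        # The repo's forward is assumed to return (semantic, detail, matte).
        return self.modnet(image, True)[-1]


modnet = torch.nn.DataParallel(MODNet(backbone_pretrained=False))
modnet.load_state_dict(torch.load('pretrained/modnet_photographic_portrait_matting.ckpt',
                                  map_location='cpu'))
modnet.eval()

dummy = torch.randn(1, 3, 512, 512)  # spatial size must be a multiple of 32
torch.onnx.export(
    MatteOnly(modnet.module),
    dummy,
    'modnet.onnx',
    input_names=['input'],
    output_names=['matte'],
    dynamic_axes={'input': {2: 'height', 3: 'width'},
                  'matte': {2: 'height', 3: 'width'}},
    opset_version=11,
)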

Other MODNet resources shared by the community are also available.

Code

We provide the training code for MODNet, including:

  • Supervised Training: Train MODNet on a labeled matting dataset
  • SOC Adaptation: Adapt a trained MODNet to an unlabeled dataset

Examples of how to use these functions are provided in the code comments.
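
As a rough outline, a supervised training loop might look like the sketch below (the function name supervised_training_iter, its signature, and the dataloader are assumptions for illustration; the code comments in the repository are the authoritative reference). SOC adaptation follows the same pattern with the corresponding adaptation function and an unlabeled dataloader.

# Rough supervised training outline; function names and signatures are assumptions.
import torch
from src.models.modnet import MODNet     # assumed module path
from src.trainer import supervised_training_iter  # assumed module path

modnet = torch.nn.DataParallel(MODNet()).cuda()
optimizer = torch.optim.SGD(modnet.parameters(), lr=0.01, momentum=0.9)
lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)

# `dataloader` is a hypothetical labeled matting dataloader that yields
# (image, trimap, gt_matte) batches normalized as the repository expects.
for epoch in range(40):
    for image, trimap, gt_matte in dataloader:
        semantic_loss, detail_loss, matte_loss = supervised_training_iter(
            modnet, optimizer, image.cuda(), trimap.cuda(), gt_matte.cuda())
    lr_scheduler.step()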

PPM Benchmark

The PPM benchmark is released in a separate repository PPM.

License

The code, models, and demos in this repository (excluding the GIF files in the doc/gif folder) are released under the Apache License 2.0.

Acknowledgement

Citation

If this work helps your research, please consider citing:

@InProceedings{MODNet,
  author = {Zhanghan Ke and Jiayu Sun and Kaican Li and Qiong Yan and Rynson W.H. Lau},
  title = {MODNet: Real-Time Trimap-Free Portrait Matting via Objective Decomposition},
  booktitle = {AAAI},
  year = {2022},
}

Contact

This repository is maintained by Zhanghan Ke (@ZHKKKe).
For questions, please contact kezhanghan@outlook.com.