The architecture is inspired by the paper *Real-time deep hair matting on mobile devices*.
python 3.6
tensorflow-gpu==1.13.1
opencv-python==4.1.0.25
Keras==2.2.4
numpy==1.16.4
scikit-image==0.15.0
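If the pinned packages above are listed in a requirements.txt file (an assumed file name), they can be installed in one step:
# Install the pinned dependencies
pip install -r requirements.txt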
- CelebAMask-HQ (contains 29,300 images with hair segmentation masks)
- Figaro-1k
- LFW
├── my-data
│   ├── images
│   │   ├── 1.jpg
│   │   ├── 2.jpg
│   │   ├── 3.jpg
│   │   └── ...
│   ├── masks
│   │   ├── 1.jpg
│   │   ├── 2.jpg
│   │   ├── 3.jpg
│   │   └── ...
I have already downloaded the data and done the pre-processing. You can find the original images in data/image and the corresponding masks in data/label.
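As a sanity check on the layout above, here is a short sketch that pairs each image with its mask (the directory names follow the tree above; sizes and file extension are assumptions, and this is not the repository's actual loader):

```python
import glob
import os

import cv2

# Assumed paths matching the layout above; switch to data/image and data/label
# if you use the pre-processed data mentioned here.
IMAGE_DIR = "my-data/images"
MASK_DIR = "my-data/masks"

pairs = []
for image_path in sorted(glob.glob(os.path.join(IMAGE_DIR, "*.jpg"))):
    mask_path = os.path.join(MASK_DIR, os.path.basename(image_path))
    if not os.path.exists(mask_path):
        continue  # skip images without a matching mask
    image = cv2.imread(image_path)                      # BGR image
    mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)  # single-channel mask
    pairs.append((image, mask))

print("Loaded", len(pairs), "image/mask pairs")
```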
# You can configure the training in train.py
python train.py
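The actual settings live in train.py; the snippet below is only a minimal, self-contained sketch of how such a training run is wired up in Keras. The tiny network, input size, loss, optimizer, and random arrays here are assumptions for illustration, not the repository's configuration:

```python
import numpy as np
from keras.layers import Conv2D, Input, MaxPooling2D, UpSampling2D
from keras.models import Model
from keras.optimizers import Adam

# A deliberately tiny encoder-decoder standing in for the real network,
# just to show how training is set up.
inputs = Input(shape=(224, 224, 3))
x = Conv2D(16, 3, activation="relu", padding="same")(inputs)
x = MaxPooling2D()(x)
x = Conv2D(32, 3, activation="relu", padding="same")(x)
x = UpSampling2D()(x)
outputs = Conv2D(1, 1, activation="sigmoid")(x)  # per-pixel hair probability
model = Model(inputs, outputs)

model.compile(optimizer=Adam(lr=1e-4),
              loss="binary_crossentropy",  # hair vs. non-hair per pixel
              metrics=["accuracy"])

# Replace these random arrays with real images and masks loaded from disk.
images = np.random.rand(4, 224, 224, 3).astype("float32")
masks = np.random.randint(0, 2, (4, 224, 224, 1)).astype("float32")
model.fit(images, masks, batch_size=2, epochs=1)
```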
python evaluate.py
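evaluate.py computes the real metrics; for reference, hair masks are commonly scored with IoU and Dice, as in this sketch (the 0.5 threshold and the random example arrays are assumptions):

```python
import numpy as np

def iou_and_dice(pred, truth, threshold=0.5):
    """IoU and Dice between a predicted probability mask and a binary ground-truth mask."""
    pred_bin = pred > threshold
    truth_bin = truth > threshold
    intersection = np.logical_and(pred_bin, truth_bin).sum()
    union = np.logical_or(pred_bin, truth_bin).sum()
    total = pred_bin.sum() + truth_bin.sum()
    iou = intersection / union if union else 1.0
    dice = 2 * intersection / total if total else 1.0
    return iou, dice

# Example with random masks; replace with the model output and the label image.
pred = np.random.rand(224, 224)
truth = (np.random.rand(224, 224) > 0.5).astype("float32")
print(iou_and_dice(pred, truth))
```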
# Run the demo
python demo.py
You will see the predicted results for the test images in test/data.
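A rough sketch of what such a demo does: predict a per-pixel hair probability map, threshold it, and overlay it on the input image. The model path, file names, and 224x224 input size below are assumptions, not demo.py's actual values:

```python
import cv2
import numpy as np
from keras.models import load_model

# Assumed file names; use whatever the repository actually provides.
model = load_model("checkpoints/hairnet.h5")
image = cv2.imread("test/data/1.jpg")

h, w = image.shape[:2]
inp = cv2.resize(image, (224, 224)).astype("float32") / 255.0
prob = model.predict(inp[np.newaxis])[0, :, :, 0]       # per-pixel hair probability
mask = (cv2.resize(prob, (w, h)) > 0.5).astype(np.uint8)

# Tint the hair region and save the overlay next to the test data.
overlay = image.copy()
overlay[mask == 1] = (0, 0, 255)
blended = cv2.addWeighted(image, 0.6, overlay, 0.4, 0)
cv2.imwrite("test/data/1_pred.jpg", blended)
```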
- Convert model to TensorFlow Lite
# Convert the trained model to TensorFlow Lite for mobile
python convert_to_tflite.py
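With tensorflow 1.13, the usual conversion API is tf.lite.TFLiteConverter.from_keras_model_file; the sketch below shows the idea (the file paths are assumptions, and convert_to_tflite.py may differ in detail):

```python
import tensorflow as tf

# Assumed paths; adjust to the repository's checkpoint and output names.
KERAS_MODEL = "checkpoints/hairnet.h5"
TFLITE_MODEL = "checkpoints/hairnet.tflite"

converter = tf.lite.TFLiteConverter.from_keras_model_file(KERAS_MODEL)
tflite_model = converter.convert()

with open(TFLITE_MODEL, "wb") as f:
    f.write(tflite_model)

print("Wrote {} ({} bytes)".format(TFLITE_MODEL, len(tflite_model)))
```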
- Show TFLite model shapes
# Print the input and output shapes of the TFLite model
python shape_input_output_tflite.py
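The standard way to inspect a .tflite file is through tf.lite.Interpreter, roughly as sketched here (the model path is an assumption):

```python
import tensorflow as tf

# Assumed path to the converted model.
interpreter = tf.lite.Interpreter(model_path="checkpoints/hairnet.tflite")
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print("input :", detail["name"], detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"], detail["dtype"])
```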
Keras is a minimalist, highly modular neural networks library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research.
Use Keras if you need a deep learning library that:
- allows for easy and fast prototyping (through total modularity, minimalism, and extensibility)
- supports both convolutional networks and recurrent networks, as well as combinations of the two
- supports arbitrary connectivity schemes (including multi-input and multi-output training)
- runs seamlessly on CPU and GPU

Read the documentation at Keras.io.
Keras is compatible with Python 3.6.
- Implement model using Keras
- Convert model to TensorFlow Lite
- Implement model on Android (in progress)
Copyright (c) 2019 Thang Tran Van
Licensed under the MIT License. You may not use this file except in compliance with the License.