yucornetto/MGMatting

for dataset & matting result

guoyt35 opened this issue · 3 comments

Hi, I used com = alpha * img + (1 - alpha) * [10, 255, 15] to composite the 'real-world portrait dataset' matting foreground onto a green background, but I found that the boundary area is not very soft.
[attached composites: comp, comp1, comp2]
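For reference, the blend described above can be sketched as follows. This is a minimal NumPy sketch (not code from the repo), assuming an 8-bit RGB foreground and an alpha map in [0, 1]; note that `fg` should be an estimated foreground rather than the raw input image, which is exactly the issue discussed below.

```python
import numpy as np

def composite_on_green(fg, alpha, bg_color=(10, 255, 15)):
    """Composite a foreground onto a flat green background:
    comp = alpha * fg + (1 - alpha) * bg_color."""
    a = alpha.astype(np.float64)[..., None]    # (H, W, 1), values in [0, 1]
    bg = np.array(bg_color, dtype=np.float64)  # broadcasts to (H, W, 3)
    comp = a * fg.astype(np.float64) + (1.0 - a) * bg
    return comp.astype(np.uint8)
```

If `fg` is the raw image, every pixel with fractional alpha still contains background color, so the composite near the boundary mixes the old background in as well.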
I have also used your pre-trained model on my own portrait dataset. The boundary area performs well, but it misses a lot of the inner foreground. My mask was obtained with the U^2-Net demo.
[attached results: 0003, 0005, 0007]

Hi,

  1. As mentioned in the paper and this GitHub repo, using the original image as the foreground usually does not lead to satisfying results. You could train a foreground-prediction model with this repo using random alpha blending, which should not be hard to implement, or use a traditional method (e.g., closed-form matting) to estimate the foreground color. I am busy at the moment but will work on releasing the foreground code/model when I have time.

  2. The results look weird to me. Could you please confirm that the pretrained weight you are using is MGMatting-RWP-100k instead of MGMatting-DIM-100k?
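To illustrate why foreground estimation (point 1) matters, here is a toy NumPy sketch, not the closed-form solver and not code from this repo: it crudely extends confident foreground colors into the boundary region by repeated neighbor averaging, so fractional-alpha pixels composite without background contamination. The threshold and iteration count are arbitrary choices for illustration.

```python
import numpy as np

def extend_foreground(image, alpha, fg_thresh=0.9, iters=50):
    """Toy foreground-color propagation: pixels with alpha >= fg_thresh
    keep their color; other pixels iteratively take the mean color of
    already-known 4-neighbors (np.roll wraps at image borders, which is
    acceptable for this sketch)."""
    fg = image.astype(np.float64).copy()
    known = alpha >= fg_thresh
    for _ in range(iters):
        if known.all():
            break
        acc = np.zeros_like(fg)                       # summed neighbor colors
        cnt = np.zeros(alpha.shape, dtype=np.float64) # number of known neighbors
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted_known = np.roll(known, (dy, dx), axis=(0, 1))
            shifted_fg = np.roll(fg, (dy, dx), axis=(0, 1))
            acc += shifted_fg * shifted_known[..., None]
            cnt += shifted_known
        new_known = (~known) & (cnt > 0)
        fg[new_known] = acc[new_known] / cnt[new_known][..., None]
        known |= new_known
    return fg
```

Compositing with this extended foreground, rather than the raw image, keeps boundary pixels free of the original background color; a proper method (closed-form foreground estimation, or a learned foreground head) would do this far more accurately.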

I tested again with the MGMatting-RWP-100k pretrained weight, and it performed a lot better in the inner area. Thanks a lot!!!

I tried MGMatting-RWP-100k and still got weird results like he did :(