facebookresearch/d2go

Pretrained weights for QAT version of Maskrcnn

rsadiq opened this issue · 1 comments

Hi,
While performing quantization-aware training (QAT) for Mask R-CNN, which model weights should I use?
For Faster R-CNN the link is there, but in the model zoo I couldn't find any link to pretrained weights for qat_maskrcnn*****.
If I use the QAT config with the normal weights, the predictions are simply random; if I don't use QAT, it works great.

@rsadiq We usually use the fully trained weights from the floating-point model to initialize QAT. For example, a real-world config for QAT faster_rcnn_fbnetv3a_C4 would look like:

_BASE_: "faster_rcnn_fbnetv3a_C4.yaml"
MODEL:
  WEIGHTS: <fully trained weight that can be found from model zoo>
SOLVER:
  ...
QUANTIZATION:
  ...

The faster_rcnn_fbnetv3a_C4.yaml is just a dummy example; as you can see, its SOLVER.MAX_ITER is only 50. In the real world, it needs to train for thousands of iterations.
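For reference, the same initialization idea can be sketched in plain PyTorch eager-mode QAT (the toy model below is illustrative, not the actual d2go architecture): load the fully trained floating-point weights first, then insert fake-quantization observers and fine-tune.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import get_default_qat_qconfig, prepare_qat

# Toy float model standing in for the detection model (illustrative only).
float_model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Conv2d(8, 4, 3))

# Step 1: start QAT from fully trained floating-point weights.
# In the d2go config above, this is what MODEL.WEIGHTS points at.
qat_model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Conv2d(8, 4, 3))
qat_model.load_state_dict(float_model.state_dict())

# Step 2: attach a QAT qconfig and swap in fake-quant modules.
qat_model.train()
qat_model.qconfig = get_default_qat_qconfig("fbgemm")
prepare_qat(qat_model, inplace=True)

# Step 3: fine-tune with fake quantization in the loop
# (thousands of iterations in practice; one forward pass shown here).
out = qat_model(torch.randn(1, 3, 16, 16))
```

Starting QAT from random weights (instead of loading the float checkpoint in step 1) is exactly what produces the random predictions described above: the observers calibrate to untrained activations and the short fine-tune cannot recover.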