InterDigitalInc/CompressAI

Entropy coding

zjnlxk opened this issue · 2 comments

First of all, thank you for your work.
I trained a model (cheng2020-attn) on my own dataset and then ran some tests with your evaluation scripts:
python3 train.py -d $DATASET --epochs 100 -lr 1e-4 --batch-size 16 --cuda --save

python -m compressai.utils.update_model --architecture cheng2020-attn weight/cheng2020-attn/checkpoint_best_loss.pth.tar

python -m compressai.utils.eval_model checkpoint crop_img -a cheng2020-attn -p weight/cheng2020-attn/checkpoint_best_loss.pth.tar

Next, I want to perform entropy coding and save the results of compression and decompression.

By default CompressAI uses a range Asymmetric Numeral Systems (ANS) entropy coder. You can use compressai.available_entropy_coders() to get a list of the implemented entropy coders and change the default entropy coder via compressai.set_entropy_coder().

The code in your documentation is as follows:
x = torch.rand(1, 3, 64, 64)
y = net.encode(x)
strings = net.entropy_bottleneck.compress(y)

shape = y.size()[2:]
y_hat = net.entropy_bottleneck.decompress(strings, shape)
x_hat = net.decode(y_hat)

I want to know: do I need to define net myself? How do I implement entropy coding?

Hi, I'm not sure I understand your question.
Is your example working? Do you want to implement another range coder?

In fact, I want to visualize the encoding and decoding results for an image and save the resulting bitstream to a file. Also, after training, I use the default codec and the test time is 0.4s/1s, which does not meet my real-time requirement, so I want to know where the codec is configured.