Usage of Sine function from torch module
Hi, I appreciate your work on ResNet18 and the quantization approach. I've also been studying your code to see whether my own project can adapt a few snippets from your notebook, so that my model can exploit quantization after training.
One issue I've encountered: when I build a model that uses the torch.sin operation as the activation instead of a well-known activation function such as ReLU, dynamic quantization runs without errors, but other techniques such as static or post-training quantization raise an exception, because the quantized backend (QuantizedCPU) cannot carry out the sine operation on quantized tensors.
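For reference, here is a minimal sketch of the usual eager-mode workaround (my own code, not taken from your notebook; the model and all names are illustrative): wrap torch.sin between a DeQuantStub and a fresh QuantStub, so the sine executes in float while the Linear layers around it stay quantized.

```python
import torch
import torch.nn as nn

class SineNet(nn.Module):
    """Toy model using torch.sin as the activation (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.quant = torch.ao.quantization.QuantStub()
        self.fc1 = nn.Linear(8, 16)
        self.dequant = torch.ao.quantization.DeQuantStub()
        self.requant = torch.ao.quantization.QuantStub()
        self.fc2 = nn.Linear(16, 4)
        self.dequant_out = torch.ao.quantization.DeQuantStub()

    def forward(self, x):
        x = self.quant(x)        # enter the int8 region
        x = self.fc1(x)
        x = self.dequant(x)      # leave int8: QuantizedCPU has no sin kernel
        x = torch.sin(x)         # sine runs in float
        x = self.requant(x)      # re-enter the int8 region
        x = self.fc2(x)
        return self.dequant_out(x)

model = SineNet().eval()
model.qconfig = torch.ao.quantization.get_default_qconfig("fbgemm")
prepared = torch.ao.quantization.prepare(model)
prepared(torch.randn(2, 8))  # one calibration pass for the observers
quantized = torch.ao.quantization.convert(prepared)
out = quantized(torch.randn(2, 8))  # no QuantizedCPU error for aten::sin
```

Without the dequant/requant pair around torch.sin, convert-then-inference fails with a "Could not run ... with arguments from the 'QuantizedCPU' backend" error, which matches what I'm seeing.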
If you are interested in helping me, I can describe the issue more precisely. For now, here is my GitHub project so you can look at the problem I'm facing:
In particular, the issue shows up when I run training from the following notebook:
You can reproduce it by copying the notebook into Colab, using the following settings, and looking at the final cell:
- EVAL_POST_TRAIN = False
- TRAIN_MODEL = True
- TRAIN_MODEL_NO_QUANT = False
- TRAIN_MODEL_DYNAMIC_QUANT = False
- TRAIN_MODEL_POST_TRAIN_QUANT = True
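For contrast with the post-training-quant setting above, here is a minimal standalone sketch (again my own toy model, not your notebook) of why the dynamic path succeeds: with dynamic quantization only the Linear weights are quantized, while activations, including the output of torch.sin, stay in float, so no quantized sine kernel is ever needed.

```python
import torch
import torch.nn as nn

class SineNet(nn.Module):
    """Toy model with a torch.sin activation (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 16)
        self.fc2 = nn.Linear(16, 4)

    def forward(self, x):
        # Activations are ordinary float tensors under dynamic quantization,
        # so torch.sin dispatches to the regular CPU kernel.
        return self.fc2(torch.sin(self.fc1(x)))

# Quantize only the Linear modules' weights to int8; activations stay float.
dyn = torch.ao.quantization.quantize_dynamic(
    SineNet().eval(), {nn.Linear}, dtype=torch.qint8
)
out = dyn(torch.randn(2, 8))  # runs without any QuantizedCPU error
```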