pytorch/ort

How should we deal with unsupported operator warnings?

richarddwang opened this issue · 2 comments

How can I find where these warnings come from and how to resolve them? Or can I safely ignore them?

/home/yisiang/miniconda3/envs/dl/lib/python3.9/site-packages/onnxruntime/training/ortmodule/_logger.py:51: UserWarning: There were one or more warnings or errors raised while exporting the PyTorch model. Please enable INFO level logging to view all warnings and errors.
  warnings.warn("There were one or more warnings or errors raised while exporting the PyTorch "
Warning: Unsupported operator ATenOp. No schema registered for this operator.
Warning: Unsupported operator ATenOp. No schema registered for this operator.
...
Warning: Checker does not support models with experimental ops: Scale
Warning: Checker does not support models with experimental ops: Scale
....
2021-07-20 22:00:01.572480474 [W:onnxruntime:, constant_folding.cc:134 ApplyImpl] Could not find a CPU kernel and hence can't constant fold Sub node 'Pow_10_Grad/Sub_1'
2021-07-20 22:00:01.580978404 [W:onnxruntime:, constant_folding.cc:134 ApplyImpl] Could not find a CPU kernel and hence can't constant fold Sub node 'Pow_10_Grad/Sub_1'
....
  • Training a transformer encoder
  • torch 1.9.0
  • torch-ort 1.8.1
  • onnx 1.9.0
  • onnxruntime-training 1.8.1+torch190.cu111

--- 2021.07.24 ---
I found that if the forward pass of the embedding is commented out, the Warning: Unsupported operator ATenOp.... messages disappear. But UserWarning: There were one or more warnings... still appears. I wonder how to turn on INFO level logging.

Hi @richarddwang, all the warnings above are non-critical and actually misleading. Are you able to run the training?

We will clean up these warning messages.

That will be nice, thank you.

The training ran smoothly. It seems those warnings don't affect anything.