sophgo/LLM-TPU

Warning when exporting ONNX


/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py:1636: UserWarning: The exported ONNX model failed ONNX shape inference. The model will not be executable by the ONNX Runtime. If this is unintended and you believe there is a bug, please report an issue at https://github.com/pytorch/pytorch/issues. Error reported by strict ONNX shape inference: [ShapeInferenceError] (op_type: Add, node name: /Add): A typestr: T, has unsupported type: tensor(bool) (Triggered internally at ../torch/csrc/jit/serialization/export.cpp:1407.)
Is this warning important? If it affects the model's output, how should it be fixed?
It appeared while running the ChatGLM3 ONNX export script.

Oh, I found that the bmodel exported this way produces no reply at all. How should this warning be resolved?

Find this line in the sampling head: mask += self.keep_matrix. Because both operands are bool tensors, the in-place add is exported as an ONNX Add node with tensor(bool) inputs, which ONNX shape inference rejects. Change it to either:
mask[0, :self.min_tokens_to_keep] = True or mask = mask | self.keep_matrix
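
A minimal, self-contained sketch of the fix, assuming a sampling head roughly like the one in the export script (the class name SampleHead, the forward signature, and the vocabulary size are illustrative; keep_matrix and min_tokens_to_keep follow the line quoted above). The point is that the logical OR keeps the same masking behavior but exports as an Or node instead of an Add on tensor(bool):

```python
import torch

class SampleHead(torch.nn.Module):
    # Hypothetical stand-in for the sampling head; only the top-p masking
    # step related to the warning is shown.
    def __init__(self, top_k=50, min_tokens_to_keep=5):
        super().__init__()
        self.top_k = top_k
        self.min_tokens_to_keep = min_tokens_to_keep
        # bool matrix that always keeps the first few top-k tokens
        self.keep_matrix = torch.zeros((1, top_k), dtype=torch.bool)
        self.keep_matrix[0, :min_tokens_to_keep] = True

    def forward(self, logits, top_p):
        values, tokens = torch.topk(logits.float(), self.top_k)
        cumulative_probs = values.softmax(dim=1).cumsum(dim=1)
        mask = cumulative_probs < top_p
        # Original: mask += self.keep_matrix
        #   -> exported as an Add node on tensor(bool), which triggers the
        #      shape-inference warning.
        # Fixed: logical OR exports as an Or node and is well-typed.
        mask = mask | self.keep_matrix
        filtered = torch.where(mask, values, torch.full_like(values, -1000.0))
        probs = filtered.softmax(dim=1)
        return probs, tokens

# Export check: with the OR version, the UserWarning should no longer appear.
head = SampleHead()
logits = torch.randn(1, 65024)     # vocab size here is only illustrative
top_p = torch.tensor([[0.8]])
torch.onnx.export(head, (logits, top_p), "sample_head.onnx",
                  input_names=["m_logits", "top_p"],
                  output_names=["probs", "tokens"],
                  opset_version=15)
```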