kwea123/pytorch-cppcuda-tutorial

Undefined identifier "size_t" (未定义标识符 "size_t"), undefined symbol: __cxa_call_terminate

Opened this issue · 2 comments

I followed the method in the tutorial to set up the earlier base files, but when I write the Python extension module in the cpp file:

#include <torch/extension.h>  // provides size_t and TORCH_EXTENSION_NAME

torch::Tensor demo(torch::Tensor a);  // assumed signature; defined in the .cu file

PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
    m.def("demo", &demo, "cuda demo");
}

Here the editor reports "undefined identifier \"size_t\"" (未定义标识符 "size_t"). I have made sure that torch is installed correctly and usable:

Python 3.8.18 (default, Sep 11 2023, 13:40:15) 
[GCC 11.2.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> print(torch.cuda.is_available())
True
>>> print(torch.version.cuda)
12.1
>>> print(torch.__version__)
2.2.1+cu121
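For context, the extension was built with a standard setup.py along these lines (a sketch; the module name cuda_demo matches the traceback below, while the source file names and layout are assumptions):

```python
# setup.py -- sketch of a typical torch CUDA-extension build.
# "cuda_demo" matches the module imported in test.py below; the source
# file names demo.cpp / demo_kernel.cu are assumptions.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="cuda_demo",
    ext_modules=[
        CUDAExtension(
            name="cuda_demo",
            sources=["demo.cpp", "demo_kernel.cu"],
        )
    ],
    # BuildExtension selects the host compiler and nvcc flags; a mismatch
    # between that compiler and the libstdc++ loaded at import time is one
    # known cause of undefined-symbol errors like the one below.
    cmdclass={"build_ext": BuildExtension},
)
```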

But I still cannot use the compiled extension module; it gives me this error:

Traceback (most recent call last):
  File "test.py", line 2, in <module>
    import cuda_demo
ImportError: /home/ll/.conda/envs/dl/lib/python3.8/site-packages/cuda_demo.cpython-38-x86_64-linux-gnu.so: undefined symbol: __cxa_call_terminate
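The symbol in that ImportError comes from libstdc++ (reportedly first exported by GCC 13's libstdc++), so the error usually means the extension was compiled with a newer g++ than the libstdc++.so.6 the process actually loads, e.g. an older copy shipped inside the conda env. A diagnostic sketch, resolving the library through the normal loader path (nothing here is specific to this repo):

```python
# Check whether the libstdc++ picked up by the dynamic loader exports
# __cxa_call_terminate; if it does not, an extension compiled with a
# newer g++ fails to import exactly like the traceback above.
import ctypes
import ctypes.util

libname = ctypes.util.find_library("stdc++") or "libstdc++.so.6"
try:
    lib = ctypes.CDLL(libname)
    getattr(lib, "__cxa_call_terminate")  # raises AttributeError if absent
    status = "present"
except (OSError, AttributeError):
    status = "absent"
print(f"__cxa_call_terminate in {libname}: {status}")
```

Comparing the result against the copy in ~/.conda/envs/dl/lib (if one exists there) shows whether the conda env is shadowing a newer system libstdc++.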

I tried editing .bashrc so the loader can find torch's shared libraries:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/ll/.conda/envs/dl/lib/python3.8/site-packages/torch/lib

Even though the extension module can now resolve the corresponding shared libraries:

$ ldd /home/ll/.conda/envs/dl/lib/python3.8/site-packages/cuda_demo.cpython-38-x86_64-linux-gnu.so
        linux-vdso.so.1 (0x000073c384dc7000)
        libc10.so => /home/ll/.conda/envs/dl/lib/python3.8/site-packages/torch/lib/libc10.so (0x000073c384c95000)
       ...

the program still fails to run: the cpp file still shows "undefined identifier \"size_t\"", and running it still fails with the same undefined symbol: __cxa_call_terminate.

same issue here

I tried a few things. The problem has nothing to do with the Python version itself, but it does seem tied to the torch version: in my tests, a lower torch (torch==1.13) built and ran successfully without this error. Other, newer versions might also avoid it, but torch==2.0.1 and above did not work for me. I don't know how upstream packages the newer torch releases.
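For anyone who wants to try the same workaround, the downgrade is a one-liner (the cu117 index URL is just an example; pick the CUDA build matching your toolkit):

```shell
# Pin torch to the 1.13 series that worked in my tests; cu117 below is an
# example CUDA build, substitute the one matching your driver/toolkit.
pip install "torch==1.13.1" --index-url https://download.pytorch.org/whl/cu117
```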

Since I recently hit the same undefined-symbol error while building the mamba_ssm model, and my virtual environment needs a recent torch because of CUDA, I haven't tried other torch versions; I'd welcome more effective suggestions from others.