[Bug] Lambda and numpy() cannot coexist in a script
Describe the bug
I wrote a test script at `hidet/python/hidet/graph/ops`. The code snippet:
```python
import numpy as np
import hidet as hi
from hidet import ops, Tensor
from .transform import transpose
from hidet.graph.tensor import asarray

shape = [3, 2]
dtype = np.float32
data = np.random.randn(*shape).astype(dtype)
f = lambda x: x
hidet_result = transpose(hi.asarray(data).cuda()).cpu().numpy()
hi.cuda.synchronize()
print(hidet_result)
```
This reports `Segmentation fault (core dumped)`.

However, if we remove the trailing `.numpy()` call in the definition of `hidet_result`, the segmentation fault goes away. If we instead remove the definition of `f = lambda x: x`, the segmentation fault also goes away. Note that the lambda function `f` is not used by the hidet code at all.
To Reproduce
Install hidet (public version) by building from source, then run the code with `python -m hidet.graph.ops.test_transpose`.
Expected behavior
The code should run without a segmentation fault.
Environment
- OS: Ubuntu 22.04.3 LTS
- GPU: RTX 3090
- Others: CUDA 12.1, GPU Driver Version: 545.23.08
Can you attach a stack trace? I wonder if this is a case where hidet code accesses an object that has already been garbage-collected by the Python interpreter. To check this, you can add

```python
import gc
gc.disable()
```

at the beginning of your script and run it with the `numpy()` call and the `f` object present.
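To obtain the requested stack trace for a crash that happens in native code, the standard-library `faulthandler` module can print a Python-level traceback when the process receives a fatal signal such as SIGSEGV. A minimal sketch combining it with the GC check above (the reproduction code itself is elided):

```python
import faulthandler
import gc

# Disable the cyclic garbage collector, to test whether the crash
# depends on an object being collected while hidet still uses it.
gc.disable()

# Print a Python traceback to stderr if the process receives a
# fatal signal (e.g. SIGSEGV, the "Segmentation fault" above).
faulthandler.enable()

# ... place the reproduction code from the issue here ...
```

Running the script this way should emit a `Fatal Python error: Segmentation fault` banner with the interpreter-level call stack, which narrows down which call triggers the crash.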