GPU: out of memory
superskhm opened this issue · 6 comments
Expected Behavior
I want to reconstruct an image from 1024 projections, each of size 2048 x 3072 pixels.
TIGRE parameters
Geometry parameters
Distance from source to detector (DSD) = 627.2 mm
Distance from source to origin (DSO)= 28.0 mm
Detector parameters
Number of pixels (nDetector) = [2496 3008]
Size of each pixel (dDetector) = [0.1 0.1] mm
Total size of the detector (sDetector) = [249.6 300.8] mm
Image parameters
Number of voxels (nVoxel) = [800 800 800]
Total size of the image (sVoxel) = [3.57142857 3.57142857 3.57142857] mm
Size of each voxel (dVoxel) = [0.00446429 0.00446429 0.00446429] mm
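The printed geometry is internally consistent. As a minimal sketch (not part of the issue, values re-entered as plain Python numbers), the relations a TIGRE cone-beam geometry must satisfy can be checked directly; note in particular that the voxel size equals the detector pixel size divided by the cone-beam magnification DSD/DSO:

```python
# Sketch: the printed TIGRE geometry re-entered as plain Python values,
# with a consistency check of the cone-beam geometry relations.
DSD, DSO = 627.2, 28.0   # source-to-detector / source-to-origin distance (mm)
nDetector = [2496, 3008] # detector pixels
dDetector = [0.1, 0.1]   # pixel size (mm)
nVoxel = [800, 800, 800] # image voxels
sVoxel = [3.57142857] * 3  # image size (mm)

# Total detector size = pixel count * pixel size
sDetector = [n * d for n, d in zip(nDetector, dDetector)]

# Voxel size = image size / voxel count
dVoxel = [s / n for s, n in zip(sVoxel, nVoxel)]

# Cone-beam magnification: a detector pixel back-projected onto the rotation
# axis spans dDetector / (DSD / DSO); here that matches the chosen voxel size.
magnification = DSD / DSO                      # 22.4
print([round(s, 4) for s in sDetector])        # [249.6, 300.8]
print(round(dVoxel[0], 8))                     # 0.00446429
print(round(dDetector[0] / magnification, 8))  # 0.00446429
```

So the 0.00446429 mm voxel size corresponds exactly to one detector pixel at the rotation axis, i.e. the reconstruction is at the native detector resolution.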
Centre of rotation correction (COR) = 0 mm
Actual Behavior
Projection array shape: (1024, 2496, 3008)
../Common/CUDA/TIGRE_common.cpp (7): Error pinning memory
../Common/CUDA/TIGRE_common.cpp (14): CBCT:CUDA:Atb out of memory
Code to reproduce the problem (If applicable)
proj, geo, angles = load_ZC_projections(file_path, geo, angles)
imgfdk = tigre.algorithms.fdk(proj, geo, angles, filter="hann")
tigre.plotimg(imgfdk, dim='z')
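One common workaround for out-of-memory failures (not suggested in this thread; factor and names are illustrative) is to downsample the projections before calling FDK and update the detector geometry to match. A sketch, with the geometry values re-entered as plain numbers:

```python
# Hypothetical memory-saving sketch: downsample the projections by a factor
# of 2 in each detector dimension and update the geometry to match.
factor = 2

nDetector = [2496, 3008]  # printed geo.nDetector
dDetector = [0.1, 0.1]    # printed geo.dDetector (mm)

# With NumPy this would be: proj = proj[:, ::factor, ::factor]
nDetector_small = [n // factor for n in nDetector]
dDetector_small = [d * factor for d in dDetector]

# The physical detector size must be unchanged by downsampling.
sDetector_small = [n * d for n, d in zip(nDetector_small, dDetector_small)]
print(nDetector_small)                         # [1248, 1504]
print([round(s, 4) for s in sDetector_small])  # [249.6, 300.8]
```

Reducing nVoxel by the same factor (and scaling dVoxel accordingly) would cut the image volume's memory by 8x as well, at the cost of resolution.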
Specifications
- MATLAB/Python version: Python 3.12, 64-bit
- OS: Windows, 64-bit
- CUDA version: 11.8
- GPU: NVIDIA GeForce RTX 3080 Laptop GPU, 16 GB
How much RAM does the laptop have (not the GPU)? Your projections take about 24 GB of RAM, and the image about 2 GB. Do you have enough CPU RAM to store this?
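The ~24 GB figure matches the 2048 x 3072 projection size stated at the top of the issue; the shape actually printed, (1024, 2496, 3008), is somewhat larger. A minimal sketch of the arithmetic, assuming float32 (4 bytes per value):

```python
# Back-of-the-envelope memory check, assuming float32 projections and image.
GIB = 2**30

# Projection stack as described in the issue text (1024 x 2048 x 3072)...
proj_expected = 1024 * 2048 * 3072 * 4
# ...and as actually loaded (1024 x 2496 x 3008):
proj_actual = 1024 * 2496 * 3008 * 4

image = 800**3 * 4  # 800^3 reconstruction volume

print(proj_expected / GIB)          # 24.0
print(round(proj_actual / GIB, 1))  # 28.6
print(round(image / GIB, 1))        # 1.9
```

Since pinned (page-locked) memory must reside in physical RAM and cannot be swapped out, a ~29 GiB projection stack leaves almost nothing of a 32 GB machine for the OS and the rest of the process, which is consistent with the "Error pinning memory" message.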
RAM: 32 GB. In addition, I have enabled virtual memory; its size is set automatically by the system.
@superskhm have you compiled TIGRE to allow virtual memory? By default it does not allow it, because otherwise it cannot use pinned memory.
@AnderBiguri I compiled TIGRE using the default settings, without changing the virtual memory settings of the system. Also, where can I configure the program to use virtual memory?