Linear solvers using CUDA do not work or yield wrong results
LennartSchu opened this issue · 0 comments
LennartSchu commented
Describe the bug
When compiling DPsim with CUDA support (e.g. using the CMake flag `-DWITH_CUDA=ON`), the linear solvers implemented in `MNASolverGpuSparse.cpp` and `MNASolverGpuDense.cpp` yield segmentation faults when an example with a variable system matrix is used, or wrong results when, for example, the model `WSCC_9bus_mult_coupled` is used, possibly due to zeros on the diagonal of the system matrix. In the case of `CUDADense`, some values are `NaN`.
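The zero-diagonal hypothesis can be checked cheaply before the factorization runs. The sketch below is not DPsim code; it only assumes the assembled system matrix can be inspected as an `Eigen::SparseMatrix<double>` (DPsim builds on Eigen), and `reportZeroDiagonal` is an illustrative name:

```cpp
#include <Eigen/SparseCore>
#include <iostream>

// Scan the assembled system matrix for zero diagonal entries. A zero pivot
// is a plausible reason for a GPU LU factorization without full pivoting to
// fail or to produce NaN values.
void reportZeroDiagonal(const Eigen::SparseMatrix<double> &A) {
    for (Eigen::Index i = 0; i < A.rows(); ++i) {
        // coeff() returns 0.0 when the entry is not stored at all
        if (A.coeff(i, i) == 0.0)
            std::cout << "zero diagonal entry at (" << i << ", " << i << ")\n";
    }
}

int main() {
    // Hypothetical 3x3 matrix: rows 1 and 2 have no diagonal entry
    Eigen::SparseMatrix<double> A(3, 3);
    A.insert(0, 0) = 1.0;
    A.insert(1, 2) = 2.0;
    A.makeCompressed();
    reportZeroDiagonal(A);
}
```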
To Reproduce
Steps to reproduce the behavior:
- modify the example files of `WSCC_9bus_mult_coupled` and `DP_WSCC9bus_SGReducedOrderVBR` to use `CUDADense` (for example)
- build:
  ```
  mkdir build && cd build
  cmake .. -DWITH_CUDA=ON
  make WSCC_9bus_mult_coupled DP_WSCC9bus_SGReducedOrderVBR
  ```
- `cd dpsim/examples/cxx`
- run:
  ```
  ./WSCC_9bus_mult_coupled -ocopies=0
  ./DP_WSCC9bus_SGReducedOrderVBR
  ```
Expected behavior
No segmentation faults, and results that agree with those of other solvers such as `EigenSparse`.
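For reference, "agree" here means the solution vectors match a CPU solver run within a small tolerance and contain no `NaN` entries. A minimal, hypothetical comparison sketch (the vector names, the way results are exported, and the tolerance are assumptions for illustration, not DPsim API):

```cpp
#include <Eigen/Dense>
#include <cmath>
#include <iostream>

// Compare a solution vector obtained with a CUDA solver against the
// EigenSparse reference solution and flag NaN entries.
bool resultsMatch(const Eigen::VectorXd &gpu, const Eigen::VectorXd &reference,
                  double tol = 1e-6) {
    if (gpu.hasNaN()) {
        std::cout << "CUDA result contains NaN\n";
        return false;
    }
    double maxDiff = (gpu - reference).cwiseAbs().maxCoeff();
    std::cout << "max abs difference: " << maxDiff << "\n";
    return maxDiff < tol;
}

int main() {
    // Tiny made-up example: the GPU result contains a NaN, so it is rejected
    Eigen::VectorXd ref(2), gpu(2);
    ref << 1.0, 2.0;
    gpu << 1.0, std::nan("");
    std::cout << (resultsMatch(gpu, ref) ? "match\n" : "mismatch\n");
}
```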