pkestene/euler_kokkos

Build issue with MPI and no OpenMP

Closed this issue · 3 comments

Hello,

I'm trying to build this miniapp and get some neat visualizations. I'm not entirely sure, but from my searching, this app seems to be the one used in this paper.

Here's my CMake invocation and its output:

$ cmake -DUSE_MPI=ON -DKokkos_ENABLE_HWLOC=ON -DKokkos_ENABLE_OPENMP=OFF ..
-- MPI support found
-- MPI compile flags: -pthread
-- MPI include path: /home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/include
-- MPI LINK flags path: -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/hwloc-1.11.11-lrtc6lv6yhve3m7jir435xc3k2pa34n5/lib -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/zlib-1.2.11-vk2i2rkkp2rz74b2g3gw3m27iupfeqt5/lib -Wl,-rpath -Wl,/home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/lib -pthread
-- MPI libraries: /home/users/spollard/local/spack/opt/spack/linux-ubuntu18.04-skylake_avx512/gcc-7.5.0/openmpi-4.0.3-eyqdsz6oqd6zosjrgrndggsqi5xrfm6b/lib/libmpi.so
CMake Warning at CMakeLists.txt:96 (message):
  OpenMPI found, but it is not built with CUDA support.


-- Setting default Kokkos CXX standard to 11
-- The project name is: Kokkos
-- Using -std=c++11 for C++11 standard as feature
-- Execution Spaces:
--     Device Parallel: NONE
--     Host Parallel: NONE
--       Host Serial: SERIAL
--
-- Architectures:
//===================================================
  euler_kokkos build configuration:
//===================================================
  C++ Compiler : GNU 7.5.0
    /usr/bin/c++
  MPI enabled
  Kokkos OpenMP enabled : OFF
  Kokkos CUDA   enabled : OFF
  Kokkos HWLOC  enabled : ON

-- Configuring done
-- Generating done
-- Build files have been written to: /home/users/spollard/mpi-error/euler_kokkos/build

But when I run make, the build fails with errors like the following (many lines omitted):

[ 54%] Building CXX object src/shared/CMakeFiles/shared.dir/SolverBase.cpp.o
/home/users/spollard/mpi-error/euler_kokkos/src/shared/SolverBase.cpp: In member function ‘void euler_kokkos::SolverBase::transfert_boundaries_2d(Direction)’:
/home/users/spollard/mpi-error/euler_kokkos/src/shared/SolverBase.cpp:756:57: error: ‘DataArray2d {aka class Kokkos::View<double***, Kokkos::Serial>}’ has no member named ‘ptr_on_device’
     params.communicator->sendrecv(borderBufSend_xmin_2d.ptr_on_device(),
                                                         ^~~~~~~~~~~~~
...
/home/users/spollard/mpi-error/euler_kokkos/src/shared/mpiBorderUtils.h:122:18: error: ‘const DataArray {aka const class Kokkos::View<double****, Kokkos::Serial>}’ has no member named ‘dimension_2’; did you mean ‘dimension’?
       offset = U.dimension_2()-ghostWidth;
                ~~^~~~~~~~~~~
                dimension
src/shared/CMakeFiles/shared.dir/build.make:86: recipe for target 'src/shared/CMakeFiles/shared.dir/SolverBase.cpp.o' failed

I haven't used Kokkos before, but I suspect this is due to an incompatible Kokkos version. Do you know how I could fix it?

Hi @sampollard

Thanks for reporting. Indeed, I recently updated the Kokkos submodule and its API changed slightly; since I was testing with USE_MPI=OFF, I forgot to fully update euler_kokkos to use the new API.
It should be fixed now. Please pull and try again.
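For reference, the errors above come from renamed Kokkos::View accessors: ptr_on_device() is now data(), and dimension_N() is now extent(N). Here is a minimal standalone sketch of the current accessors (the view type and sizes are just illustrative, not the actual DataArray2d used in euler_kokkos):

#include <Kokkos_Core.hpp>
#include <cstdio>

int main(int argc, char* argv[]) {
  Kokkos::initialize(argc, argv);
  {
    // Illustrative 3D view, similar in spirit to DataArray2d (double***)
    Kokkos::View<double***> U("U", 10, 12, 4);

    // Old Kokkos API (removed): U.ptr_on_device(), U.dimension_2()
    // Current Kokkos API:       U.data(),          U.extent(2)
    double* raw = U.data();
    auto n2 = U.extent(2);
    printf("data pointer = %p, extent(2) = %zu\n", (void*)raw, n2);
  }
  Kokkos::finalize();
  return 0;
}

When pulling, don't forget to also update the Kokkos submodule (git submodule update --init --recursive).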

BTW, the miniapp used in the article you mentioned is a different one, but it is very similar: it solves exactly the same equations (the Euler equations of compressible hydrodynamics) with the same kind of numerical methods (finite-volume discretization, Riemann solvers, ...). Their focus is numerical precision (not so much parallelism), whereas my focus here is illustrating portability and parallelism with the Kokkos library. That said, the question of numerical precision in a parallel execution context is certainly interesting.

Fixed! I was able to compile and run a test program using:

mpirun -np 6 src/euler_kokkos test/io/test_io_2d.ini