JuliaSmoothOptimizers/MUMPS.jl

Can't install MUMPS with OpenMPI on a Linux server

Closed this issue · 47 comments

Hi @dpo

I have tried for several days without success to install MUMPS with openmpi dependencies.

The MPI.jl package does not work with OpenMPI, even after:
export CC=`which mpicc`
export FC=`which mpif90`

  Building MPI → `~/.julia/packages/MPI/U5ujD/deps/build.log`
┌ Error: Error building `MPI`: 
│ -- The Fortran compiler identification is GNU 5.5.0
│ -- The C compiler identification is GNU 7.4.0
│ -- Check for working Fortran compiler: /usr/bin/mpif90
│ -- Check for working Fortran compiler: /usr/bin/mpif90  -- works
│ -- Detecting Fortran compiler ABI info
│ -- Detecting Fortran compiler ABI info - done
│ -- Checking whether /usr/bin/mpif90 supports Fortran 90
│ -- Checking whether /usr/bin/mpif90 supports Fortran 90 -- yes
│ -- Check for working C compiler: /usr/bin/mpicc
│ -- Check for working C compiler: /usr/bin/mpicc -- works
│ -- Detecting C compiler ABI info
│ -- Detecting C compiler ABI info - done
│ -- Detecting C compile features
│ -- Detecting C compile features - done
│ -- Found Git: /usr/bin/git (found version "2.7.4") 
│ -- Found MPI_C: /usr/lib/openmpi/lib/libmpi.so  
│ -- Found MPI_Fortran: /usr/lib/openmpi/lib/libmpi_usempif08.so;/usr/lib/openmpi/lib/libmpi_usempi_ignore_tkr.so;/usr/lib/openmpi/lib/libmpi_mpifh.so;/usr/lib/openmpi/lib/libmpi.so  
│ -- Detecting Fortran/C Interface
│ -- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
│ -- Looking for MPI_Comm_c2f
│ -- Looking for MPI_Comm_c2f - found
│ -- Configuring done
│ -- Generating done
│ -- Build files have been written to: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
│ Scanning dependencies of target gen_constants
│ [ 11%] Building Fortran object CMakeFiles/gen_constants.dir/gen_constants.f90.o
│ /home/ubuntu/.julia/packages/MPI/U5ujD/deps/gen_constants.f90:43:43:
│ 
│    call output("MPI_NO_OP       ", MPI_NO_OP)
│                                            1
│ Error: Symbol ‘mpi_no_op’ at (1) has no IMPLICIT type
│ CMakeFiles/gen_constants.dir/build.make:62: recipe for target 'CMakeFiles/gen_constants.dir/gen_constants.f90.o' failed
│ make[2]: *** [CMakeFiles/gen_constants.dir/gen_constants.f90.o] Error 1
│ CMakeFiles/Makefile2:241: recipe for target 'CMakeFiles/gen_constants.dir/all' failed
│ make[1]: *** [CMakeFiles/gen_constants.dir/all] Error 2
│ Makefile:149: recipe for target 'all' failed
│ make: *** [all] Error 2
│ [ Info: Attempting to create directory /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
│ [ Info: Changing directory to /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
│ ERROR: LoadError: failed process: Process(`make`, ProcessExited(2)) [2]
│ Stacktrace:
│  [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
│  [2] pipeline_error at ./process.jl:705 [inlined]
│  [3] #run#504(::Bool, ::Function, ::Cmd) at ./process.jl:663
│  [4] run(::Cmd) at ./process.jl:661
│  [5] macro expansion at ./logging.jl:308 [inlined]
│  [6] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
│  [7] macro expansion at ./logging.jl:308 [inlined]
│  [8] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
│  [9] macro expansion at ./logging.jl:308 [inlined]
│  [10] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
│  [11] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:944
│  [12] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [13] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [14] include at ./boot.jl:317 [inlined]
│  [15] include_relative(::Module, ::String) at ./loading.jl:1041
│  [16] include(::Module, ::String) at ./sysimg.jl:29
│  [17] include(::String) at ./client.jl:388
│  [18] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build.jl:54
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

However, the MPI package works with MPICH and installs without any issues.

But running MUMPS.jl then fails with an error which I found out (from googling) is due to linking against multiple MPI libraries.

julia> using MUMPS

julia> using MPI

julia> # Initialize MPI.
       MPI.Init()

julia> comm = MPI.COMM_WORLD
MPI.Comm(1140850688)

julia> A = rand(1000,1000);

julia> b = rand(1000);

julia> x = solve(A, b)
*** The MPI_Comm_f2c() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[ip-10-62-0-4:42218] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!

Please, how do I resolve this? I am building something that will depend heavily on MUMPS.

Thanks.

@dpo

You can replicate the installation like this:


sudo apt remove openmpi-bin libopenmpi-dev && sudo apt install mpich libmpich-dev
pkg> remove MPI
pkg> add MPI
pkg> build MPI

sudo apt-get install libmumps-dev
pkg> add "https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git"
pkg> resolve

using MUMPS

dpo commented

@urchgene I think the default version of OpenMPI on Ubuntu is too old for MPI.jl (see
https://discourse.julialang.org/t/error-building-mpi-already-tried-setting-cc-fc/17817/3). I don't have a Linux machine handy, but isn't it the case that apt-get install libmumps-dev will install OpenMPI? If so, you would have two MPI libraries, and MUMPS will use one while MPI.jl will use the other.
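A quick way to test this two-libraries hypothesis is to compare what each side links against. The library paths below are illustrative, not exact; substitute the ones on your system:

```shell
# MPI used by the apt-installed MUMPS (path is an example; locate yours
# with `dpkg -L libmumps-dev`):
ldd /usr/lib/libdmumps-4.10.0.so | grep libmpi

# MPI used by MPI.jl's compiled helper libraries (path is an example):
ldd ~/.julia/packages/MPI/*/deps/usr/lib/*.so | grep libmpi

# If the two commands resolve libmpi.so to different files, two MPI
# implementations are in play, which would explain the
# "MPI_Comm_f2c() was called before MPI_INIT" abort.
```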

If the above is right, the only alternatives I see are:

  1. Install a newer version of OpenMPI (either by hand or with, say, Linuxbrew),
  2. build MUMPS by hand against MPICH.

I would suggest #2 because #1 is currently causing me trouble on the CI virtual machines.

Thank you very much for your answer @dpo

I will try #2 but I know it will be a more difficult route.
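For the record, building MUMPS by hand against MPICH might look roughly like this. This is only a sketch: the version number, template name, and Makefile variables are assumptions to be checked against the MUMPS documentation and the templates shipped in the tarball.

```shell
# Download and unpack a MUMPS source tarball (version assumed):
wget http://mumps.enseeiht.fr/MUMPS_5.1.2.tar.gz
tar xzf MUMPS_5.1.2.tar.gz && cd MUMPS_5.1.2

# Start from one of the parallel build templates shipped with the sources:
cp Make.inc/Makefile.debian.PAR Makefile.inc

# Edit Makefile.inc so CC/FC/FL point at the MPICH compiler wrappers
# (e.g. CC = mpicc.mpich, FC = mpif90.mpich, FL = mpif90.mpich) and so
# SCALAP refers to a ScaLAPACK that was itself built against MPICH.

make alllib
```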

dpo commented

@urchgene It seems to be working for me on Linux if I install OpenMPI and MUMPS with Linuxbrew. There are precompiled bottles. See #33. The command sudo apt remove openmpi-bin libopenmpi-dev mpich libmpich-dev || true is important to remove any pre-existing MPI.

dpo commented

PS: check the updated README; option 1 now works.

dpo commented

Works on the CI VMs. Please reopen or open a new issue if you're still having trouble.

@dpo, it still does not work for me. I tried MPICH and it fixes the MPI.jl issues, but I was unable to build MUMPS using MPICH.

MUMPS could not be successfully installed using Linuxbrew either.

Please, do you know of any guide for building MUMPS with MPICH? The problem is that ScaLAPACK does not provide an MPICH library; it only has one for OpenMPI.

Thanks.

dpo commented

@urchgene I'm not aware of any other guide than the MUMPS documentation and the Makefiles. Did you try Linuxbrew? It will install precompiled versions of Open-MPI, ScaLAPACK and MUMPS for you. You could simply try to reproduce what my CI scripts do:

@dpo I got a failure previously on Linuxbrew but lemme try your scripts now

@dpo ...having the same issue. After successful installation of MUMPS using Linuxbrew:


julia> Pkg.clone("https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git")
┌ Warning: Pkg.clone is only kept for legacy CI script reasons, please use `add`
└ @ Pkg.API /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/API.jl:463
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Updating git-repo `https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git`
[ Info: Path `/home/ubuntu/.julia/dev/MUMPS` exists and looks like the correct package, using existing path instead of cloning
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
  [bf6389e2] + MUMPS v0.0.2+ [`~/.julia/dev/MUMPS`]
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [bf6389e2] + MUMPS v0.0.2+ [`~/.julia/dev/MUMPS`]

julia> using MUMPS
[ Info: Recompiling stale cache file /home/ubuntu/.julia/compiled/v1.0/MUMPS/osYgx.ji for MUMPS [bf6389e2-cca1-5e17-ac22-36425c4ccbb4]
ERROR: LoadError: MUMPS library not properly installed. Please run Pkg.build("MUMPS")
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] top-level scope at /home/ubuntu/.julia/dev/MUMPS/src/MUMPS.jl:16
 [3] include at ./boot.jl:317 [inlined]
 [4] include_relative(::Module, ::String) at ./loading.jl:1041
 [5] include(::Module, ::String) at ./sysimg.jl:29
 [6] top-level scope at none:2
 [7] eval at ./boot.jl:319 [inlined]
 [8] eval(::Expr) at ./client.jl:389
 [9] top-level scope at ./none:3
in expression starting at /home/ubuntu/.julia/dev/MUMPS/src/MUMPS.jl:13
ERROR: Failed to precompile MUMPS [bf6389e2-cca1-5e17-ac22-36425c4ccbb4] to /home/ubuntu/.julia/compiled/v1.0/MUMPS/osYgx.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] macro expansion at ./logging.jl:313 [inlined]
 [3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1187
 [4] _require(::Base.PkgId) at ./logging.jl:311
 [5] require(::Base.PkgId) at ./loading.jl:855
 [6] macro expansion at ./logging.jl:311 [inlined]
 [7] require(::Module, ::Symbol) at ./loading.jl:837

(v1.0) pkg> build MUMPS
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Building MPI ──→ `~/.julia/packages/MPI/U5ujD/deps/build.log`
  Building MUMPS → `~/.julia/dev/MUMPS/deps/build.log`
 Resolving package versions...
┌ Error: Error building `MUMPS`: 
│ Reading package lists...
│ Building dependency tree...
│ Reading state information...
│ The following packages were automatically installed and are no longer required:
│   linux-aws-headers-4.4.0-1066 linux-aws-headers-4.4.0-1072
│   linux-headers-4.4.0-1066-aws linux-headers-4.4.0-1072-aws
│   linux-image-4.4.0-1066-aws linux-image-4.4.0-1072-aws
│ Use 'sudo apt autoremove' to remove them.
│ The following additional packages will be installed:
│   libblacs-mpi-dev libblacs-openmpi1 libhwloc-dev libhwloc-plugins libhwloc5
│   libibverbs-dev libibverbs1 libmumps-4.10.0 libnuma-dev libopenmpi-dev
│   libopenmpi1.10 libscalapack-mpi-dev libscalapack-openmpi1 mpi-default-bin
│   mpi-default-dev ocl-icd-libopencl1 openmpi-bin openmpi-common
│ Suggested packages:
│   libhwloc-contrib-plugins opennmpi-doc scalapack-doc opencl-icd
│   openmpi-checkpoint
│ The following NEW packages will be installed:
│   libblacs-mpi-dev libblacs-openmpi1 libhwloc-dev libhwloc-plugins libhwloc5
│   libibverbs-dev libibverbs1 libmumps-4.10.0 libmumps-dev libnuma-dev
│   libopenmpi-dev libopenmpi1.10 libscalapack-mpi-dev libscalapack-openmpi1
│   mpi-default-bin mpi-default-dev ocl-icd-libopencl1 openmpi-bin
│   openmpi-common
│ 0 upgraded, 19 newly installed, 0 to remove and 3 not upgraded.
│ Need to get 2,322 kB/9,424 kB of archives.
│ After this operation, 44.8 MB of additional disk space will be used.
│ Do you want to continue? [Y/n] Abort.
│ [ Info: building libmumps_simple
│ ERROR: LoadError: failed process: Process(`sudo apt-get install libmumps-dev`, ProcessExited(1)) [1]
│ Stacktrace:
│  [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
│  [2] pipeline_error at ./process.jl:705 [inlined]
│  [3] #run#504(::Bool, ::Function, ::Cmd) at ./process.jl:663
│  [4] run(::Cmd) at ./process.jl:661
│  [5] macro expansion at ./logging.jl:308 [inlined]
│  [6] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
│  [7] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:944
│  [8] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [9] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [10] include at ./boot.jl:317 [inlined]
│  [11] include_relative(::Module, ::String) at ./loading.jl:1041
│  [12] include(::Module, ::String) at ./sysimg.jl:29
│  [13] include(::String) at ./client.jl:388
│  [14] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/dev/MUMPS/deps/build.jl:49
│ Installing dependency libmumps-dev via `sudo apt-get install libmumps-dev`:
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

I had the same exact issue using `add MUMPS` or running your Travis script:

julia -E 'using Libdl; push!(DL_LOAD_PATH, "/home/linuxbrew/.linuxbrew/lib"); using Pkg; Pkg.clone(pwd()); ENV["MUMPS_PREFIX"] = "/home/linuxbrew/.linuxbrew/opt/brewsci-mumps"; ENV["SCALAPACK_PREFIX"] = "/home/linuxbrew/.linuxbrew/opt/brewsci-scalapack"; Pkg.build("MUMPS")';

dpo commented

@urchgene Did you update MUMPS.jl? It should be sufficient to issue a ] update command (] is to enter the package manager). The build script no longer tries to apt-get install anything, exactly because the version of Open-MPI in the MUMPS package is too old for MPI.jl.
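For reference, the update-and-rebuild sequence would be something like this (Pkg REPL commands shown as comments):

```shell
# In the Julia REPL, press ] to enter the package manager, then:
#   (v1.0) pkg> update
#   (v1.0) pkg> build MUMPS
# or equivalently, from the shell:
julia -e 'using Pkg; Pkg.update(); Pkg.build("MUMPS")'
```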

@dpo I just tried it but no success...

  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
 [no changes]
  Updating `~/.julia/environments/v1.0/Manifest.toml`
 [no changes]

julia> Pkg.build("MUMPS")
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Building MPI ──→ `~/.julia/packages/MPI/U5ujD/deps/build.log`
  Building MUMPS → `~/.julia/dev/MUMPS/deps/build.log`
 Resolving package versions...
┌ Error: Error building `MUMPS`: 
│ Reading package lists...
│ Building dependency tree...
│ Reading state information...
│ The following packages were automatically installed and are no longer required:
│   linux-aws-headers-4.4.0-1066 linux-aws-headers-4.4.0-1072
│   linux-headers-4.4.0-1066-aws linux-headers-4.4.0-1072-aws
│   linux-image-4.4.0-1066-aws linux-image-4.4.0-1072-aws
│ Use 'sudo apt autoremove' to remove them.
│ The following additional packages will be installed:
│   libblacs-mpi-dev libblacs-openmpi1 libhwloc-dev libhwloc-plugins libhwloc5
│   libibverbs-dev libibverbs1 libmumps-4.10.0 libnuma-dev libopenmpi-dev
│   libopenmpi1.10 libscalapack-mpi-dev libscalapack-openmpi1 mpi-default-bin
│   mpi-default-dev ocl-icd-libopencl1 openmpi-bin openmpi-common
│ Suggested packages:
│   libhwloc-contrib-plugins opennmpi-doc scalapack-doc opencl-icd
│   openmpi-checkpoint
│ The following NEW packages will be installed:
│   libblacs-mpi-dev libblacs-openmpi1 libhwloc-dev libhwloc-plugins libhwloc5
│   libibverbs-dev libibverbs1 libmumps-4.10.0 libmumps-dev libnuma-dev
│   libopenmpi-dev libopenmpi1.10 libscalapack-mpi-dev libscalapack-openmpi1
│   mpi-default-bin mpi-default-dev ocl-icd-libopencl1 openmpi-bin
│   openmpi-common
│ 0 upgraded, 19 newly installed, 0 to remove and 3 not upgraded.
│ Need to get 2,322 kB/9,424 kB of archives.
│ After this operation, 44.8 MB of additional disk space will be used.
│ Do you want to continue? [Y/n] Abort.
│ [ Info: building libmumps_simple
│ ERROR: LoadError: failed process: Process(`sudo apt-get install libmumps-dev`, ProcessExited(1)) [1]
│ Stacktrace:
│  [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
│  [2] pipeline_error at ./process.jl:705 [inlined]
│  [3] #run#504(::Bool, ::Function, ::Cmd) at ./process.jl:663
│  [4] run(::Cmd) at ./process.jl:661
│  [5] macro expansion at ./logging.jl:308 [inlined]
│  [6] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
│  [7] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:944
│  [8] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [9] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [10] include at ./boot.jl:317 [inlined]
│  [11] include_relative(::Module, ::String) at ./loading.jl:1041
│  [12] include(::Module, ::String) at ./sysimg.jl:29
│  [13] include(::String) at ./client.jl:388
│  [14] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/dev/MUMPS/deps/build.jl:49
│ Installing dependency libmumps-dev via `sudo apt-get install libmumps-dev`:
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

Installation was:

(v1.0) pkg> rm MUMPS
┌ Warning: `MUMPS` not in project, ignoring
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1128
[ Info: No changes

julia> Pkg.clone("https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git")
┌ Warning: Pkg.clone is only kept for legacy CI script reasons, please use `add`
└ @ Pkg.API /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/API.jl:463
  Updating git-repo `https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git`
[ Info: Path `/home/ubuntu/.julia/dev/MUMPS` exists and looks like the correct package, using existing path instead of cloning
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
  [bf6389e2] + MUMPS v0.0.2+ [`~/.julia/dev/MUMPS`]
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [bf6389e2] + MUMPS v0.0.2+ [`~/.julia/dev/MUMPS`]

dpo commented

@urchgene I just realized there was still something about apt-get install in my build script. I just opened #37. Let's see if that builds on the CI VMs and then you'll be able to try it. In the meantime, could you remove your copy of MUMPS.jl entirely? You might have to remove it manually from ~/.julia/packages.

dpo commented

Please clone a fresh copy of MUMPS.jl after you've removed it completely from your system. Also make sure to apt-get remove libmumps-dev and anything else that has to do with MUMPS and apt-get.
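A complete removal might look like this (paths are the defaults from the logs above; adjust to your setup):

```shell
# Remove the package from the active environment (ignore errors if it
# is not installed):
julia -e 'using Pkg; Pkg.rm("MUMPS")' || true

# Remove any checked-out or cached copies and stale precompiled caches:
rm -rf ~/.julia/packages/MUMPS ~/.julia/dev/MUMPS ~/.julia/compiled/v1.0/MUMPS

# Remove the apt-installed MUMPS, which drags in the old OpenMPI:
sudo apt-get remove --purge libmumps-dev libmumps-4.10.0
```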

@dpo Some updates on this issue.

Tried to start clean with Linuxbrew. Installed MUMPS from brew, then installed MUMPS.jl and MPI.jl.
Everything compiles and loads well. I then tried the example in the MUMPS.jl README. Works great.

But after logging in on a new shell, another problem surfaces:

julia> using MUMPS, LinearAlgebra, MPI

julia> MPI.Init()

julia> A = rand(10000, 100); A = A*A'; rhs = rand(10000);

julia> x = solve(A, rhs)

Entering DMUMPS 5.1.2 from C interface with JOB, N, NNZ =   4       10000      100000000
      executing #MPI =      1, without OMP

 =================================================
 MUMPS compiled with option -Dparmetis
 =================================================
L U Solver for unsymmetric matrices
Type of parallelism: Working host

 ****** ANALYSIS STEP ********

 Resetting candidate strategy to 0 because NSLAVES=1
 
 ... Structural symmetry (in percent)=  100
 Average density of rows/columns = 9999
 ... No column permutation
 Ordering based on METIS
julia: symbol lookup error: /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libdmumps.so: undefined symbol: metis_setdefaultoptions_

From googling, it looks like there is a linking problem between METIS and MUMPS, but the fix is not clear.

Thanks.
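One way to narrow down a missing-symbol error like this (assuming binutils is available) is to check whether the library declares the symbol as undefined and whether anything in its dependency list could provide it:

```shell
LIB=/home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libdmumps.so

# 'U metis_setdefaultoptions_' means libdmumps.so references the symbol
# without defining it:
nm -D "$LIB" | grep -i metis

# If no libmetis appears here, MUMPS was compiled with METIS support
# (-Dmetis/-Dparmetis) but never linked against the METIS library:
ldd "$LIB" | grep -i metis
```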

dpo commented

@urchgene Great that you managed to install everything!

I feel like I've seen the missing symbol error before. Could you show the output of ldd /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libdmumps.so?

Thanks.

@dpo

uche:~$ ldd /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libdmumps.so
	linux-vdso.so.1 =>  (0x00007ffe328f1000)
	libmpi_usempif08.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempif08.so.40 (0x00007fed05bd4000)
	libmpi_usempi_ignore_tkr.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007fed059cd000)
	libmpi_mpifh.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_mpifh.so.40 (0x00007fed0576e000)
	libmpi.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi.so.40 (0x00007fed05457000)
	libgfortran.so.3 => /home/ubuntu/.linuxbrew/lib/libgfortran.so.3 (0x00007fed0512e000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fed04e25000)
	libgcc_s.so.1 => /home/ubuntu/.linuxbrew/lib/libgcc_s.so.1 (0x00007fed04c0e000)
	libquadmath.so.0 => /home/ubuntu/.linuxbrew/lib/libquadmath.so.0 (0x00007fed049cf000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fed047b2000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fed043e8000)
	libopen-rte.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-rte.so.40 (0x00007fed0412e000)
	libopen-pal.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-pal.so.40 (0x00007fed03e37000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fed03c33000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fed03a2b000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fed03828000)
	libz.so.1 => /home/ubuntu/.linuxbrew/lib/libz.so.1 (0x00007fed03613000)
	libevent-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent-2.1.so.6 (0x00007fed033ca000)
	libevent_pthreads-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent_pthreads-2.1.so.6 (0x00007fed031c6000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fed06182000)

@dpo Also, there is a very good build of MUMPS and its dependencies available from Conda. Can this be linked?

This is ldd from anaconda:

uche:~$ ldd /home/ubuntu/anaconda3/lib/libdmumps.so
	linux-vdso.so.1 =>  (0x00007ffd5dff2000)
	libmumps_common-5.1.2.so => /home/ubuntu/anaconda3/lib/./libmumps_common-5.1.2.so (0x00007fed52657000)
	libparmetis.so => /home/ubuntu/anaconda3/lib/./libparmetis.so (0x00007fed525cd000)
	libmetis.so => /home/ubuntu/anaconda3/lib/./libmetis.so (0x00007fed52558000)
	libpord-5.1.2.so => /home/ubuntu/anaconda3/lib/./libpord-5.1.2.so (0x00007fed52713000)
	libptesmumps-6.so => /home/ubuntu/anaconda3/lib/./libptesmumps-6.so (0x00007fed5270c000)
	libptscotch-6.so => /home/ubuntu/anaconda3/lib/./libptscotch-6.so (0x00007fed5246b000)
	libptscotcherr-6.so => /home/ubuntu/anaconda3/lib/./libptscotcherr-6.so (0x00007fed52706000)
	libscotch-6.so => /home/ubuntu/anaconda3/lib/./libscotch-6.so (0x00007fed523c7000)
	libopenblas.so.0 => /home/ubuntu/anaconda3/lib/./libopenblas.so.0 (0x00007fed507ee000)
	libscalapack.so => /home/ubuntu/anaconda3/lib/./libscalapack.so (0x00007fed5021b000)
	libmpi_mpifh.so.40 => /home/ubuntu/anaconda3/lib/./libmpi_mpifh.so.40 (0x00007fed501bd000)
	libgfortran.so.4 => /home/ubuntu/anaconda3/lib/./libgfortran.so.4 (0x00007fed4fe93000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fed4fb8a000)
	libgomp.so.1 => /home/ubuntu/anaconda3/lib/./libgomp.so.1 (0x00007fed4fb65000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fed4f79b000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fed4f57e000)
	libmpi.so.40 => /home/ubuntu/anaconda3/lib/././libmpi.so.40 (0x00007fed4f479000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fed526b3000)
	libz.so.1 => /home/ubuntu/anaconda3/lib/././libz.so.1 (0x00007fed4f262000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fed4f05a000)
	libscotcherr-6.so => /home/ubuntu/anaconda3/lib/././libscotcherr-6.so (0x00007fed526ef000)
	libgcc_s.so.1 => /home/ubuntu/anaconda3/lib/././libgcc_s.so.1 (0x00007fed526db000)
	libopen-rte.so.40 => /home/ubuntu/anaconda3/lib/././libopen-rte.so.40 (0x00007fed4efa0000)
	libopen-pal.so.40 => /home/ubuntu/anaconda3/lib/././libopen-pal.so.40 (0x00007fed4ee95000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fed4ec91000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007fed4ea8e000)
	libquadmath.so.0 => /home/ubuntu/anaconda3/lib/././libquadmath.so.0 (0x00007fed4ea5b000)
dpo commented

@urchgene It's odd that you were able to run the example in the README, but a second example fails. Did you (un)install anything in between? Did you use any special options to build Linuxbrew MUMPS? Your MUMPS.jl/build/deps.jl should have a line of the form

@checked_lib libmumps_simple "/home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/lib/libmumps_simple.so"

Could you also show ldd of that library? Thanks!

You may be able to link against the Anaconda libraries provided you set the MUMPS_PREFIX and SCALAPACK_PREFIX environment variables appropriately. However, that's untested. This interface would still need to download and compile the libmumps_simple library.
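Untested, but an attempt to link against the Anaconda libraries might look like this (variable names taken from the Travis command earlier in the thread; the expected prefix layout is an assumption):

```shell
export MUMPS_PREFIX=$HOME/anaconda3
export SCALAPACK_PREFIX=$HOME/anaconda3
# Make sure the Anaconda libraries are found at run time as well:
export LD_LIBRARY_PATH=$HOME/anaconda3/lib:$LD_LIBRARY_PATH
julia -e 'using Pkg; Pkg.build("MUMPS")'
```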

@dpo

So after installing MUMPS with Linuxbrew, I added the recommended LDFLAGS and CPPFLAGS exports to my .profile file (see below).
This is what Linuxbrew showed after mumps installation:

==> Installing brewsci/num/brewsci-mumps 
==> Downloading https://linuxbrew.bintray.com/bottles-num/brewsci-mumps-5.1.2_1.x86_64_linux.bottle.tar.gz
######################################################################## 100.0%
==> Pouring brewsci-mumps-5.1.2_1.x86_64_linux.bottle.tar.gz
==> Caveats
MUMPS was built with shared libraries. If required,
static libraries are available in
  /home/ubuntu/.linuxbrew/opt/brewsci-mumps/libexec/lib

brewsci-mumps is keg-only, which means it was not symlinked into /home/ubuntu/.linuxbrew,
because formulae in brewsci/num are keg only.

For compilers to find brewsci-mumps you may need to set:
  export LDFLAGS="-L/home/ubuntu/.linuxbrew/opt/brewsci-mumps/lib"
  export CPPFLAGS="-I/home/ubuntu/.linuxbrew/opt/brewsci-mumps/include"

==> Summary
🍺  /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1: 60 files, 17.1MB
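The corresponding ~/.profile additions would be something like the following. The LD_LIBRARY_PATH line is an extra assumption on my part, needed because keg-only libraries are not symlinked into the default search path:

```shell
export LDFLAGS="-L$HOME/.linuxbrew/opt/brewsci-mumps/lib"
export CPPFLAGS="-I$HOME/.linuxbrew/opt/brewsci-mumps/include"
# Keg-only libraries are not symlinked into ~/.linuxbrew/lib, so the
# dynamic linker must be told where to find them:
export LD_LIBRARY_PATH="$HOME/.linuxbrew/opt/brewsci-mumps/lib:$HOME/.linuxbrew/lib:$LD_LIBRARY_PATH"
```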

For the libmumps_simple.so ldd shows:

uche:~$ ldd /home/ubuntu/.linuxbrew/**opt**/brewsci-mumps/lib/libmumps_simple.so
	linux-vdso.so.1 =>  (0x00007ffda6ffb000)
	libsmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libsmumps.so (0x00007f675aec2000)
	libdmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libdmumps.so (0x00007f675ab48000)
	libcmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libcmumps.so (0x00007f675a7c7000)
	libzmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libzmumps.so (0x00007f675a445000)
	libmumps_common.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libmumps_common.so (0x00007f675a1f5000)
	libpord.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_1/lib/libpord.so (0x00007f6759fde000)
	libopenblas.so.0 => /home/ubuntu/.linuxbrew/lib/libopenblas.so.0 (0x00007f6758306000)
	libscalapack.so => /home/ubuntu/.linuxbrew/opt/brewsci-scalapack/lib/libscalapack.so (0x00007f6757b3b000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f6757937000)
	libmpi.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi.so.40 (0x00007f6757620000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f6757403000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f6757039000)
	libmpi_usempif08.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempif08.so.40 (0x00007f6756e05000)
	libmpi_usempi_ignore_tkr.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f6756bfe000)
	libmpi_mpifh.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_mpifh.so.40 (0x00007f675699f000)
	libgfortran.so.3 => /home/ubuntu/.linuxbrew/lib/libgfortran.so.3 (0x00007f6756676000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f675636d000)
	libgcc_s.so.1 => /home/ubuntu/.linuxbrew/lib/libgcc_s.so.1 (0x00007f6756156000)
	libquadmath.so.0 => /home/ubuntu/.linuxbrew/lib/libquadmath.so.0 (0x00007f6755f17000)
	libgomp.so.1 => /home/ubuntu/.linuxbrew/lib/libgomp.so.1 (0x00007f6755cf4000)
	/lib64/ld-linux-x86-64.so.2 (0x00007f675b440000)
	libopen-rte.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-rte.so.40 (0x00007f6755a3a000)
	libopen-pal.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-pal.so.40 (0x00007f6755743000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f675553b000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f6755338000)
	libz.so.1 => /home/ubuntu/.linuxbrew/lib/libz.so.1 (0x00007f6755123000)
	libevent-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent-2.1.so.6 (0x00007f6754eda000)
	libevent_pthreads-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent_pthreads-2.1.so.6 (0x00007f6754cd6000)

uche:~$ ldd /home/ubuntu/.linuxbrew/**Cellar**/brewsci-mumps/lib/libmumps_simple.so
ldd: /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/lib/libmumps_simple.so: No such file or directory

For linking against Anaconda, you are correct. The interface still tried to compile libmumps_simple.so and failed:

(v1.0) pkg> add https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git
  Updating git-repo `https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git`
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
  [bf6389e2] + MUMPS v0.0.2+ #master (https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git)
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [bf6389e2] + MUMPS v0.0.2+ #master (https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git)
  Building MUMPS → `~/.julia/packages/MUMPS/U3Q9a/deps/build.log`
┌ Error: Error building `MUMPS`: 
│ [ Info: building libmumps_simple
│ ERROR: LoadError: None of the selected providers can install dependency libmumps_common.
│ Use BinDeps.debug(package_name) to see available providers
│ 
│ Stacktrace:
│  [1] error(::String) at ./error.jl:33
│  [2] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:949
│  [3] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [4] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [5] include at ./boot.jl:317 [inlined]
│  [6] include_relative(::Module, ::String) at ./loading.jl:1041
│  [7] include(::Module, ::String) at ./sysimg.jl:29
│  [8] include(::String) at ./client.jl:388
│  [9] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/packages/MUMPS/U3Q9a/deps/build.jl:47
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

dpo commented

@urchgene Thanks for the report. A fix is coming up in the form of updated MUMPS binaries on Linuxbrew (and Homebrew).

ps: regarding the Anaconda build, did you define MUMPS_PREFIX and SCALAPACK_PREFIX?

dpo commented

@urchgene Could you try uninstalling brewsci-mumps, updating your linuxbrew (with brew update) and brew install brewsci-mumps?
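That is, roughly (tap name taken from the earlier install log):

```shell
brew uninstall brewsci-mumps
brew update
brew install brewsci/num/brewsci-mumps
```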

@dpo I just installed MUMPS successfully, but now MPI.jl is broken.

Something is wrong. open-mpi was also installed using Linuxbrew.

dpo commented

Did you rebuild MPI and MUMPS from within Julia (don't forget to set the env vars for MUMPS)?
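Concretely, that rebuild might look like this (prefix values adapted from the Travis command earlier in the thread):

```shell
export MUMPS_PREFIX=$HOME/.linuxbrew/opt/brewsci-mumps
export SCALAPACK_PREFIX=$HOME/.linuxbrew/opt/brewsci-scalapack
julia -e 'using Pkg; Pkg.build("MPI"); Pkg.build("MUMPS")'
```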

@dpo I am stuck at this MPI problem again:

(v1.0) pkg> precompile
Precompiling project...
Precompiling MPI
[ Info: Precompiling MPI [da04e1cc-30fd-572f-bb4f-1f8673147195]
ERROR: LoadError: LoadError: UndefVarError: MPI_COMM_NULL not defined
Stacktrace:
 [1] top-level scope at none:0
 [2] include at ./boot.jl:317 [inlined]
 [3] include_relative(::Module, ::String) at ./loading.jl:1041
 [4] include at ./sysimg.jl:29 [inlined]
 [5] include(::String) at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:3
 [6] top-level scope at none:0
 [7] include at ./boot.jl:317 [inlined]
 [8] include_relative(::Module, ::String) at ./loading.jl:1041
 [9] include(::Module, ::String) at ./sysimg.jl:29
 [10] top-level scope at none:2
 [11] eval at ./boot.jl:319 [inlined]
 [12] eval(::Expr) at ./client.jl:389
 [13] top-level scope at ./none:3
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/mpi-base.jl:73
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:20
ERROR: Failed to precompile MPI [da04e1cc-30fd-572f-bb4f-1f8673147195] to /home/ubuntu/.julia/compiled/v1.0/MPI/nO0XF.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] macro expansion at ./logging.jl:313 [inlined]
 [3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1187
 [4] precompile(::Pkg.Types.Context) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/API.jl:506
 [5] do_precompile!(::Dict{Symbol,Any}, ::Array{String,1}, ::Dict{Symbol,Any}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:662
 [6] #invokelatest#1(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Any, ::Any, ::Vararg{Any,N} where N) at ./essentials.jl:697
 [7] invokelatest(::Any, ::Any, ::Vararg{Any,N} where N) at ./essentials.jl:696
 [8] do_cmd!(::Pkg.REPLMode.PkgCommand, ::REPL.LineEditREPL) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:603
 [9] #do_cmd#33(::Bool, ::Function, ::REPL.LineEditREPL, ::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:577
 [10] do_cmd at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:573 [inlined]
 [11] (::getfield(Pkg.REPLMode, Symbol("##44#47")){REPL.LineEditREPL,REPL.LineEdit.Prompt})(::REPL.LineEdit.MIState, ::Base.GenericIOBuffer{Array{UInt8,1}}, ::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:912
 [12] #invokelatest#1 at ./essentials.jl:697 [inlined]
 [13] invokelatest at ./essentials.jl:696 [inlined]
 [14] run_interface(::REPL.Terminals.TextTerminal, ::REPL.LineEdit.ModalInterface, ::REPL.LineEdit.MIState) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/LineEdit.jl:2261
 [15] run_frontend(::REPL.LineEditREPL, ::REPL.REPLBackendRef) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:1029
 [16] run_repl(::REPL.AbstractREPL, ::Any) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:191
 [17] (::getfield(Base, Symbol("##719#721")){Bool,Bool,Bool,Bool})(::Module) at ./logging.jl:311
 [18] #invokelatest#1 at ./essentials.jl:697 [inlined]
 [19] invokelatest at ./essentials.jl:696 [inlined]
 [20] macro expansion at ./logging.jl:308 [inlined]
 [21] run_main_repl(::Bool, ::Bool, ::Bool, ::Bool, ::Bool) at ./client.jl:330
 [22] exec_options(::Base.JLOptions) at ./client.jl:242
 [23] _start() at ./client.jl:421

Any ideas? I have tried everything!

Thanks @dpo for your suggestions and trying to solve this! Highly appreciated!!

dpo commented

I see that you've already posted in https://discourse.julialang.org/t/solved-cant-install-mpi-jl-in-julia-1-0-1-0-7-0-on-a-centos-7-4/16532. I presume you've set the environment variables as suggested? The best bet is for one of the authors of MPI.jl to help. I'm sorry this has been so frustrating!

Thanks @dpo

dpo commented

For reference, perhaps you could post a gist with the contents of MPI.jl/deps/build.log.

Here it is:

-- The Fortran compiler identification is GNU 5.5.0
-- The C compiler identification is GNU 7.4.0
-- Check for working Fortran compiler: /usr/bin/f95
-- Check for working Fortran compiler: /usr/bin/f95  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /usr/bin/f95 supports Fortran 90
-- Checking whether /usr/bin/f95 supports Fortran 90 -- yes
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Git: /usr/bin/git (found version "2.7.4") 
-- Found MPI_C: /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi.so (found version "3.1") 
CMake Error in /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/CMakeFiles/CMakeTmp/CMakeLists.txt:
  Imported target "MPI::MPI_Fortran" includes non-existent path

    "-I/home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



CMake Error in /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/CMakeFiles/CMakeTmp/CMakeLists.txt:
  Imported target "MPI::MPI_Fortran" includes non-existent path

    "-I/home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/include"

  in its INTERFACE_INCLUDE_DIRECTORIES.  Possible reasons include:

  * The path was deleted, renamed, or moved to another location.

  * An install or uninstall procedure did not complete successfully.

  * The installation package was faulty and references files it does not
  provide.



CMake Error at /home/ubuntu/.linuxbrew/Cellar/cmake/3.13.4/share/cmake/Modules/FindMPI.cmake:1187 (try_compile):
  Failed to generate test project build system.
Call Stack (most recent call first):
  /home/ubuntu/.linuxbrew/Cellar/cmake/3.13.4/share/cmake/Modules/FindMPI.cmake:1203 (_MPI_try_staged_settings)
  /home/ubuntu/.linuxbrew/Cellar/cmake/3.13.4/share/cmake/Modules/FindMPI.cmake:1488 (_MPI_check_lang_works)
  CMakeLists.txt:5 (find_package)


-- Configuring incomplete, errors occurred!
See also "/home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/CMakeFiles/CMakeOutput.log".
[ Info: Attempting to create directory /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
[ Info: Changing directory to /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
ERROR: LoadError: failed process: Process(`cmake -DMPI_Fortran_INCLUDE_PATH=-I/home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/include -DCMAKE_INSTALL_PREFIX=/home/ubuntu/.julia/packages/MPI/U5ujD/deps/src -DCMAKE_LIB_INSTALL_PREFIX=/home/ubuntu/.julia/packages/MPI/U5ujD/deps/usr/lib ..`, ProcessExited(1)) [1]
Stacktrace:
 [1] error(::String, ::Base.Process, ::String, ::Int64, ::String) at ./error.jl:42
 [2] pipeline_error at ./process.jl:705 [inlined]
 [3] #run#504(::Bool, ::Function, ::Cmd) at ./process.jl:663
 [4] run(::Cmd) at ./process.jl:661
 [5] macro expansion at ./logging.jl:308 [inlined]
 [6] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
 [7] macro expansion at ./logging.jl:308 [inlined]
 [8] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
 [9] macro expansion at ./logging.jl:308 [inlined]
 [10] run(::BinDeps.SynchronousStepCollection) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/BinDeps.jl:518
 [11] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:944
 [12] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
 [13] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
 [14] include at ./boot.jl:317 [inlined]
 [15] include_relative(::Module, ::String) at ./loading.jl:1041
 [16] include(::Module, ::String) at ./sysimg.jl:29
 [17] include(::String) at ./client.jl:388
 [18] top-level scope at none:0
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build.jl:54

Here is a newer attempt, after fixing the wrong include path:

-- The Fortran compiler identification is GNU 5.5.0
-- The C compiler identification is GNU 5.5.0
-- Check for working Fortran compiler: /home/linuxbrew/.linuxbrew/bin/mpif90
-- Check for working Fortran compiler: /home/linuxbrew/.linuxbrew/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /home/linuxbrew/.linuxbrew/bin/mpif90 supports Fortran 90
-- Checking whether /home/linuxbrew/.linuxbrew/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /home/linuxbrew/.linuxbrew/bin/mpicc
-- Check for working C compiler: /home/linuxbrew/.linuxbrew/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Git: /usr/bin/git (found version "2.7.4") 
-- Found MPI_C: /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi.so  
-- Found MPI_Fortran: /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempif08.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempi_ignore_tkr.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_mpifh.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi.so  
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- Looking for MPI_Comm_c2f
-- Looking for MPI_Comm_c2f - found
-- Configuring done
-- Generating done
-- Build files have been written to: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
Scanning dependencies of target gen_constants
[ 11%] Building Fortran object CMakeFiles/gen_constants.dir/gen_constants.f90.o
[ 22%] Linking Fortran executable gen_constants
/usr/bin/ld: warning: libgfortran.so.3, needed by /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempif08.so, may conflict with libgfortran.so.4
[ 22%] Built target gen_constants
Scanning dependencies of target gen_functions
[ 33%] Building C object CMakeFiles/gen_functions.dir/gen_functions.c.o
[ 44%] Linking C executable gen_functions
[ 44%] Built target gen_functions
Scanning dependencies of target mpijl-build
[ 55%] Generating mpi-build.jl
/home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants: error while loading shared libraries: libgfortran.so.4: cannot open shared object file: No such file or directory
[ 55%] Built target mpijl-build
Scanning dependencies of target mpijl
[ 66%] Generating compile-time.jl
/home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants: error while loading shared libraries: libgfortran.so.4: cannot open shared object file: No such file or directory
[ 66%] Built target mpijl
Scanning dependencies of target juliampi
[ 77%] Building C object CMakeFiles/juliampi.dir/juliampi.c.o
[ 88%] Building Fortran object CMakeFiles/juliampi.dir/test_mpi.f90.o
[100%] Linking Fortran shared library libjuliampi.so
[100%] Built target juliampi
[ 22%] Built target gen_constants
[ 44%] Built target gen_functions
[ 55%] Built target mpijl-build
[ 66%] Built target mpijl
[100%] Built target juliampi
Install the project...
-- Install configuration: ""
-- Installing: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/src/./compile-time.jl
-- Installing: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/usr/lib/libjuliampi.so
[ Info: Attempting to create directory /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
[ Info: Changing directory to /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
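One likely culprit in the log above is the `libgfortran.so.4: cannot open shared object file` error: `gen_constants` was linked against a libgfortran that the dynamic loader cannot find at run time. A possible workaround (a sketch, not a confirmed fix; the library path is an assumption for a default Linuxbrew layout) is to expose the directory containing the matching libgfortran to the loader before rebuilding:

```julia
# Sketch: child processes spawned by Pkg.build inherit ENV, so prepending the
# directory holding libgfortran lets gen_constants load it.
# "/home/linuxbrew/.linuxbrew/lib" is an assumed path; verify it on your system.
ENV["LD_LIBRARY_PATH"] = "/home/linuxbrew/.linuxbrew/lib:" * get(ENV, "LD_LIBRARY_PATH", "")
using Pkg
Pkg.build("MPI")
```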

Precompilation still fails:

julia> using MPI
[ Info: Precompiling MPI [da04e1cc-30fd-572f-bb4f-1f8673147195]
ERROR: LoadError: LoadError: UndefVarError: MPI_COMM_NULL not defined
Stacktrace:
 [1] top-level scope at none:0
 [2] include at ./boot.jl:317 [inlined]
 [3] include_relative(::Module, ::String) at ./loading.jl:1041
 [4] include at ./sysimg.jl:29 [inlined]
 [5] include(::String) at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:3
 [6] top-level scope at none:0
 [7] include at ./boot.jl:317 [inlined]
 [8] include_relative(::Module, ::String) at ./loading.jl:1041
 [9] include(::Module, ::String) at ./sysimg.jl:29
 [10] top-level scope at none:2
 [11] eval at ./boot.jl:319 [inlined]
 [12] eval(::Expr) at ./client.jl:389
 [13] top-level scope at ./none:3
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/mpi-base.jl:73
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:20
ERROR: Failed to precompile MPI [da04e1cc-30fd-572f-bb4f-1f8673147195] to /home/ubuntu/.julia/compiled/v1.0/MPI/nO0XF.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] macro expansion at ./logging.jl:313 [inlined]
 [3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1187
 [4] _require(::Base.PkgId) at ./logging.jl:311
 [5] require(::Base.PkgId) at ./loading.jl:855
 [6] macro expansion at ./logging.jl:311 [inlined]
 [7] require(::Module, ::Symbol) at ./loading.jl:837

@dpo

So I managed to get MPI working

But it looks like I'm back to square one!

(v1.0) pkg> build MUMPS
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Building MPI ──→ `~/.julia/packages/MPI/U5ujD/deps/build.log`
  Building MUMPS → `~/.julia/packages/MUMPS/U3Q9a/deps/build.log`
┌ Error: Error building `MUMPS`: 
│ [ Info: building libmumps_simple
│ ERROR: LoadError: None of the selected providers can install dependency libmumps_common.
│ Use BinDeps.debug(package_name) to see available providers
│ 
│ Stacktrace:
│  [1] error(::String) at ./error.jl:33
│  [2] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:949
│  [3] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [4] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [5] include at ./boot.jl:317 [inlined]
│  [6] include_relative(::Module, ::String) at ./loading.jl:1041
│  [7] include(::Module, ::String) at ./sysimg.jl:29
│  [8] include(::String) at ./client.jl:388
│  [9] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/packages/MUMPS/U3Q9a/deps/build.jl:47
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

libmumps_simple is still the problem!!

dpo commented

@urchgene Thanks for the update and your patience. I don't quite understand why BinDeps would want to build libmumps_simple. It's bundled with the MUMPS binary installed by Linuxbrew. Could you please run ldd /home/ubuntu/.linuxbrew/opt/brewsci-mumps/lib/libmumps_simple.so again? It would also be great if you could paste the contents of MUMPS.jl/deps/build.log here.

@dpo Thank you, even more, for your assistance!

Everything seems fine when I run ldd /home/ubuntu/.linuxbrew/opt/brewsci-mumps/lib/libmumps_simple.so:

:~$ ldd /home/ubuntu/.linuxbrew/opt/brewsci-mumps/lib/libmumps_simple.so
	linux-vdso.so.1 =>  (0x00007fff9a036000)
	libsmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libsmumps.so (0x00007ff273954000)
	libdmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libdmumps.so (0x00007ff2735da000)
	libcmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libcmumps.so (0x00007ff273259000)
	libzmumps.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libzmumps.so (0x00007ff272ed7000)
	libmumps_common.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libmumps_common.so (0x00007ff272c87000)
	libpord.so => /home/ubuntu/.linuxbrew/Cellar/brewsci-mumps/5.1.2_3/lib/libpord.so (0x00007ff272a6f000)
	libopenblas.so.0 => /home/ubuntu/.linuxbrew/lib/libopenblas.so.0 (0x00007ff270d97000)
	libscalapack.so => /home/ubuntu/.linuxbrew/opt/brewsci-scalapack/lib/libscalapack.so (0x00007ff2705cc000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007ff2703c8000)
	libmpi.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi.so.40 (0x00007ff2700b1000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007ff26fe94000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007ff26faca000)
	libparmetis.so => /home/ubuntu/.linuxbrew/opt/brewsci-parmetis/lib/libparmetis.so (0x00007ff26f890000)
	libmetis.so => /home/ubuntu/.linuxbrew/opt/brewsci-metis/lib/libmetis.so (0x00007ff26f61f000)
	libmpi_usempif08.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempif08.so.40 (0x00007ff26f3eb000)
	libmpi_usempi_ignore_tkr.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007ff26f1e4000)
	libmpi_mpifh.so.40 => /home/ubuntu/.linuxbrew/lib/libmpi_mpifh.so.40 (0x00007ff26ef85000)
	libgfortran.so.3 => /home/ubuntu/.linuxbrew/lib/libgfortran.so.3 (0x00007ff26ec5c000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007ff26e953000)
	libgcc_s.so.1 => /home/ubuntu/.linuxbrew/lib/libgcc_s.so.1 (0x00007ff26e73c000)
	libquadmath.so.0 => /home/ubuntu/.linuxbrew/lib/libquadmath.so.0 (0x00007ff26e4fd000)
	libgomp.so.1 => /home/ubuntu/.linuxbrew/lib/libgomp.so.1 (0x00007ff26e2da000)
	/lib64/ld-linux-x86-64.so.2 (0x00007ff273ed2000)
	libopen-rte.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-rte.so.40 (0x00007ff26e020000)
	libopen-pal.so.40 => /home/ubuntu/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-pal.so.40 (0x00007ff26dd29000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007ff26db21000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007ff26d91e000)
	libz.so.1 => /home/ubuntu/.linuxbrew/lib/libz.so.1 (0x00007ff26d709000)
	libevent-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent-2.1.so.6 (0x00007ff26d4c0000)
	libevent_pthreads-2.1.so.6 => /home/ubuntu/.linuxbrew/lib/libevent_pthreads-2.1.so.6 (0x00007ff26d2bc000)

Any suggestions?

@dpo I tried again just now and it complains about libmumps_common:


(v1.0) pkg> add https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Updating git-repo `https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git`
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
  [bf6389e2] + MUMPS v0.0.2+ #master (https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git)
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [bf6389e2] + MUMPS v0.0.2+ #master (https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git)
  Building MUMPS → `~/.julia/packages/MUMPS/U3Q9a/deps/build.log`
┌ Error: Error building `MUMPS`: 
│ [ Info: building libmumps_simple
│ ERROR: LoadError: None of the selected providers can install dependency libmumps_common.
│ Use BinDeps.debug(package_name) to see available providers
│ 
│ Stacktrace:
│  [1] error(::String) at ./error.jl:33
│  [2] satisfy!(::BinDeps.LibraryDependency, ::Array{DataType,1}) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:949
│  [3] satisfy!(::BinDeps.LibraryDependency) at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:922
│  [4] top-level scope at /home/ubuntu/.julia/packages/BinDeps/ZEval/src/dependencies.jl:977
│  [5] include at ./boot.jl:317 [inlined]
│  [6] include_relative(::Module, ::String) at ./loading.jl:1041
│  [7] include(::Module, ::String) at ./sysimg.jl:29
│  [8] include(::String) at ./client.jl:388
│  [9] top-level scope at none:0
│ in expression starting at /home/ubuntu/.julia/packages/MUMPS/U3Q9a/deps/build.jl:47
└ @ Pkg.Operations /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/Operations.jl:1069

You can see from the ldd output I showed earlier (line 6) that libmumps_common.so is found. It must be something else. What other places should I trace?

dpo commented

@urchgene I'm a bit confused because you got it working earlier. Is there anything in MUMPS.jl/deps/build.log? Last week you managed to get everything working. Nothing should have changed, except for the updated MUMPS binaries. Can you remember how you got everything to work?

I just have to ditch BinDeps. That's what's causing all the trouble. I can get to it tonight.

@dpo I have tried to get it to work again, but with no success since then.

Thanks a lot. I will be the first to use it tonight. I need it badly because my algorithm heavily depends on it.

dpo commented

Alright, let's see how this works: #39.

dpo commented

@urchgene The update in #39 works for me locally (on macOS) and on macOS and Linux on CI. Could you try updating your local MUMPS.jl? If something doesn't work, please paste the contents of MUMPS.jl/deps/build.log. Fingers crossed!

@dpo Yaay!!! I have succeeded! But there are a few catches:

  • I installed Linuxbrew from the Ubuntu repositories, not via the instructions on the Linuxbrew website:
    sudo apt-get install linuxbrew-wrapper

Then I ran brew update and installed MUMPS using Linuxbrew.

I added the following to my ~/.bashrc, as recommended by Linuxbrew. Note that successive export LDFLAGS (and CPPFLAGS) lines overwrite one another, so the flags Linuxbrew suggests for each package must be combined into single definitions:

## For compilers to find brewsci-mumps, brewsci-parmetis, brewsci-metis and brewsci-scalapack:
export LDFLAGS="-L/home/ubuntu/.linuxbrew/opt/brewsci-mumps/lib -L/home/ubuntu/.linuxbrew/opt/brewsci-parmetis/lib -L/home/ubuntu/.linuxbrew/opt/brewsci-metis/lib -L/home/ubuntu/.linuxbrew/opt/brewsci-scalapack/lib"
export CPPFLAGS="-I/home/ubuntu/.linuxbrew/opt/brewsci-mumps/include -I/home/ubuntu/.linuxbrew/opt/brewsci-parmetis/include -I/home/ubuntu/.linuxbrew/opt/brewsci-metis/include"

## For pkg-config to find brewsci-scalapack:
export PKG_CONFIG_PATH="/home/ubuntu/.linuxbrew/opt/brewsci-scalapack/lib/pkgconfig"

I did a git clone https://github.com/JuliaSmoothOptimizers/MUMPS.jl.git and then the following:

  1. I added BinDeps to the REQUIRE file.
  2. I then set the ENV variables
  3. I did a Pkg.clone("MUMPS.jl")
julia> using Pkg

julia> ENV["MUMPS_PREFIX"] = "/home/ubuntu/.linuxbrew/opt/brewsci-mumps"
"/home/ubuntu/.linuxbrew/opt/brewsci-mumps"

julia> ENV["SCALAPACK_PREFIX"] = "/home/ubuntu/.linuxbrew/opt/brewsci-scalapack"
"/home/ubuntu/.linuxbrew/opt/brewsci-scalapack"

julia> Pkg.build("MUMPS")
  Updating registry at `~/.julia/registries/General`
  Updating git-repo `https://github.com/JuliaRegistries/General.git`
  Building MPI ──→ `~/.julia/packages/MPI/U5ujD/deps/build.log`
  Building MUMPS → `~/MUMPS.jl/deps/build.log`
 Resolving package versions...

julia> using MUMPS
[ Info: Precompiling MUMPS [bf6389e2-cca1-5e17-ac22-36425c4ccbb4]

julia>

Finally it worked. Thank you so much @dpo

I included all the steps here for everyone who may encounter the same problem.

dpo commented

@urchgene Great news! Congratulations and many thanks for your feedback. Ultimately, it resulted in a more solid installation process.

It's not necessary to add BinDeps to the REQUIRE file. I eliminated BinDeps entirely. Perhaps you found that you had to because you previously installed MUMPS when it required BinDeps.

I'll close this issue then. Happy computing!

@dpo I don't want to reopen this issue, but how does the MUMPS out-of-core facility work?

How do I use this?

Uche.

dpo commented

You can specify ooc=true when you call get_icntl(...) to obtain a set of integer control parameters:
https://github.com/JuliaSmoothOptimizers/MUMPS.jl/blob/master/src/MUMPS.jl#L176. You can then pass those to the main MUMPS constructor. The out-of-core facility should be transparent to the user. The factors will be stored on disk and read in when needed.

Ps: please open a new issue if there are any questions related to the out-of-core facility.

@dpo

Would you say this is a correct way to do it?

using LinearAlgebra, MUMPS, MPI
MPI.Init()
mumps = Mumps{Float64}(mumps_unsymmetric, get_icntl(;det=true, verbose=true, ooc=true, itref=0), default_cntl64)
M = rand(1000, 1000); M = M * M'; rhs = rand(1000);
associate_matrix!(mumps, M) 
factorize!(mumps) 
associate_rhs!(mumps, rhs)   
solve!(mumps)
x = get_solution(mumps)
finalize(mumps)
MPI.Finalize()
x2 = M\rhs

x and x2 are the same solution, but is this the correct setup for MUMPS out-of-core?
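To confirm numerically that the two agree, one could compare them like this (a sketch continuing the session above; the tolerance is an arbitrary choice, not a MUMPS-prescribed value):

```julia
# Relative difference between the MUMPS solution x and the backslash solution x2;
# for a well-conditioned 1000x1000 system this should be near machine precision.
using LinearAlgebra
rel_err = norm(x - x2) / norm(x2)
@assert rel_err < 1e-8
```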

Thanks.

dpo commented

That seems right. Doesn't it work?

It works pretty well; I just wanted to confirm it was the right way to do it.

Many thanks @dpo !!!

dpo commented

🤘