Problem with `owner_rank`s of ghost quadrants
lchristm opened this issue · 6 comments
Description
I am trying to parallelize a DG solver that uses p4est for managing the mesh. Specifically, for each face shared by two different processes, I want to exchange data associated with that particular face. My understanding is that the built-in communication routines in p4est can only exchange data associated with a specific quadrant, so I am building my own routines. In particular, I use the `p4est_iterate` function to build lists holding the IDs of neighbor processes and the associated faces. To do this, I check whether one of the sides `is_ghost` and then extract the `owner_rank` of the quadrant on the remote side.
The problem is that for some quadrants, the `owner_rank` seems to indicate that the quadrant is owned by the local process, even though I think it shouldn't be, since `is_ghost` is 1.
What am I missing here?
To Reproduce
The following MWE demonstrates this.
#include <p4est_iterate.h>
#include <p4est_ghost.h>

static int
refine_fn (p4est_t * p4est, p4est_topidx_t which_tree,
           p4est_quadrant_t * quadrant)
{
  return 1;
}

static void
iter_face_fn (p4est_iter_face_info_t * info, void *user_data)
{
  p4est_t            *p4est = info->p4est;
  int                 skip = 0;
  int                 remote_side;
  int                 neighbor_rank;
  int                 mpirank = *((int *) user_data);
  p4est_iter_face_side_t *side[2];
  sc_array_t         *sides = &(info->sides);

  if (sides->elem_count == 2) {
    side[0] = p4est_iter_fside_array_index_int (sides, 0);
    side[1] = p4est_iter_fside_array_index_int (sides, 1);
    if (side[0]->is_hanging == 0 && side[1]->is_hanging == 0) {
      if (side[0]->is.full.is_ghost == 1) {
        remote_side = 0;
      } else if (side[1]->is.full.is_ghost == 1) {
        remote_side = 1;
      } else { // skip if both sides are owned by this process
        skip = 1;
      }
      if (skip == 0) {
        neighbor_rank = side[remote_side]->is.full.quad->p.piggy1.owner_rank;
        if (neighbor_rank == mpirank) {
          printf ("mpirank = %d, neighbor_rank = %d \n", mpirank, neighbor_rank);
        }
      }
    }
  }
}

int
main (int argc, char **argv)
{
  int                 mpiret;
  sc_MPI_Comm         mpicomm;
  int                 mpirank;
  p4est_t            *p4est;
  p4est_ghost_t      *ghost_layer;
  p4est_connectivity_t *conn;

  mpiret = sc_MPI_Init (&argc, &argv);
  SC_CHECK_MPI (mpiret);
  mpicomm = sc_MPI_COMM_WORLD;
  sc_MPI_Comm_rank (mpicomm, &mpirank);

  sc_init (mpicomm, 1, 1, NULL, SC_LP_ESSENTIAL);
  p4est_init (NULL, SC_LP_PRODUCTION);

  conn = p4est_connectivity_new_brick (4, 4, 0, 0);
  p4est = p4est_new (mpicomm, conn, 0, NULL, NULL);
  p4est_refine (p4est, 0, refine_fn, NULL);
  p4est_partition (p4est, 0, NULL);
  //p4est_balance (p4est, P4EST_CONNECT_FACE, NULL);
  //p4est_partition (p4est, 0, NULL);
  ghost_layer = p4est_ghost_new (p4est, P4EST_CONNECT_FACE);

  p4est_iterate (p4est, ghost_layer, &mpirank, NULL, iter_face_fn, NULL);

  p4est_ghost_destroy (ghost_layer);
  p4est_destroy (p4est);
  p4est_connectivity_destroy (conn);
  sc_finalize ();
  mpiret = sc_MPI_Finalize ();
  SC_CHECK_MPI (mpiret);
  return 0;
}
Running this program with 4 processes produces the following output:
[libsc] This is libsc 2.8.3
[p4est] This is p4est 2.8
[p4est] CPP mpicc -E
[p4est] CPPFLAGS
[p4est] CC mpicc
[p4est] CFLAGS -g -O2
[p4est] LDFLAGS
[p4est] LIBS -lz -lm
[p4est] Into p4est_new with min quadrants 0 level 0 uniform 1
[p4est] New p4est with 16 trees on 4 processors
[p4est] Done p4est_new with 16 total quadrants
[p4est] Into p4est_refine with 16 total quadrants, allowed level 29
[p4est] Done p4est_refine with 64 total quadrants
[p4est] Into p4est_partition with 64 total quadrants
[p4est] Done p4est_partition shipped 0 quadrants 0%
[p4est] Into p4est_ghost_new FACE
[p4est] Done p4est_ghost_new
mpirank = 0, neighbor_rank = 0
mpirank = 0, neighbor_rank = 0
mpirank = 1, neighbor_rank = 1
mpirank = 2, neighbor_rank = 2
Additional information
- OS: Ubuntu 20.04
- p4est was built with gcc version 9.3.0 and MPI enabled (OpenMPI version 4.0.3)
cc @sloede
According to the documentation, a ghost quadrant uses the `piggy3` structure, which stores the tree and the local number, not the rank. The ranks of ghosts are available indirectly through the ghost context and its `proc_offsets` array. Can you verify that the `piggy3` member is ok?
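For illustration, a minimal sketch of that lookup under my assumptions: ghosts owned by rank p occupy the index range [proc_offsets[p], proc_offsets[p + 1]) in ghost->ghosts, and, if I read the iterator documentation correctly, side->is.full.quadid indexes into that ghost array whenever is_ghost is set. The helper name ghost_owner_rank is mine, not part of the p4est API.

/* Hypothetical helper: map a ghost's index in ghost->ghosts to its
 * owner rank by scanning ghost->proc_offsets. */
static int
ghost_owner_rank (p4est_ghost_t * ghost, p4est_locidx_t ghost_id)
{
  int                 p = 0;

  P4EST_ASSERT (0 <= ghost_id &&
                (size_t) ghost_id < ghost->ghosts.elem_count);
  /* A linear scan is enough for a sketch; a binary search over
   * proc_offsets would scale better for large process counts. */
  while (ghost->proc_offsets[p + 1] <= ghost_id) {
    ++p;
    P4EST_ASSERT (p < ghost->mpisize);
  }
  return p;
}

In the callback above, this would replace the piggy1 access with something like neighbor_rank = ghost_owner_rank (ghost_layer, side[remote_side]->is.full.quadid), with the ghost layer made available to the callback, e.g. through a small context struct passed as user_data.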
Ah, I see. The `piggy3` member is fine. I was able to make it work; thanks a lot for your quick help.