su2code/SU2

Instability in BL with Periodic Boundary Conditions

Opened this issue · 14 comments

Describe the bug
In a RANS computation of a 2.5D section of a wing, the periodic plane exhibits pressure oscillations in the boundary layer. The issue is only present on the periodic face, not in the volume. Using a CFL of 1 helps, but the instability does not disappear. A CFL of 10 leads to a crash of the computation, with multiple spots like in the screenshot. Could this be linked to issue #1090?

Screenshots: pres_per_face, pres_cut

Here is the config file:
hl_3d_landing.txt

Hardware and software:

  • OS: Linux Red Hat ELS 6.5
  • Nodes: Intel Xeon Ivy bridge
  • C++ compiler and version: intel/2018.0.1
  • MPI implementation and version: intelmpi/2018.0.1
  • SU2 Version: v7.2.1

Thank you for looking at this issue and for any help.

I made a light test case that reproduces the issue: a cylinder in a flowfield, only 112k cells.

Images: mesh, swpeptCylinder, residual_energy

The residuals never converge:
residual_convergence
Here is the archive containing the case:
cyl.zip

Hi,
Can you try the develop branch, or SA instead of SA_NEG? We fixed an issue in SA_NEG that could cause this type of oscillation.
However, I suspect it is due to issues found in #763 and that @TobiKattmann has also run into when implementing periodic BC's for heat-transfer problems.

How do you feel about doing some C++ hacking? We have an idea of how to improve things, just not enough bandwidth to do it.

Hi,
I tried with SA instead of SA_NEG, I get the same behavior. I understand that I don't need to test the develop branch then.

I am OK with doing some C++ development, but I am not an expert. How would you improve the current implementation?

Thanks,
Jean-François

Hi Jean,
Sorry for the delay. The hypothesis I have for the less robust behavior of periodic BC's is that the linear solver is not completely aware of the periodicity. In fact after the linear solve we have to force the matching nodes to have the same value.
This is done in CFVMFlowSolverBase.hpp::CompleteImplicitIteration_impl, in the call to InitiatePeriodicComms.
The idea is to make the linear solver aware of the periodicity. Doing so would require including periodic communications in the preconditioners and in the matrix-vector product.
These are all in CSysMatrix.cpp, so before each of the /*--- MPI Parallelization ---*/ bits we would need periodic comms. For the preconditioners these comms would simply make the values equal, as in CSolver::InitiatePeriodicComms(PERIODIC_IMPLICIT); for the matrix-vector product we need to add the values (effectively, periodicity splits a row of the matrix into two "half rows"), similar to what is done with the linear residual in CSolver::InitiatePeriodicComms(PERIODIC_RESIDUAL).
I think @TobiKattmann was also interested in having a look into this.

Yes, indeed. I'll open a (rather empty) Draft PR soon in order to pressure myself a bit to get this test going. I am sorry for the delay :(

Hi @jf-thomas, hopefully this is still of interest to you, please have a look at the branch from #1585 it seems to fix the issues of your example problem.
Note that to get a comparable time per iteration I had to relax the linear tolerance to 0.1 (from 0.01); in my experience this is fine.
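For reference, the tolerance mentioned above corresponds to the SU2 config option LINEAR_SOLVER_ERROR. A sketch of the relevant block (the solver and preconditioner choices shown are illustrative, not taken from this case):

```
% Relaxed linear solver tolerance, as suggested above (0.1 instead of 0.01)
LINEAR_SOLVER_ERROR= 0.1
% Illustrative choices, adjust to your case:
LINEAR_SOLVER= FGMRES
LINEAR_SOLVER_PREC= ILU
LINEAR_SOLVER_ITER= 20
```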

Hi @pcarruscag, thank you for the development. I am testing it right now, I already noticed the struggle to converge the linear system. I lowered the CFL and relaxed the tolerance to 0.1.

Hi @jf-thomas, was there a benefit in the end? Were you able to get better convergence in the cases you were working on?

Hi @pcarruscag, sorry for the delay, I had a week off. I can confirm that it converges now. The only drawback is that the linear system takes longer to converge. Thank you.

Hi @jf-thomas no worries, thank you for the update. Just to clarify, are you referring just to the small example problem? Or also to the 2.5D wing section?

I am referring to the 2.5D wing section.

Brilliant, thanks. I'll put some time into finishing that pull request.

stale commented

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this is still a relevant issue please comment on it to restart the discussion. Thank you for your contributions.

Still relevant. A fix is proposed in #1585, but it has not been merged yet due to problems with the discrete adjoint. It works very well for primal problems, though.