Hi Birger and Alex,

Thank you very much for your replies and your advice.
I can confirm that installing SuperLU solved my problem in the homogeneous case.
But when I try to run my tests in heterogeneous media, I get the same
convergence problem as Birger:

"Newton: Caught exception: "NumericalProblem
[newtonSolveLinear:../dumux/nonlinear/newtoncontroller.hh:389]: Linear
solver did not converge"

I don't get any convergence at all, even with very small time steps. Any
hints are welcome.

@Alex: Did you try running your test with the fully implicit cc model? In my
experience, the cc model generally works correctly in parallel (parallel runs
are faster than sequential runs), but the box model does not.

Best regards,
Tri Dat

2015-11-26 17:26 GMT+01:00 Alexander Kissinger <
alexander.kissin...@iws.uni-stuttgart.de>:

> Hi Birger and Tri Dat,
>
> Are you not getting any convergence at all, or do you get convergence at a
> smaller time step size?
> Do you get the error in the first Newton step of a time step, or somewhere
> in between?
>
> I have similar problems for parallel computations with bad aspect ratios
> (300:10). Additionally I have horizontal layers with large differences in
> permeability. I am using AMG (also with SuperLU as coarse solver) and the
> fully implicit box model (1p2c).
> Running the same grid sequentially allows much larger time step sizes
> before the time step is constrained by the linear solver.
> I also ran my grid for a homogeneous case with simplified physics, which the
> linear solver was able to handle.
> I assume that the problem in my case is a combination of heterogeneities
> and the bad aspect ratio.
>
> Unfortunately I was not able to resolve the problem. Some actions that
> slightly improved the behavior were:
>
> (1) In amgproperties.hh, changing the preconditioner from Dune::SeqSSOR to
> Dune::SeqILUn (this also improved the convergence behavior of the linear
> solver in the sequential case). A sketch of this change is given below.
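>
> As a rough illustration only (the exact typedef and surrounding property
> machinery in amgproperties.hh depend on your DuMuX version, and
> JacobianMatrix/SolutionVector below are just placeholder names), the change
> boils down to swapping the smoother typedef used by the AMG backend:
>
>   // before: SSOR preconditioner/smoother
>   // typedef Dune::SeqSSOR<JacobianMatrix, SolutionVector, SolutionVector> Smoother;
>   // after: ILU(n) preconditioner/smoother, more robust for my heterogeneous setup
>   typedef Dune::SeqILUn<JacobianMatrix, SolutionVector, SolutionVector> Smoother;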
>
> (2) If you use Dune ALUGrid, try setting METIS_PartGraphRecursive in your
> alugrid.cfg file. Another point I observed is that the partitioning method
> plays a huge role; METIS worked best for me.
>
> (3) In amgbackend.hh, increasing the coarsen target to a value larger than
> 2000 (for example 10000, 50000, or 100000) in the following line (a modified
> version is sketched right after it):
> Dune::Amg::Parameters params(15,2000,1.2,1.6,Dune::Amg::atOnceAccu);
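>
> For instance, with a coarsen target of 50000 (one of the values mentioned
> above), the line would read:
>
> Dune::Amg::Parameters params(15, 50000, 1.2, 1.6, Dune::Amg::atOnceAccu);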
>
> You can see the effect of the latter when you set the linear solver
> verbosity to 2 in your input file:
>
>  [LinearSolver]
> Verbosity = 2
>
> AMG will print the different agglomeration levels for each Newton step. If
> you increase the coarsen target, there should be fewer levels. On the highest
> level the coarse solver (SuperLU) is used. In my case the convergence
> improved with an increasing number of unknowns on the highest level. The
> downside is that solving the coarse linear system with SuperLU takes much
> more time, which for me meant that in the end the sequential case was much
> faster than the parallel one. But maybe you have more luck.
>
> Best regards
> Alex
>
> On 11/26/2015 04:05 PM, Birger Hagemann wrote:
>
> Hi Tri Dat,
>
>
>
> I recently had the same problem with parallel computation on a realistic
> reservoir grid where the horizontal dimensions are also much larger than the
> vertical one. The error message was the same. Bernd gave me the hint to
> install SuperLU. However, after installing SuperLU the error only changed to:
>
> "Newton: Caught exception: "NumericalProblem
> [newtonSolveLinear:…/dumux/dumux/nonlinear/newtoncontroller.hh:380]:
> Linear solver did not converge""
>
>
>
> Maybe installing SuperLU will solve your problem. My problem is also still
> unsolved, so any hints are welcome. I am also using a fully implicit
> cell-centered model. The same simulation works when started on only one
> processor, and the same model works in parallel on simpler grids.
>
> Kind regards
>
> Birger
>
> From: Dumux [mailto:dumux-boun...@listserv.uni-stuttgart.de] On behalf of Tri Dat NGO
> Sent: Tuesday, 24 November 2015, 20:07
> To: DuMuX User Forum
> Subject: Re: [DuMuX] Convergence problem for 3D simulations (2p
> cell-centered model, ALUGrid)
>
>
>
> Sorry, I forgot to add my files.
>
> Kind regards,
> Tri Dat
>
>
>
> 2015-11-24 20:03 GMT+01:00 Tri Dat NGO <trida...@gmail.com>:
>
> Hi Dumuxers,
>
> I would like to run parallel simulations of CO2 injection into a 3D
> reservoir whose horizontal extents are much larger than the vertical one:
> Lx/Lz = Ly/Lz >> 1 (e.g. a reservoir of 1000 m x 1000 m x 50 m). A 2p
> cell-centered model is used.
>
> The domain is discretized on a Cartesian grid of 100x100x100 (1e6) cells.
> The CO2 is injected into the domain at X = 500 m and Y = 500 m using
> Peaceman's well model (bhp). The pressure at the boundaries perpendicular
> to the Y-axis is set to the initial hydrostatic pressure. All other
> boundaries are no-flux Neumann boundaries. As a first step, we consider,
> for simplicity, a homogeneous reservoir. You can find the *.hh files of my
> test attached.
>
> The simulation converges only if the reservoir is anisotropic (Kxx/Kzz =
> Kyy/Kzz >> 1), meaning that the medium has a low permeability in the
> Z-direction. However, when the domain is isotropic (Kxx = Kzz), a problem
> occurs: the Newton solver does not converge. The error message is "MathError
> [mgc:/home/share/soft/dumux-2.6/include/dune/istl/paamg/amg.hh:825]: Coarse
> solver did not converge".
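>
> To make the anisotropy concrete, this is the kind of permeability tensor I
> mean; DimWorldMatrix and the concrete values are only placeholders for
> whatever the spatial parameters in my test actually return:
>
>   DimWorldMatrix K(0.0);  // 3x3 intrinsic permeability tensor, initialized to zero
>   K[0][0] = 1e-12;        // Kxx [m^2]
>   K[1][1] = 1e-12;        // Kyy [m^2]
>   K[2][2] = 1e-15;        // Kzz [m^2]: Kxx/Kzz = 1000 >> 1 (converging case)
>   // isotropic case that does not converge: K[2][2] = 1e-12;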
>
>
> Moreover, the same test works correctly on isotropic but thicker domains
> (e.g. 1000 m x 1000 m x 250 m; the grid is still 100x100x100).
>
> I don't understand why the same test works on one grid but not on another.
> Is this related to the numerical scheme (fully implicit, cell-centered)?
>
>
> Any help will be greatly appreciated!
>
>
>
> Kind regards,
>
> Tri Dat
>
>
>
>
>
>
>
> --
> Alexander Kissinger
> Institut für Wasser- und Umweltsystemmodellierung
> Lehrstuhl für Hydromechanik und Hydrosystemmodellierung
> Pfaffenwaldring 61
> D-70569 Stuttgart
>
> Phone: +49 (0) 711 685-64729
> E-Mail:  alexander.kissin...@iws.uni-stuttgart.de
>
>
>
>
_______________________________________________
Dumux mailing list
Dumux@listserv.uni-stuttgart.de
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
