Hi Bernd,
I already use SuperLU. Btw what about the PARDISO solver? Can it be used for
multidomain applications and could this be an improvement?
Kind regards
Georg
From: Dumux [mailto:dumux-boun...@listserv.uni-stuttgart.de] On Behalf Of
Bernd Flemisch
Sent: Wednesday, 27 May 2015 12:36
I don't understand why the linear solver should get into trouble because
of this modification. What linear solver do you use? If you use an iterative
one, can you check whether you run into an exception from the preconditioner
when compiling with debug options, and also double-check by replacing
the
Hi Georg,
I switched to UMFPack; it is better* and also a direct solver. Pardiso
could be used, but you need a license.
Have a look at one of our Stokes examples to see how to use UMFPack.
* Faster, terminates more reliably with singular matrices, and
subjectively converges better.
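For illustration, selecting UMFPack in a Dumux problem might look like the following sketch. It assumes the Dumux 2.x property system and a UMFPackBackend class; the problem type tag StokesTestProblem is a hypothetical placeholder, and the exact header and class name should be checked against your Dumux version:

```cpp
// Sketch only -- assumes Dumux 2.x; verify header and class name locally.
#include <dumux/linear/seqsolverbackend.hh>

namespace Dumux {
namespace Properties {

// Select UMFPack as the linear solver for the (hypothetical) problem type tag
SET_TYPE_PROP(StokesTestProblem, LinearSolver,
              Dumux::UMFPackBackend<TypeTag>);

} // namespace Properties
} // namespace Dumux
```

UMFPack itself must be found by the build system for this to compile.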
Bye
Christoph
--
OK. Can you try your multidomain setting, but without the coupling: just
provide ordinary boundary conditions, such as no-flux, for each
subdomain also at the coupling interface? Does this work?
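Replacing the coupling condition by a no-flux boundary could look roughly like this sketch in a Dumux 2.x-style problem class (the method names follow the common *AtPos interface of that era; treat them as assumptions and adapt to your model):

```cpp
// Hypothetical sketch: treat the former coupling interface like any
// other boundary and impose a zero-flux Neumann condition there.
void boundaryTypesAtPos(BoundaryTypes &values,
                        const GlobalPosition &globalPos) const
{
    // Every boundary, including the coupling interface, becomes Neumann ...
    values.setAllNeumann();
}

void neumannAtPos(PrimaryVariables &values,
                  const GlobalPosition &globalPos) const
{
    // ... with zero flux for all equations (no-flux condition)
    values = 0.0;
}
```

If each decoupled subdomain then runs fine on its own, the problem is likely in the coupling, not in the linear solver.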
Since both SuperLU and Pardiso are direct solvers, I would not expect
that one can solve a system
THIS IS AN AUTOMATED MESSAGE, DO NOT REPLY.
The following task has a new comment added:
FS#267 - MPI exit with error: MPI_Op_free after MPI_FINALIZE
User who did this - Bernd Flemisch (bernd)
--
I understand it now. Our GridCreators have (a pointer to) the grid as a static
member
Hi Tri Dat,
it is much easier than I thought. I just forgot to specify an overlap in
the dgf file. This is necessary for YaspGrid if the decoupled models are
supposed to run properly in parallel. Doing this right seems to give me
the correct result for parallel runs.
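For reference, a minimal dgf file for a structured YaspGrid with an overlap might look like the sketch below. The interval bounds and cell counts are hypothetical example values; the GridParameter block with the overlap keyword is what the Dune dgf parser reads for YaspGrid, but the details can vary between Dune versions:

```
DGF
Interval
0 0        % lower left corner (example values)
1 1        % upper right corner (example values)
100 100    % cells per direction (example values)
#

GridParameter
overlap 1  % overlap >= 1 so the decoupled models run in parallel
#
```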
You can try for
Hi Bernd,
Thank you for your help. It works for me. I think I will just take the
DgfGridCreator for parallel YaspGrid runs.
I also have another question about parallel runs with the implicit
cell-centered method of the 2p model. When I try to run
/dumux/test/implicit/2p/test_cc2p (CubeGrid) in parallel,
The following task has a new comment added:
FS#265 - Decoupled 2p2c does not run in parallel
User who did this - Bernd Flemisch (bernd)
--
Actually, it runs in parallel on YaspGrid, if the DgfGridCreator instead of the
CubeGridCreator is
The following task is now closed:
FS#265 - Decoupled 2p2c does not run in parallel
User who did this - Bernd Flemisch (bernd)
Reason for closing: Won't fix
Additional comments about closing: will be fixed upstream by Dune 2.4
More information can be
In fact, I will not fix the CubeGridCreator for parallel YaspGrid. Up to
and including Dune 2.3, the YaspGrid specialization of Dune's
StructuredGridFactory only creates a sequential YaspGrid. This
is fixed in Dune 2.4, where the sequential and parallel YaspGrid
constructors have been