Re: [DuMuX] Check for phase presence in the mpnc model

2015-05-27 Thread Georg.Futter
Hi Bernd, I already use SuperLU. By the way, what about the PARDISO solver? Can it be used for multidomain applications and could this be an improvement? Kind regards Georg From: Dumux [mailto:dumux-boun...@listserv.uni-stuttgart.de] On behalf of Bernd Flemisch Sent: Wednesday, 27 May 2015 12:36

Re: [DuMuX] Check for phase presence in the mpnc model

2015-05-27 Thread Bernd Flemisch
I don't understand why the linear solver should get into trouble because of this modification. What linear solver do you use? If you use an iterative one, can you check whether you run into an exception from the preconditioner when compiling with debug options, and also double-check by replacing the
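For illustration, here is a minimal, self-contained Dune-ISTL sketch (not taken from the thread) of the kind of check Bernd describes: a singular diagonal block makes the ILU(0) preconditioner fail, and with Dune's checking macros enabled (e.g. DUNE_FMATRIX_WITH_CHECKING, typically part of the debug options) the failure surfaces as a catchable Dune exception instead of silently producing infs:

    #include <iostream>
    #include <dune/common/exceptions.hh>
    #include <dune/istl/bcrsmatrix.hh>
    #include <dune/istl/bvector.hh>
    #include <dune/istl/operators.hh>
    #include <dune/istl/preconditioners.hh>
    #include <dune/istl/solvers.hh>

    int main()
    {
        typedef Dune::BCRSMatrix<Dune::FieldMatrix<double, 1, 1> > Matrix;
        typedef Dune::BlockVector<Dune::FieldVector<double, 1> > Vector;

        // small 2x2 system with a zero diagonal entry to provoke the preconditioner
        Matrix A(2, 2, Matrix::random);
        A.setrowsize(0, 2); A.setrowsize(1, 2); A.endrowsizes();
        A.addindex(0, 0); A.addindex(0, 1); A.addindex(1, 0); A.addindex(1, 1);
        A.endindices();
        A[0][0] = 0.0; A[0][1] = 1.0; A[1][0] = 1.0; A[1][1] = 0.0;

        Vector x(2), b(2);
        x = 0.0; b = 1.0;

        try
        {
            Dune::MatrixAdapter<Matrix, Vector, Vector> op(A);
            // ILU(0) has to invert the singular diagonal block; with checking
            // enabled this throws, otherwise the breakdown may go unnoticed
            Dune::SeqILU0<Matrix, Vector, Vector> prec(A, 1.0);
            Dune::BiCGSTABSolver<Vector> solver(op, prec, 1e-10, 100, 1);
            Dune::InverseOperatorResult result;
            solver.apply(x, b, result);
        }
        catch (const Dune::Exception& e)
        {
            std::cerr << "Linear solve failed: " << e.what() << std::endl;
        }
        return 0;
    }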

Re: [DuMuX] Check for phase presence in the mpnc model

2015-05-27 Thread Christoph Grüninger
Hi Georg, I switched to UMFPack, it is better* and also a direct solver. Pardiso could be used, but you need a license. Have a look at one of our Stokes examples to see how to use UMFPack. * Faster, terminates more reliably with singular matrices, and subjectively converges better. Bye Christoph --
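For reference, selecting UMFPack in a DuMuX 2.x problem typically boils down to setting the LinearSolver property; a sketch (the type tag name MyProblem is a placeholder and the header path is an assumption; dune-istl must have been configured with UMFPack/SuiteSparse):

    // excerpt from a problem header
    #include <dumux/linear/seqsolverbackend.hh>

    namespace Dumux {
    namespace Properties {
    SET_TYPE_PROP(MyProblem, LinearSolver, Dumux::UMFPackBackend<TypeTag>);
    }
    }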

Re: [DuMuX] Check for phase presence in the mpnc model

2015-05-27 Thread Bernd Flemisch
OK. Can you try your multidomain setting, but without the coupling, i.e. by prescribing ordinary boundary conditions such as no-flux for each subdomain also at the coupling interface? Does this work? Since both SuperLU and Pardiso are direct solvers, I would not expect that one can solve a system
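To sketch what "not coupling" could look like in a subdomain problem class (a hypothetical excerpt in DuMuX 2.x style; onCouplingInterface_ and the surrounding problem interface are assumptions, not code from the thread):

    // treat the coupling interface like an ordinary boundary of the subdomain
    void boundaryTypesAtPos(BoundaryTypes &values,
                            const GlobalPosition &globalPos) const
    {
        if (onCouplingInterface_(globalPos))
            values.setAllNeumann();   // decoupled: plain Neumann boundary here
        else
            values.setAllDirichlet(); // whatever the problem prescribes elsewhere
    }

    // ... and impose no flux across the (former) coupling interface
    void neumannAtPos(PrimaryVariables &values,
                      const GlobalPosition &globalPos) const
    {
        values = 0.0;
    }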

[DuMuX] Flyspray Activity for Task 267 (MPI exit with error: MPI_Op_free after MPI_FINALIZE)

2015-05-27 Thread DuMuX
THIS IS AN AUTOMATED MESSAGE, DO NOT REPLY. The following task has a new comment added: FS#267 - MPI exit with error: MPI_Op_free after MPI_FINALIZE User who did this - Bernd Flemisch (bernd) -- I understand it now. Our GridCreators have (a pointer to) the grid as a static member
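A minimal stand-alone illustration of the lifetime problem described in the comment (not DuMuX code): an object with static storage duration whose destructor still talks to MPI is destroyed only after main() returns, i.e. after MPI_Finalize has already been called:

    #include <mpi.h>

    // user-defined reduction operation (body irrelevant for the example)
    static void dummyOp(void*, void*, int*, MPI_Datatype*) {}

    struct HoldsMpiResource
    {
        MPI_Op op;
        HoldsMpiResource() : op(MPI_OP_NULL) {}
        ~HoldsMpiResource()
        {
            if (op != MPI_OP_NULL)
                MPI_Op_free(&op);   // runs after MPI_Finalize -> MPI error at exit
        }
    };

    // static storage duration, like the grid (pointer) in the GridCreators:
    // the destructor only runs at program termination
    static HoldsMpiResource resource;

    int main(int argc, char** argv)
    {
        MPI_Init(&argc, &argv);
        MPI_Op_create(&dummyOp, /*commute=*/1, &resource.op);
        MPI_Finalize();             // 'resource' is still alive at this point
        return 0;
    }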

Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Bernd Flemisch
Hi Tri Dat, it is much easier than I thought. I just forgot to specify an overlap in the dgf file. This is necessary for YaspGrid if the decoupled models are supposed to run properly in parallel. Doing this right seems to give me the correct result for parallel runs. You can try for
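For reference, a minimal sketch of what such a dgf file could look like for a 2-D YaspGrid with one layer of overlap cells (interval bounds and cell counts are placeholders):

    DGF
    Interval
    0 0
    1 1
    30 30
    #

    GridParameter
    overlap 1
    #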

Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Tri Dat NGO
Hi Bernd, Thank you for your help. It works for me. I think I will just take the DgfGridCreator for parallel YaspGrid runs. I also have another question about parallel runs with the implicit cell-centered method of the 2p model. When I try to run /dumux/test/implicit/2p/test_cc2p (CubeGrid) in parallel,
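As a usage example, such a test is typically started with mpirun (the process count and any parameter-file argument depend on your setup):

    mpirun -np 4 ./test_cc2p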

[DuMuX] Flyspray Activity for Task 265 (Decoupled 2p2c does not run in parallel)

2015-05-27 Thread DuMuX
THIS IS AN AUTOMATED MESSAGE, DO NOT REPLY. The following task has a new comment added: FS#265 - Decoupled 2p2c does not run in parallel User who did this - Bernd Flemisch (bernd) -- Actually, it runs in parallel on YaspGrid if the DgfGridCreator instead of the CubeGridCreator is
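A sketch of the corresponding property change in the test problem header (DuMuX 2.x property system; the type tag name TwoPTestProblem and the header path are assumptions):

    #include <dumux/io/dgfgridcreator.hh>

    namespace Dumux {
    namespace Properties {
    // use the DGF-based grid creator, which also reads the overlap information,
    // instead of the CubeGridCreator
    SET_TYPE_PROP(TwoPTestProblem, GridCreator, Dumux::DgfGridCreator<TypeTag>);
    }
    }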

[DuMuX] Flyspray Activity for Task 265 (Decoupled 2p2c does not run in parallel)

2015-05-27 Thread DuMuX
THIS IS AN AUTOMATED MESSAGE, DO NOT REPLY. The following task is now closed: FS#265 - Decoupled 2p2c does not run in parallel User who did this - Bernd Flemisch (bernd) Reason for closing: Won't fix. Additional comments about closing: will be fixed upstream by Dune 2.4. More information can be

Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Bernd Flemisch
In fact, I will not fix the CubeGridCreator for parallel YaspGrid. Up to and including Dune 2.3, the YaspGrid specialization of Dune's StructuredGridFactory only creates a sequential YaspGrid. This is fixed in Dune 2.4, where the sequential and parallel YaspGrid constructors have been
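For comparison, a sketch of constructing an overlapping parallel YaspGrid directly with the Dune 2.4-style constructor (the exact signature may differ between Dune versions):

    #include <array>
    #include <bitset>
    #include <iostream>
    #include <dune/common/fvector.hh>
    #include <dune/common/parallel/mpihelper.hh>
    #include <dune/grid/yaspgrid.hh>

    int main(int argc, char** argv)
    {
        Dune::MPIHelper::instance(argc, argv);

        const int dim = 2;
        typedef Dune::YaspGrid<dim> Grid;

        Dune::FieldVector<double, dim> upperRight(1.0); // domain [0,1]^2
        std::array<int, dim> cells = {{32, 32}};        // 32x32 cells
        std::bitset<dim> periodic;                      // no periodic directions
        int overlap = 1;                                // one layer of overlap cells

        // distributed over MPI_COMM_WORLD by default
        Grid grid(upperRight, cells, periodic, overlap);
        std::cout << "this process holds " << grid.leafGridView().size(0)
                  << " cells" << std::endl;
        return 0;
    }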