Hi Tri Dat,

It is much easier than I thought: I just forgot to specify an overlap in the DGF file. This is necessary for YaspGrid if the decoupled models are supposed to run properly in parallel. Doing so seems to give me the correct result for parallel runs.
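For reference, the overlap is set in a GridParameter block of the DGF file; a minimal fragment (mirroring the file attached further down) looks like this:

```text
GridParameter
% one layer of overlap cells between neighboring processes (needed by YaspGrid)
overlap 1
#
```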

You can try it yourself by adapting the problem and input files so that the DgfGridCreator is used (see the attached patch), together with an appropriate DGF file, which is also attached.

Meanwhile I will fix the CubeGridCreator for YaspGrid so that it also works in parallel.

Kind regards
Bernd

On 05/22/2015 02:15 PM, Tri Dat NGO wrote:
Hi Martin and Bernd,

Please find attached the grid file I have been using for 3d2p decoupled adaptive + parallel. I confirm that test_3d2p using the mimetic method works fine in parallel.

Since I would like to run my decoupled 2p2c test cases in parallel, I will be very happy to hear about any progress. Please keep me informed.
Thank you once again for your help.

Kind regards,
Tri Dat

2015-05-22 13:36 GMT+02:00 Bernd Flemisch <be...@iws.uni-stuttgart.de>:

    Hi Tri Dat,

    I had a closer look at decoupled 2p2c in parallel. Two issues have
    to be solved:

    1. Apparently, our CubeGridCreator doesn't create a parallel
    YaspGrid. This can be fixed easily. Until then, one can use the
    default DgfGridCreator for parallel YaspGrid runs.
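    Using the default DgfGridCreator then only requires pointing the input
    file at a DGF file. A sketch of the input-file line (the surrounding
    parameter group is not visible in the attached patch hunk, so take it
    as an assumption; the file name matches the patch):

```text
File = test_grid.dgf  # read the grid, including the overlap, from this DGF file
```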

    2. In the decoupled 2p2c model, information is not transported
    across the process boundaries. Since decoupled 2p and 2p2c share
    quite a bit of the same infrastructure and 2p is parallel, this
    should also be feasible in the near future.

    Concerning decoupled 2p, I also did not succeed in running the MPFA
    L-method in 3d in parallel. The FV/TPFA scheme works fine, also in the
    adaptive regime. This needs to be investigated further.

    Kind regards
    Bernd

    On Fri, 22 May 2015 10:59:24 +0200
     Tri Dat NGO <trida...@gmail.com> wrote:
    >Hi Bernd,
    >
    >Thank you so much for your help.
    >Please let me know if you have any progress on the decoupled 2p2c in
    >parallel.
    >
    >Concerning 2p decoupled adaptive + parallel simulations, your comments
    >led me to run *test_3d2p* in *dumux/test/decoupled/2p* in parallel, and I
    >obtained the following error message:
    >
    >######################################################
    >No model type specified
    >Default to finite volume MPFA l-method model
    >Dune reported error: Dune::NotImplemented
    
    >[storeBoundaryInteractionVolume:../../../dumux/decoupled/2p/diffusion/fvmpfa/lmethod/fvmpfal3dinteractionvolumecontainer.hh:2031]:
    >Boundary shape not implemented
    >######################################################
    >
    >It seems that there is a problem when storing the boundary interaction
    >volumes in the *mpfa-lmethod*. My test domain is 10x10x10 [m x m x m]
    >with a 20x20x20 grid, and all boundaries have id 1. I haven't tested
    >decoupled 2p 3d parallel + adaptive with the *mpfa-omethod/tpfa method* yet.
    >Please let me know if you have any additional suggestions.
    >
    >Kind regards,
    >Tri Dat
    >
    >2015-05-21 12:40 GMT+02:00 Bernd Flemisch <be...@iws.uni-stuttgart.de>:
    >
    >>  Hi Tri Dat,
    >>
    >> I just tried to run test_dec2p2c in parallel, and it seems that at
    >> least the output is wrong. While the pvd-file contains pointers to
    >> correct parallel pvtu-file names, only sequential vtu-files are
    >> written. I will investigate this further.
    >>
    >> In any case, to run in parallel, you need to switch the LinearSolver
    >> to the AMGBackend in your problem file by adding
    >>
    >> #include <dumux/linear/amgbackend.hh>
    >>
    >> and adding/changing something like
    >>
    >> SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver,
    >>               Dumux::AMGBackend<TypeTag>);
    >>
    >>
    >> Decoupled 2p adaptive and parallel is possible as far as I know.
    >> However, the 2p adaptive code only works with ALUGrid, which means
    >> that one has to use a 3d test case, because 2d ALUGrid is not
    >> parallel. I will try to set up a corresponding case.
    >>
    >> I assume that decoupled 2p2c adaptive and parallel is a larger task.
    >> Since we would also like to have it, we can put it on our to-do list,
    >> but it is hard to estimate when we can actually get to it.
    >>
    >> Kind regards
    >> Bernd
    >>
    >>
    >>
    >> On 05/21/2015 11:51 AM, Tri Dat NGO wrote:
    >>
    >>    Dear DuMuX,
    >>
    >>  I would like to know whether there is any test case of the decoupled
    >> 2p2c model which works correctly in parallel mode.
    >>  I tried to run parallel simulations of all examples in
    >> /dumux_v2.6/test/decoupled/2p2c with mpirun, but I always obtained the
    >> results of sequential simulations.
    >>
    >>  Another question, also on parallel simulation but concerning adaptive
    >> grid refinement: can we use the adaptive grid method with the 2p/2p2c
    >> models in parallel mode?
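    For reference, launching one of these tests in parallel typically looks
    like the following; the binary name and process count are placeholders,
    not taken from the thread:

```shell
# Run the decoupled 2p2c test on 4 MPI processes (binary name assumed).
# If the grid creator or the linear solver is not parallel-aware, each
# rank simply produces the sequential result, matching the symptom above.
mpirun -np 4 ./test_dec2p2c
```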
    >>
    >>  Thank you in advance for your reply.
    >>
    >>  Kind regards,
    >>  Tri Dat
    >>
    >>
    >> _______________________________________________
    >> Dumux mailing list
    >> Dumux@listserv.uni-stuttgart.de
    >> https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
    >>
    >>
    >>
    >> --
    >> _______________________________________________________________
    >>
    >> Bernd Flemisch                         phone: +49 711 685 69162
    >> IWS, Universität Stuttgart             fax:   +49 711 685 60430
    >> Pfaffenwaldring 61            email: be...@iws.uni-stuttgart.de
    >> D-70569 Stuttgart            url: www.hydrosys.uni-stuttgart.de
    >> _______________________________________________________________
    >>
    >>








Index: test/decoupled/2p2c/test_dec2p2cproblem.hh
===================================================================
--- test/decoupled/2p2c/test_dec2p2cproblem.hh	(revision 14790)
+++ test/decoupled/2p2c/test_dec2p2cproblem.hh	(working copy)
@@ -36,6 +36,8 @@
 
 #include "test_dec2p2c_spatialparams.hh"
 #include <dumux/linear/impetbicgstabilu0solver.hh>
+#include <dumux/linear/amgbackend.hh>
+
 namespace Dumux
 {
 
@@ -48,7 +50,7 @@
 NEW_TYPE_TAG(TestDecTwoPTwoCProblem, INHERITS_FROM(DecoupledTwoPTwoC, Test2P2CSpatialParams));
 
 // set the GridCreator property
-SET_TYPE_PROP(TestDecTwoPTwoCProblem, GridCreator, CubeGridCreator<TypeTag>);
+//SET_TYPE_PROP(TestDecTwoPTwoCProblem, GridCreator, CubeGridCreator<TypeTag>);
 
 // Set the grid type
 SET_TYPE_PROP(TestDecTwoPTwoCProblem, Grid, Dune::YaspGrid<3>);
@@ -81,6 +83,7 @@
 
 SET_BOOL_PROP(TestDecTwoPTwoCProblem, EnableCapillarity, true);
 SET_INT_PROP(TestDecTwoPTwoCProblem, BoundaryMobility, GET_PROP_TYPE(TypeTag, Indices)::satDependent);
+SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver, Dumux::AMGBackend<TypeTag>);
 }
 
 /*!
Index: test/decoupled/2p2c/test_dec2p2c.input
===================================================================
--- test/decoupled/2p2c/test_dec2p2c.input	(revision 14790)
+++ test/decoupled/2p2c/test_dec2p2c.input	(working copy)
@@ -11,6 +11,8 @@
 UpperRightY = 10 # [m] width of the domain
 UpperRightZ = 10 # [m] height of the domain
 
+File = test_grid.dgf
+
 [Impet]
 CFLFactor = 0.8
 
DGF
Interval
0 0 0  % first corner 
10 10 10  % second corner
10 10 10  % 10 cells in each direction
# 

GridParameter
% set overlap to 1
overlap 1
#

BOUNDARYDOMAIN
default 1    % all boundaries have id 1
#BOUNDARYDOMAIN
# unitcube.dgf 