Cell-centered in parallel only works with the AMGBackend as LinearSolver. If you change the property in your problem file, you should be fine.

Maybe we should set the AMGBackend as the default solver to prevent users from walking into that trap.
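
For anyone hitting this later: the change is the same one quoted further down in this thread for the decoupled test, i.e. include the AMG backend header and set the LinearSolver property in the problem file (the type tag below is the one from the decoupled 2p2c test; substitute your own problem's type tag):

```cpp
// In the problem file: select the parallel AMG backend as linear solver.
#include <dumux/linear/amgbackend.hh>

// Replace TestDecTwoPTwoCProblem with the type tag of your own problem.
SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver,
              Dumux::AMGBackend<TypeTag>);
```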

Kind regards
Bernd

On 05/27/2015 05:38 PM, Tri Dat NGO wrote:
Hi Bernd,

Thank you for your help. It works for me. I think I will just take the DgfGridCreator for parallel YaspGrid runs.

I also have another question, about parallel runs with the implicit cell-centered method of the 2p model. When I try to run /dumux/test/implicit/2p/test_cc2p (CubeGrid) in parallel, I obtain this error after the first time step:
####
[obelix2:30189] *** Process received signal ***
[obelix2:30189] Signal code:  (128)
[obelix2:30188] Failing at address: (nil)
[obelix2:30189] Signal: Segmentation fault (11)
####

In contrast, ./test_cc2p with SimplexGrid and ./test_box2p run fine in parallel.
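
For reference, all of these tests were launched the same way, e.g.:

```shell
# start a parallel run; the process count (here 4) is arbitrary
mpirun -np 4 ./test_cc2p
```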

Kind regards,
Tri Dat

2015-05-27 16:46 GMT+02:00 Bernd Flemisch <[email protected]>:

    In fact, I will not fix the CubeGridCreator for parallel YaspGrid.
    Up to and including Dune 2.3, the YaspGrid specialization of the
    StructuredGridFactory of Dune only creates a sequential YaspGrid.
    This is fixed in Dune 2.4 where the sequential and parallel
    YaspGrid constructors have been unified. Since Dune 2.4 is on its
    way and we will drop Dune 2.3 support afterwards, it is too much
    hassle now to do it properly for both Dune 2.3 and 2.4.

    If you like, you can already move to the release branch of Dune
    2.4 (at the expense of receiving a lot of deprecation warnings
    which we will fix after the release). Or you just take the
    DgfGridCreator for parallel YaspGrid runs.

    Bernd


    On 05/27/2015 03:54 PM, Bernd Flemisch wrote:
    Hi Tri Dat,

    it is much easier than I thought. I just forgot to specify an
    overlap in the dgf file. This is necessary for YaspGrid if the
    decoupled models are supposed to run properly in parallel. Doing
    this right seems to give me the correct result for parallel runs.

    You can try it yourself by adapting the problem and input files
    so that the DgfGridCreator is used; see the attached patch,
    together with an appropriate dgf file which is also attached.
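
    For those without the attachments: a minimal dgf file for such a
    run might look like the following sketch (domain and resolution
    taken from the 20x20x20 example further down in this thread; the
    overlap entry in the GridParameter block is the crucial part):

    ```
    DGF
    Interval
    0 0 0       % lower left corner
    10 10 10    % upper right corner
    20 20 20    % cells per direction
    #
    GridParameter
    overlap 1
    #
    BoundaryDomain
    default 1   % all boundaries get id 1
    #
    ```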

    Meanwhile I will fix the CubeGridCreator for YaspGrid so that it
    also works in parallel.

    Kind regards
    Bernd

    On 05/22/2015 02:15 PM, Tri Dat NGO wrote:
    Hi Martin and Bernd,

    Please find attached the grid file I have been using for 3d2p
    decoupled adaptive + parallel.
    I can confirm that test_3d2p using the mimetic method works fine
    in parallel.

    Since I would like to run my 2p2c decoupled test cases in
    parallel, I will be very happy to hear about its progress.
    Please keep me informed.
    Thank you once again for your help.

    Kind regards,
    Tri Dat

    2015-05-22 13:36 GMT+02:00 Bernd Flemisch
    <[email protected] <mailto:[email protected]>>:

        Hi Tri Dat,

        I had a closer look at decoupled 2p2c in parallel. Two
        issues have to be solved:

        1. Apparently, our CubeGridCreator doesn't create a parallel
        YaspGrid. This can be fixed easily. Until then, one can use
        the default DgfGridCreator for parallel YaspGrid runs.

        2. In the decoupled 2p2c model, information is not
        transported across the process boundaries. Since decoupled
        2p and 2p2c share quite a bit of the same infrastructure and
        2p is parallel, this should also be feasible in the near
        future.

        Concerning decoupled 2p, I also did not succeed in running
        the MPFA-L method in 3d in parallel. FV/TPFA works fine, also
        in the adaptive regime. This needs to be investigated further.

        Kind regards
        Bernd

        On Fri, 22 May 2015 10:59:24 +0200
        Tri Dat NGO <[email protected]> wrote:
        >Hi Bernd,
        >
        >Thank you so much for your help.
        >Please let me know if you make any progress on the decoupled
        >2p2c in parallel.
        >
        >Concerning 2p decoupled adaptive + parallel simulations, your
        >comments led me to run *test_3d2p* in *dumux/test/decoupled/2p*
        >in parallel, and I obtained the following error message:
        >
        >######################################################
        >No model type specified
        >Default to finite volume MPFA l-method model
        >Dune reported error: Dune::NotImplemented
        >[storeBoundaryInteractionVolume:../../../dumux/decoupled/2p/diffusion/fvmpfa/lmethod/fvmpfal3dinteractionvolumecontainer.hh:2031]:
        >Boundary shape not implemented
        >######################################################
        >
        >It seems that there is a problem when storing the boundary
        >interaction volumes in the *mpfa-lmethod*. My test domain is
        >10x10x10 [m x m x m] with a 20x20x20 grid, and all boundaries
        >have id 1. I haven't yet tested decoupled 2p 3d parallel +
        >adaptive using the *mpfa-omethod/tpfa method*.
        >Please let me know if you have any additional suggestions.
        >
        >Kind regards,
        >Tri Dat
        >
        >2015-05-21 12:40 GMT+02:00 Bernd Flemisch <[email protected]>:
        >
        >
        >>  Hi Tri Dat,
        >>
        >> I just tried to run test_dec2p2c in parallel, and it seems
        >> that at least the output is wrong. While the pvd-file
        >> contains pointers to correct parallel pvtu-file names, only
        >> sequential vtu-files are written. I will investigate this
        >> further.
        >>
        >> In any case, to run in parallel, you need to switch the
        >> LinearSolver to the AMGBackend in your problem file by
        >> adding
        >>
        >> #include <dumux/linear/amgbackend.hh>
        >>
        >> and adding/changing something like
        >>
        >> SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver,
        >>               Dumux::AMGBackend<TypeTag>);
        >>
        >> Decoupled 2p adaptive and parallel is possible as far as I
        >> know. However, the 2p adaptive stuff only works with
        >> ALUGrid, and that means that one has to use a 3d test case
        >> because 2d ALUGrid is not parallel. I will try to set up a
        >> corresponding case.
        >>
        >> I assume that decoupled 2p2c adaptive and parallel is a
        >> larger task. Since we would also like to have it, we can
        >> put it on our to-do list, but it is hard to estimate when
        >> we can actually do it.
        >>
        >> Kind regards
        >> Bernd
        >>
        >>
        >>
        >> On 05/21/2015 11:51 AM, Tri Dat NGO wrote:
        >>
        >> Dear DuMuX,
        >>
        >> I would like to know whether there is any test case of the
        >> 2p2c decoupled model which works correctly in parallel mode.
        >> I tried to run parallel simulations of all the examples in
        >> /dumux_v2.6/test/decoupled/2p2c with mpirun, but I always
        >> obtained the results of the sequential simulations.
        >>
        >> Another question, also on parallel simulation but concerning
        >> adaptive grid refinement: can we use the adaptive grid
        >> method with the 2p/2p2c models in parallel mode?
        >>
        >> Thank you in advance for your reply.
        >>
        >>  Kind regards,
        >>  Tri Dat
        >>
        >>
        >> _______________________________________________
        >> Dumux mailing list
        >> [email protected]
        >> https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
        >>
        >> --
        >> _______________________________________________________________
        >>
        >> Bernd Flemisch      phone: +49 711 685 69162
        >> IWS, Universität Stuttgart      fax: +49 711 685 60430
        >> Pfaffenwaldring 61 email: [email protected]
        >> D-70569 Stuttgart            url: www.hydrosys.uni-stuttgart.de
        >> _______________________________________________________________
        >>
        >>

        _______________________________________________________________

        Bernd Flemisch  phone: +49 711 685 69162
        IWS, Universitaet Stuttgart     fax: +49 711 685 67020
        Pfaffenwaldring 61            email: [email protected]
        D-70569 Stuttgart            url: www.hydrosys.uni-stuttgart.de
        _______________________________________________________________



















