Dear Simon, DuMuX users,

Here's the testing procedure (gcc/6.3.0, cmake/3.5.2, open-mpi/1.5.6, dumux/3.0, dune/2.6).

I "build" the overall dumux installation repo (the one containing also the dune modules folders) as following (cmake.opts is the classic one) :

$ ./dune-common/bin/dunecontrol --opts=dumux/cmake.opts all

Then I go into the dumux folder and run cmake:

$ cd dumux
$ mkdir build-cmake && cd build-cmake
$ cmake ..

Then I go into the test folder of interest (/dumux/test/porousmediumflow/richards/implicit/lens/) and compile the tests listed in its CMakeLists.txt file:

$ make test_richards_lens_box

---> compiles fine

$ make test_richards_lens_box_parallel_ug

---> compiles fine

$ make test_richards_lens_box_parallel_alu

---> compiles fine

I then try to run those tests on the cluster with the following command:

*$ submit "mpirun -n 4 /path/to/problem/file /path/to/input/file"*

Reassuringly, the tests seem to run fine on the cluster. However, I still have some questions:

=> Where can I find documentation on the command-line arguments passed via the CMakeLists.txt file (/dumux/test/porousmediumflow/richards/implicit/lens/CMakeLists.txt)? I am thinking of "--script fuzzy", "--zeroThreshold {"process rank":100}", etc.
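
My current understanding, which may well be wrong: these are not CMake options but arguments handed on to DuMuX's runtest.py comparison script (in the module's bin directory; exact path from memory), which runs the executable and fuzzy-compares its VTK output against a stored reference. As I read it, the zeroThreshold entry makes a field such as "process rank", which legitimately differs between parallel runs, count as zero below the given threshold, so it cannot fail the comparison. The pattern in the file looks roughly like this (a sketch only, abbreviated, with placeholders in angle brackets):

    # sketch of the test declaration pattern, not the verbatim file
    dune_add_test(NAME test_richards_lens_box_parallel_ug
                  SOURCES <main source file>
                  COMMAND ${CMAKE_SOURCE_DIR}/bin/testing/runtest.py
                  CMD_ARGS --script fuzzy
                           --zeroThreshold {"process rank":100}
                           --files <reference .vtu> <produced .vtu>
                           --command "<mpirun -np 4 ./test_... params.input>")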

=> Is the Grid.Overlap = 0 parameter only needed for the YaspGrid type? More generally, where can I find documentation about the different grids and their use?
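
My own tentative answer so far, to be corrected: Grid.Overlap is read by the YaspGrid grid manager only; UG and ALU grids are created by other grid managers and exchange parallel data via ghost entities rather than an overlap parameter. For YaspGrid, the box method needs zero overlap, while the cell-centered schemes need at least one overlap layer in parallel, i.e. something like:

    [Grid]
    # assumed YaspGrid setup: Overlap = 0 for the box method,
    # Overlap = 1 (or more) for cell-centered schemes such as CCTpfa
    Overlap = 0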

Thanks for your help,

Joan

On 26/04/2019 15:12, Simon Scholz wrote:

Dear Joan,

I am no expert in running things in parallel, but your setup seems fine to me. What DuMuX version are you using? I am assuming it is release/3.0 with dune 2.6, correct?

In general it is possible to run a simulation using the box method in parallel with DuMuX. We test this, e.g., with the richards tests located in

/test/porousmediumflow/richards/implicit/lens

Maybe you can try to build those (parallel) tests and see if they run with the setup on your cluster. If this works, they can serve as a guideline. If it does not, my last idea would be to check the grid that you are using: maybe you can switch to ALU or UG and see if the problem persists.
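
If it comes to that, switching the grid should only be a property change in your problem header; a rough sketch, assuming the dumux 3.0 property-macro syntax, where MyTypeTag is a placeholder for your own type tag:

    // sketch only; MyTypeTag stands in for your problem's type tag
    #include <dune/grid/uggrid.hh>   // requires the dune-uggrid module
    SET_TYPE_PROP(MyTypeTag, Grid, Dune::UGGrid<2>);
    // or, with the dune-alugrid module:
    // #include <dune/alugrid/grid.hh>
    // SET_TYPE_PROP(MyTypeTag, Grid,
    //               Dune::ALUGrid<2, 2, Dune::cube, Dune::nonconforming>);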

My best,
Simon

On 25.04.19 20:07, Joan Delort Ylla wrote:

Dear DuMuX users,

I am using a CCS model that includes hydrogen impurities and biofilm growth and transport.

I inherit from the 2pncNIMin problem.

To be able to easily compute pressure gradients (needed for biofilm attachment/detachment), I switched from the CCTpfa method to the box method.

My application compiles, links, and executes fine (please note that until now a successful simulation would take about 2 h of wall-clock time on 40 cores).

I run my simulations in parallel on a cluster using open-mpi/1.5.6; a typical job submission is:

$ submit "mpirun -n 40 /path/to/problem/file /path/to/input/file"

At first, I got the following error message at runtime:

    Dune reported error: Dune::InvalidStateException
    [BoxFVGridGeometry:/path/to/dumux/root/dumux/dumux/discretization/box/fvgridgeometry.hh:125]:
    The box discretization method only works with zero overlap for
    parallel computations. Set the parameter "Grid.Overlap" in the
    input file. ---> Abort!


I solved this by adding the following to my input file:

    [Grid]
    #other parameters
    Overlap = 0

However, after doing so, the first time step of the simulation runs forever and outputs the following error:

    Assemble: r(x^k) = dS/dt + div F - q; M = grad r
    Solve: M deltax^k = r
    Could not build any aggregates. Probably no
    connected nodes.

So here is my question: Is it possible to run a simulation in parallel with the box discretization method? And if so, where should I look for a way to do it?

Thanks for your help, and I'll be happy to provide you with any extra information you might need.

Joan


--
_______________________________________________________________________

Simon Scholz
Department of Hydromechanics and Modelling of Hydrosystems (LH2)

Simon Scholz                   phone: +49 711 685 67015
IWS, University of Stuttgart   fax:   +49 711 685 60430
Pfaffenwaldring 61             email: simon.sch...@iws.uni-stuttgart.de
D-70569 Stuttgart              url:   www.hydrosys.uni-stuttgart.de
_______________________________________________________________________
_______________________________________________
Dumux mailing list
Dumux@listserv.uni-stuttgart.de
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
