Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Myriam Peyrounette via petsc-users
Here is the time weak scaling from the same study. The 3.10.2 version seems to be much more stable with regard to execution time, but not necessarily faster for "large scale" simulations (problem size = 1e8). I didn't use -mat_freeintermediatedatastructures. I tested it this morning and the so…

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Mark Adams via petsc-users
This looks like it might be noisy data. I'd make sure you run each size on the same set of nodes and you might run each job twice (A,B,A,B) in a job script. On Wed, Apr 10, 2019 at 8:12 AM Myriam Peyrounette via petsc-users <petsc-users@mcs.anl.gov> wrote: > Here is the time weak scaling from th…

Re: [petsc-users] Problem coupling Petsc into OpenFOAM

2019-04-10 Thread Balay, Satish via petsc-users
Runtime error? You might have to add the path to $PETSC_ARCH/lib to the LD_LIBRARY_PATH env variable, or add an rpath option to your link command. If Linux/gcc, the linker option is -Wl,-rpath,$PETSC_ARCH/lib. If not, send detailed logs. Satish On Wed, 10 Apr 2019, Vu Do Quoc via petsc-users wrote: > Hi all, > > I…
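
In other words, there are two ways to make the runtime linker find libpetsc (a sketch, assuming a standard PETSc build tree; the executable and object file names are placeholders):

    # Option 1: runtime search path via the environment
    export LD_LIBRARY_PATH=$PETSC_DIR/$PETSC_ARCH/lib:$LD_LIBRARY_PATH

    # Option 2: bake the search path into the binary at link time (Linux/gcc)
    gcc -o mysolver mysolver.o -L$PETSC_DIR/$PETSC_ARCH/lib \
        -Wl,-rpath,$PETSC_DIR/$PETSC_ARCH/lib -lpetsc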

[petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Manuel Colera Rico via petsc-users
Hello, I am trying to solve a system whose matrix is of type MatNest. If I don't use KSPSetUp(), everything is fine. However, if I use that routine, I get the following error: [0]PETSC ERROR: --------------------- Error Message --------------------…
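
A minimal sketch of the configuration being described (not the poster's attached MWE): a 2x2 MatNest with NULL off-diagonal blocks and nest vectors obtained from MatCreateVecs(); sizes and values below are illustrative assumptions.

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat      blocks[4] = {NULL, NULL, NULL, NULL}, A;
      Vec      x, b;
      KSP      ksp;
      PetscInt i, rs, re, n = 8;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* two diagonal AIJ blocks; NULL off-diagonal entries are treated as zero */
      MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 1, NULL, 0, NULL, &blocks[0]);
      MatGetOwnershipRange(blocks[0], &rs, &re);
      for (i = rs; i < re; i++) MatSetValue(blocks[0], i, i, 2.0, INSERT_VALUES);
      MatAssemblyBegin(blocks[0], MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(blocks[0], MAT_FINAL_ASSEMBLY);
      MatDuplicate(blocks[0], MAT_COPY_VALUES, &blocks[3]);

      /* blocks[] is row-major: {A00, A01, A10, A11} */
      MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &A);

      /* MatCreateVecs() yields nest vectors whose layout matches A */
      MatCreateVecs(A, &x, &b);
      VecSet(b, 1.0);

      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, A, A);
      KSPSetFromOptions(ksp);
      KSPSetUp(ksp);  /* the call reported to trigger the error */
      KSPSolve(ksp, b, x);

      KSPDestroy(&ksp);
      VecDestroy(&x); VecDestroy(&b);
      MatDestroy(&A); MatDestroy(&blocks[0]); MatDestroy(&blocks[3]);
      PetscFinalize();
      return 0;
    }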

Re: [petsc-users] Problem coupling Petsc into OpenFOAM

2019-04-10 Thread Smith, Barry F. via petsc-users
We don't know much about OpenFOAM, but 1) if I do a git grep -i petsc in the https://develop.openfoam.com/Development/OpenFOAM-plus.git repository I see various configuration files specifically for PETSc: etc/config.csh/petsc, etc/config.sh/petsc, wmake/scripts/have_petsc, s…

Re: [petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Manuel Colera Rico via petsc-users
Thank you for your answer, Matt. In the MWE attached before, both Nest vectors (the r.h.s. of the system and the vector of unknowns) are composed of the same number of blocks (2). Indeed, PETSc is able to solve the system if KSPSetUp() is not called, so the system/MatNest/MatVecs must…

Re: [petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Matthew Knepley via petsc-users
On Wed, Apr 10, 2019 at 12:49 PM Manuel Colera Rico wrote: > Thank you for your answer, Matt. In the MWE attached before, both > Nest vectors (the r.h.s. of the system and the vector of unknowns) are > composed of the same number of blocks (2). Indeed, PETSc is able to solve > the system…

[petsc-users] Argument out of range error only in certain mpi sizes

2019-04-10 Thread Sajid Ali via petsc-users
Hi PETSc developers, I wanted to convert my code, in which I was using general Vec/Mat, to DMDA-based grid management (nothing fancy, just a 5-point complex stencil). For this, I created a DA object and created the global solution vector using this. This worked fine. Now, I created the matrix u…
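
For reference, a sketch of the DMDA setup being described (grid size, boundary types, and names are assumptions, not Sajid's actual code):

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Vec u;
      Mat A;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* 2D grid, 1 dof per node, star-shaped (5-point) stencil of width 1 */
      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                   1, 1, NULL, NULL, &da);
      DMSetFromOptions(da);
      DMSetUp(da);

      DMCreateGlobalVector(da, &u); /* global solution vector */
      DMCreateMatrix(da, &A);       /* preallocated for the DA's stencil */

      /* ... fill A and u (see the stencil sketch further down), solve ... */

      MatDestroy(&A); VecDestroy(&u); DMDestroy(&da);
      PetscFinalize();
      return 0;
    }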

Re: [petsc-users] Argument out of range error only in certain mpi sizes

2019-04-10 Thread Smith, Barry F. via petsc-users
Sajid, This code won't work. You are assuming that neighbors to the north and south can be obtained from set_col = i + My and set_col = i - My. This is simply incorrect. The global numbering of unknowns in PETSc is by process (not the natural ordering on the entire domain); this is why the PE…
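
A sketch of the approach Barry's point leads to: address neighbors by grid coordinates with MatSetValuesStencil(), which maps (i,j) indices to the correct per-process global numbering internally. It assumes a DM da and a matrix A from DMCreateMatrix() as in the sketch above; the 5-point coefficients are illustrative.

    DMDALocalInfo info;
    MatStencil    row, col[5];
    PetscScalar   v[5];
    PetscInt      i, j, n;

    DMDAGetLocalInfo(da, &info);
    for (j = info.ys; j < info.ys + info.ym; j++) {
      for (i = info.xs; i < info.xs + info.xm; i++) {
        row.i = i; row.j = j;
        n = 0;
        col[n].i = i;   col[n].j = j;   v[n++] = 4.0;  /* center */
        if (i > 0)           { col[n].i = i-1; col[n].j = j;   v[n++] = -1.0; } /* west  */
        if (i < info.mx - 1) { col[n].i = i+1; col[n].j = j;   v[n++] = -1.0; } /* east  */
        if (j > 0)           { col[n].i = i;   col[n].j = j-1; v[n++] = -1.0; } /* south */
        if (j < info.my - 1) { col[n].i = i;   col[n].j = j+1; v[n++] = -1.0; } /* north */
        MatSetValuesStencil(A, 1, &row, n, col, v, INSERT_VALUES);
      }
    }
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);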

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Zhang, Hong via petsc-users
Myriam, Thanks for the plot. '-mat_freeintermediatedatastructures' should not affect the solution. It releases almost half of the memory in C=PtAP if C is not reused. Hong On Wed, Apr 10, 2019 at 7:21 AM Mark Adams <mfad...@lbl.gov> wrote: This looks like it might be noisy data. I'd make sure you…
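
For anyone following the thread: it is a runtime option, so it can be toggled without recompiling, e.g. (executable name and process count are placeholders; -pc_type gamg reflects the PtAP-based setup under discussion):

    mpiexec -n 4 ./mysolver -pc_type gamg -mat_freeintermediatedatastructures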

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Jed Brown via petsc-users
"Zhang, Hong via petsc-users" writes: > Myriam, > Thanks for the plot. '-mat_freeintermediatedatastructures' should not affect > solution. It releases almost half of memory in C=PtAP if C is not reused. And yet if turning it on causes divergence, that would imply a bug. Hong, are you able to re

Re: [petsc-users] Problem coupling Petsc into OpenFOAM

2019-04-10 Thread Mark Olesen via petsc-users
The paper that Barry mentioned gives some generalities, but probably won't help much. There are some PETSc/OpenFOAM interfaces in rheoTool that are probably much more helpful. As Barry also rightly noted, there are some config files in the OpenFOAM tree that were put in some time ago for helpin…

Re: [petsc-users] Argument out of range error only in certain mpi sizes

2019-04-10 Thread Sajid Ali via petsc-users
Thanks a lot for the advice, Matt and Barry. One thing I wanted to confirm is that when I change from using a regular Vec to a Vec created using DMDACreateGlobalVector, to fill these with data from HDF5, I have to change the dimensions of the hdf5 vectors from (dim_x*dim_y) to (dim_x,dim_y), right? Be…

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Zhang, Hong via petsc-users
Jed: > Myriam, > Thanks for the plot. '-mat_freeintermediatedatastructures' should not affect > solution. It releases almost half of memory in C=PtAP if C is not reused. And yet if turning it on causes divergence, that would imply a bug. Hong, are you able to reproduce the experiment to see the m…

Re: [petsc-users] Argument out of range error only in certain mpi sizes

2019-04-10 Thread Smith, Barry F. via petsc-users
Sajid, By default when you save/load vectors from a DMDA to HDF5 files it 1) converts them to the natural ordering in the file (in PETSc programs they are numbered by process; see the discussions in the users manual about DMDA orderings) and 2) treats them as a 2d array in the HDF5 file…
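
A sketch of the load path being described, assuming PETSc was configured with HDF5; the file and dataset names are placeholders, and da is a 2D DMDA as sketched earlier:

    #include <petscviewerhdf5.h>

    Vec         u;
    PetscViewer viewer;

    DMCreateGlobalVector(da, &u);
    PetscObjectSetName((PetscObject)u, "u0"); /* must match the HDF5 dataset name */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "input.h5", FILE_MODE_READ, &viewer);
    VecLoad(u, viewer); /* permutes natural (file) ordering to the per-process layout */
    PetscViewerDestroy(&viewer);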

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-10 Thread Jed Brown via petsc-users
"Zhang, Hong" writes: > Jed: >>> Myriam, >>> Thanks for the plot. '-mat_freeintermediatedatastructures' should not >>> affect solution. It releases almost half of memory in C=PtAP if C is not >>> reused. > >> And yet if turning it on causes divergence, that would imply a bug. >> Hong, are you a

Re: [petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Smith, Barry F. via petsc-users
Here is my guess at what is going wrong (and if it is the case, we should figure out how to fix it). The KSPSetUp() is triggering the creation of work vectors, and the type of work vector created at this point is not compatible with the vectors passed into KSPSolve(), thus generating an error.

Re: [petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Smith, Barry F. via petsc-users
Matt, You can't have some functionality in the public API for a library and then criticize someone for using it. I looked at the manual page for MatCreateNest() and it doesn't say in big letters "don't use this", nor does the compiler tell the user "don't use this". Either MatCreateNe…

Re: [petsc-users] Problem coupling Petsc into OpenFOAM

2019-04-10 Thread Smith, Barry F. via petsc-users
Mark, Thanks for the clarifying email. My Google searches didn't locate the rheoTool you mention, nor "a PRACE project running via CINECA (Bologna)". It would be nice if someday OpenFOAM had (either directly or somehow with the modules directory) an interface to the PETSc solvers. Th…