I have fixed the bug that caused PCGAMG to crash on one process when used with an MPIAIJ 
matrix. The fix is in the branch barry/clean-gamg and has been merged to next for testing.

   As Jed notes, you can create your matrix as AIJ (which is the recommended 
approach anyway) instead of MPIAIJ to work around the bug.
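
   A minimal sketch of that workaround (the matrix size n and communicator are 
placeholders, not from your code): setting the type to MATAIJ lets PETSc choose 
SEQAIJ on one process and MPIAIJ on several, so the same code works at any 
process count.

```c
/* Sketch: create the system matrix with the generic AIJ type instead of
   hard-coding MPIAIJ. "n" is a placeholder for the global problem size. */
Mat A;
MatCreate(PETSC_COMM_WORLD, &A);
MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
MatSetType(A, MATAIJ);   /* resolves to SEQAIJ on 1 process, MPIAIJ otherwise */
MatSetUp(A);
```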

  Barry

> On Mar 5, 2015, at 12:07 PM, Randall Mackie <rlmackie...@gmail.com> wrote:
> 
>> 
>> On Mar 4, 2015, at 7:30 PM, Barry Smith <bsm...@mcs.anl.gov> wrote:
>> 
>> 
>>> On Mar 4, 2015, at 7:45 PM, Randall Mackie <rlmackie...@gmail.com> wrote:
>>> 
>>> In my application, I am repeatedly calling KSPSolve with the following 
>>> options:
>>> 
>>> -ksp_type gmres \
>>> -pc_type gamg \
>>> -pc_gamg_type agg \
>>> -pc_gamg_agg_nsmooths 1
>>> 
>>> 
>>> Each call is made after the matrix and right-hand side have been updated.
>>> 
>>> This works well in the sense that it solves the system in a reasonable 
>>> number of steps; however, I have noticed that the memory footprint of the 
>>> application increases by about 500 MB after each call to KSPSolve (this 
>>> is a big problem), and after several calls I've maxed out the memory.
>>> 
>>> Is this expected behavior?
>> 
>> No
>>> 
>>> I've combed through my code looking to make sure I don't have any memory 
>>> leaks, and so far I haven't found any (which doesn't mean they aren't there).
>>> 
>>> However, when I use another PC, like jacobi, just to compare, I don't see 
>>> this memory issue, or if I comment out that call to KSPSolve (there is a 
>>> lot of other stuff going on in the code besides this call), I don't see 
>>> this issue.
>>> 
>>> I've tried to destroy the KSP after each solve and recreate it each time, 
>>> but there still seems to be some memory getting added.
>> 
>>  Run your program for a few solves with the command line option -malloc and 
>> after each call to KSPSolve() put in a call to PetscMallocDump(). Take 
>> a look at the output and email it to us (this is best done with one process 
>> if you can; does the memory problem happen with 1 MPI process?).
> 
> Hi Barry,
> 
> When I run with 1 MPI process, I get this error message:
> 
> [0]PETSC ERROR: --------------------- Error Message 
> --------------------------------------------------------------
> [0]PETSC ERROR: Arguments are incompatible
> [0]PETSC ERROR: MatMatMult requires A, mpiaij, to be compatible with B, seqaij
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
> trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.5.3, Jan, 31, 2015 
> [0]PETSC ERROR: Configure options PETSC_DIR=/home/rmackie/PETSc/petsc-3.5.3 
> PETSC_ARCH=linux-gfortran-debug --with-scalar-type=complex --with-debugging=1 
> --with-fortran=1 --download-mpich
> [0]PETSC ERROR: #1 MatMatMult() line 8710 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/mat/interface/matrix.c
> [0]PETSC ERROR: #2 PCGAMGOptprol_AGG() line 1328 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/ksp/pc/impls/gamg/agg.c
> [0]PETSC ERROR: #3 PCSetUp_GAMG() line 606 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/ksp/pc/impls/gamg/gamg.c
> [0]PETSC ERROR: #4 PCSetUp() line 902 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/ksp/pc/interface/precon.c
> [0]PETSC ERROR: #5 KSPSetUp() line 306 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/ksp/ksp/interface/itfunc.c
> [0]PETSC ERROR: #6 KSPSolve() line 418 in 
> /home/rmackie/PETSc/petsc-3.5.3/src/ksp/ksp/interface/itfunc.c
> 
> 
> Randy
> 
>> 
>>  Barry
>> 
>>> 
>>> I've tried to distill this down to a smaller problem and test program, but 
>>> so far I have been unsuccessful.
>>> 
>>> 
>>> Is there a way to completely release the memory associated with the GAMG 
>>> preconditioner after a call to KSPSolve?
>>> 
>>> 
>>> Any other suggestions for tracking this down? I've run out of ideas.
>>> 
>>> 
>>> Thanks in advance,
>>> 
>>> Randy
