Re: [petsc-users] Issue when passing DMDA array on to Paraview Catalyst

2019-03-13 Thread Smith, Barry F. via petsc-users


> On Mar 13, 2019, at 11:28 AM, Matthew Knepley via petsc-users wrote:
> 
> On Wed, Mar 13, 2019 at 12:16 PM Bastian Löhrer via petsc-users wrote:
> Dear PETSc users,
> 
> I am having difficulties passing PETSc data on to Paraview Catalyst and 
> it may be related to the way we handle the PETSc data in our Fortran code.
> 
> We have DMDA objects, which we pass on to subroutines this way:
> 
> >   ...
> >   call DMCreateLocalVector(da1dof, loc_p, ierr)
> >   ...
> >   call VecGetArray(loc_p, loc_p_v, loc_p_i, ierr)
> >   call process( loc_p_v(loc_p_i+1) )
> >   ...
> >
> 
> Inside the subroutine (process in this example) we treat the 
> subroutine's argument as if it were an ordinary Fortran array:
> 
> >   subroutine process( p )
> >
> > use gridinfo ! provides gis, gie, ... etc.
> >
> > implicit none
> >
> > #include "petsc_include.h"
> >
> > PetscScalar, dimension(gis:gie,gjs:gje,gks:gke) :: p
> > PetscInt i,j,k
> >
> > do k = gks, gke
> >   do j = gjs, gje
> >     do i = gis, gie
> >
> >       p(i,j,k) = ...
> >
> >     enddo
> >   enddo
> > enddo
> >
> >   end subroutine process
> >
> I find this procedure a little quirky, but it has been working 
> flawlessly for years.
> 
> However, I am now encountering difficulties when passing this 
> variable/array p on to a Paraview Catalyst adaptor subroutine. Doing so 
> I end up with very strange values there. When replacing p with an 
> ordinary local Fortran array everything is fine.
> 
> I can't think of a reason it would not work. I would look at the pointer you 
> get inside
> the Catalyst function using the debugger.
> 
> Note that you can also get an F90 array out if that is what Catalyst needs.

  VecGetArrayF90() or DMDAVecGetArrayF90()
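  A minimal free-form Fortran sketch of that route (assuming PETSc >= 3.8 modules, a 3-D DMDA with one degree of freedom, and the da1dof/loc_p names from the code above; this is an illustration, not the poster's code):

      subroutine process_f90(da1dof, loc_p)
#include <petsc/finclude/petscdmda.h>
      use petscdmda
      implicit none
      DM  da1dof
      Vec loc_p
      PetscScalar, pointer :: p(:,:,:)
      PetscErrorCode ierr
      PetscInt i, j, k, xs, ys, zs, xm, ym, zm

      ! index ranges of the local (ghosted) patch in global numbering
      call DMDAGetGhostCorners(da1dof, xs, ys, zs, xm, ym, zm, ierr)

      ! p is indexed with those global ranges; no offset arithmetic needed
      call DMDAVecGetArrayF90(da1dof, loc_p, p, ierr)
      do k = zs, zs + zm - 1
        do j = ys, ys + ym - 1
          do i = xs, xs + xm - 1
            p(i,j,k) = 0.0
          enddo
        enddo
      enddo
      call DMDAVecRestoreArrayF90(da1dof, loc_p, p, ierr)
      end subroutine process_f90

  Remember the matching restore call before the vector is used elsewhere.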
> 
>   Thanks,
> 
>  Matt
>  
> Bastian
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/



Re: [petsc-users] GAMG parallel convergence sensitivity

2019-03-13 Thread Jed Brown via petsc-users
Mark Lohry via petsc-users  writes:

> For what it's worth, I'm regularly solving much larger problems (1M-100M
> unknowns, unsteady) with this discretization and AMG setup on 500+ cores
> with impressively great convergence, dramatically better than ILU/ASM. This
> just happens to be the first time I've experimented with this extremely low
> Mach number, which is known to have a whole host of issues and generally
> needs low-Mach preconditioners; I was just a bit surprised by this specific
> failure mechanism.

A common technique for low-Mach preconditioning is to convert to
primitive variables (much better conditioned for the solve) and use a
Schur fieldsplit into the pressure space.  For modest time steps, you can
use a SIMPLE-like method ("selfp" in PCFieldSplit lingo) to approximate
that Schur complement.  You can also rediscretize to form that
approximation.  This paper has a bunch of examples of choices for the
state variables and a derivation of the continuous pressure preconditioner
in each case.  (They present it as a classical semi-implicit method, but
that would be the Schur complement preconditioner if using FieldSplit
with a fully implicit or IMEX method.)

https://doi.org/10.1137/090775889
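A sketch of roughly what that looks like in options form (illustrative only; the split names "u" and "p" stand in for however the velocity and pressure fields are actually registered, and the inner preconditioner choices are placeholders):

  -pc_type fieldsplit
  -pc_fieldsplit_type schur
  -pc_fieldsplit_schur_fact_type full
  -pc_fieldsplit_schur_precondition selfp
  -fieldsplit_u_pc_type gamg
  -fieldsplit_p_ksp_type preonly
  -fieldsplit_p_pc_type gamg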


Re: [petsc-users] Compiling Fortran Code

2019-03-13 Thread Smith, Barry F. via petsc-users


  Put the use petscksp statement starting in column 7 of the file: a .F source is compiled as fixed-form Fortran, which reserves columns 1-6 for statement labels and continuation markers.
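  For reference, a minimal fixed-form sketch (assuming the PETSc >= 3.8 module convention; statements start in column 7 or later, while cpp directives such as #include stay in column 1):

      program main
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none
      PetscErrorCode ierr

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      call PetscFinalize(ierr)
      end program main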



> On Mar 13, 2019, at 9:05 PM, Maahi Talukder via petsc-users wrote:
> 
> Hi, 
> 
> Thank you all for your suggestions. I made the changes as suggested.  But now 
> I get the following error-
> .
> [maahi@CB272PP-THINK1 egrid2d]$ make egrid2d
> /home/maahi/petsc/arch-linux2-c-debug/bin/mpif90 -Wall -ffree-line-length-0 
> -Wno-unused-dummy-argument -g  -I/home/maahi/petsc/include 
> -I/home/maahi/petsc/arch-linux2-c-debug/include -Ofast -fdefault-real-8 -c 
> -I/home/maahi/petsc/include -I/home/maahi/petsc/arch-linux2-c-debug/include 
> -Ofast -fdefault-real-8 main.F  
> -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib 
> -L/home/maahi/petsc/arch-linux2-c-debug/lib 
> -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib 
> -L/home/maahi/petsc/arch-linux2-c-debug/lib 
> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/7 
> -L/usr/lib/gcc/x86_64-redhat-linux/7 -lpetsc -lflapack -lfblas -lm -lpthread 
> -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s 
> -lquadmath -lstdc++ -ldl 
> main.F:6:1:
> 
>  use petscksp
>  1
> Error: Non-numeric character in statement label at (1)
> main.F:6:1:
> 
>  use petscksp
>  1
> Error: Unclassifiable statement at (1)
> make: *** [makefile:28: main.o] Error 1
> 
> .
> Any idea how to fix that?
> 
> Thanks,
> Maahi Talukder
> 
> 
> On Wed, Mar 13, 2019 at 8:44 PM Balay, Satish  wrote:
> check petsc makefile format - for ex: 
> src/tao/unconstrained/examples/tutorials/makefile
> 
> Also rename your fortran sources that have petsc calls from .f to .F
> 
> 
> On Wed, 13 Mar 2019, Matthew Knepley via petsc-users wrote:
> 
> > On Wed, Mar 13, 2019 at 7:36 PM Maahi Talukder via petsc-users <
> > petsc-users@mcs.anl.gov> wrote:
> > 
> > > Dear All,
> > >
> > > I am trying to compile a Fortran code. The makefile is as follows:
> > >
> > >
> > > 
> > > # Makefile for egrid2d
> > >
> > > OBJS = main.o egrid2d.o
> > >
> > > FFLAGS = -I/home/maahi/petsc/include
> > > -I/home/maahi/petsc/arch-linux2-c-debug/include -Ofast -fdefault-real-8
> > >
> > > #
> > > # link
> > > #
> > > include ${PETSC_DIR}/lib/petsc/conf/variables
> > > include ${PETSC_DIR}/lib/petsc/conf/rules
> > >
> > > egrid2d: $(OBJS)
> > >
> > > ${FLINKER}  $(OBJS)  -o egrid2d ${PETSC_LIB}
> > >
> > 
> > Move this above your includes
> > 
> The location is fine. Can you change OBJS to a different name - say OBJ [or 
> something else] and see if that works.
> 
> Satish
> 
> > 
> > >
> > > #
> > > # compile
> > > #
> > > main.o:
> > >${FLINKER} -c $(FFLAGS) main.f  ${PETSC_LIB}
> > >
> > 
> > You should not need this rule.
> > 
> >   Thanks,
> > 
> > Matt
> > 
> > 
> > > #
> > > # Common and Parameter Dependencies
> > > #
> > >
> > > main.o:     main.f par2d.f
> > > egrid2d.o: egrid2d.f par2d.f
> > >
> > > .
> > >
> > > But I get the following error-
> > >
> > >
> > > ..
> > > /home/maahi/petsc/arch-linux2-c-debug/bin/mpif90 -Wall
> > > -ffree-line-length-0 -Wno-unused-dummy-argument -g
> > > -I/home/maahi/petsc/include 
> > > -I/home/maahi/petsc/arch-linux2-c-debug/include
> > > -Ofast -fdefault-real-8  -o egrid2d
> > > -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib
> > > -L/home/maahi/petsc/arch-linux2-c-debug/lib
> > > -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib
> > > -L/home/maahi/petsc/arch-linux2-c-debug/lib
> > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/7
> > > -L/usr/lib/gcc/x86_64-redhat-linux/7 -lpetsc -lflapack -lfblas -lm
> > > -lpthread -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm
> > > -lgcc_s -lquadmath -lstdc++ -ldl
> > > /usr/lib/gcc/x86_64-redhat-linux/7/../../../../lib64/crt1.o: In function
> > > `_start':
> > > (.text+0x20): undefined reference to `main'
> > > collect2: error: ld returned 1 exit status
> > > make: *** [makefile:18: egrid2d] Error 1
> > >
> > > 
> > >
> > > Any idea how to fix it ?
> > >
> > > Thanks,
> > > Maahi Talukder
> > >
> > >
> > >
> > >
> > 
> > 
> 



Re: [petsc-users] GAMG parallel convergence sensitivity

2019-03-13 Thread Mark Adams via petsc-users
>
>
>
> Any thoughts here? Is there anything obviously wrong with my setup?
>

Fast and robust solvers for NS require specialized methods that are not
provided in PETSc and the methods tend to require tighter integration with
the meshing and discretization than the algebraic interface supports.

I see you are using 20 smoothing steps. That is very high. Generally you
want to lean on the V-cycle more (i.e., fewer smoothing steps per level and
more outer iterations).

And, full MG is a bit tricky. I would not use it, but if it helps, fine.


> Any way to reduce the dependence of the convergence iterations on the
> parallelism?
>

This comes from the bjacobi smoother. Use jacobi and you will not have a
parallelism problem; jacobi is what bjacobi becomes in the limit of maximal parallelism anyway.
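For concreteness, the kind of options being suggested would look something like this (values are illustrative, not a tuned recommendation for this problem):

  -pc_type gamg
  -pc_mg_cycle_type v
  -mg_levels_ksp_type chebyshev
  -mg_levels_ksp_max_it 2
  -mg_levels_pc_type jacobi

A pointwise jacobi smoother does not change with the parallel decomposition, unlike bjacobi, so it removes that particular source of iteration growth with core count.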


> -- obviously I expect the iteration count to be higher in parallel, but I
> didn't expect such catastrophic failure.
>
>
You are beyond what AMG is designed for. If you press this problem it will
break any solver and will break generic AMG relatively early.

This makes it hard to give much advice. You really just need to test things
and use what works best. There are special purpose methods that you can
implement in PETSc but that is a topic for a significant project.


Re: [petsc-users] Compiling Fortran Code

2019-03-13 Thread Balay, Satish via petsc-users
check petsc makefile format - for ex: 
src/tao/unconstrained/examples/tutorials/makefile

Also rename your fortran sources that have petsc calls from .f to .F
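An illustrative skeleton of that format (file names and flags are placeholders, not the poster's actual project; the variable is called OBJ rather than OBJS, as suggested later in the thread, and the link recipe line must begin with a TAB):

OBJ     = main.o egrid2d.o
FFLAGS  = -Ofast -fdefault-real-8

include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

egrid2d: $(OBJ)
	${FLINKER} -o egrid2d $(OBJ) ${PETSC_LIB}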


On Wed, 13 Mar 2019, Matthew Knepley via petsc-users wrote:

> On Wed, Mar 13, 2019 at 7:36 PM Maahi Talukder via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
> 
> > Dear All,
> >
> > I am trying to compile a Fortran code. The makefile is as follows:
> >
> >
> > 
> > # Makefile for egrid2d
> >
> > OBJS = main.o egrid2d.o
> >
> > FFLAGS = -I/home/maahi/petsc/include
> > -I/home/maahi/petsc/arch-linux2-c-debug/include -Ofast -fdefault-real-8
> >
> > #
> > # link
> > #
> > include ${PETSC_DIR}/lib/petsc/conf/variables
> > include ${PETSC_DIR}/lib/petsc/conf/rules
> >
> > egrid2d: $(OBJS)
> >
> > ${FLINKER}  $(OBJS)  -o egrid2d ${PETSC_LIB}
> >
> 
> Move this above your includes
> 
The location is fine. Can you change OBJS to a different name - say OBJ [or 
something else] and see if that works.

Satish

> 
> >
> > #
> > # compile
> > #
> > main.o:
> >${FLINKER} -c $(FFLAGS) main.f  ${PETSC_LIB}
> >
> 
> You should not need this rule.
> 
>   Thanks,
> 
> Matt
> 
> 
> > #
> > # Common and Parameter Dependencies
> > #
> >
> > main.o:     main.f par2d.f
> > egrid2d.o: egrid2d.f par2d.f
> >
> > .
> >
> > But I get the following error-
> >
> >
> > ..
> > /home/maahi/petsc/arch-linux2-c-debug/bin/mpif90 -Wall
> > -ffree-line-length-0 -Wno-unused-dummy-argument -g
> > -I/home/maahi/petsc/include -I/home/maahi/petsc/arch-linux2-c-debug/include
> > -Ofast -fdefault-real-8  -o egrid2d
> > -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib
> > -L/home/maahi/petsc/arch-linux2-c-debug/lib
> > -Wl,-rpath,/home/maahi/petsc/arch-linux2-c-debug/lib
> > -L/home/maahi/petsc/arch-linux2-c-debug/lib
> > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/7
> > -L/usr/lib/gcc/x86_64-redhat-linux/7 -lpetsc -lflapack -lfblas -lm
> > -lpthread -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm
> > -lgcc_s -lquadmath -lstdc++ -ldl
> > /usr/lib/gcc/x86_64-redhat-linux/7/../../../../lib64/crt1.o: In function
> > `_start':
> > (.text+0x20): undefined reference to `main'
> > collect2: error: ld returned 1 exit status
> > make: *** [makefile:18: egrid2d] Error 1
> >
> > 
> >
> > Any idea how to fix it ?
> >
> > Thanks,
> > Maahi Talukder
> >
> >
> >
> >
> 
> 



[petsc-users] Issue when passing DMDA array on to Paraview Catalyst

2019-03-13 Thread Bastian Löhrer via petsc-users

Dear PETSc users,

I am having difficulties passing PETSc data on to Paraview Catalyst and 
it may be related to the way we handle the PETSc data in our Fortran code.


We have DMDA objects, which we pass on to subroutines this way:


  ...
  call DMCreateLocalVector(da1dof, loc_p, ierr)
  ...
  call VecGetArray(loc_p, loc_p_v, loc_p_i, ierr)
  call process( loc_p_v(loc_p_i+1) )
  ...



Inside the subroutine (process in this example) we treat the 
subroutine's argument as if it were an ordinary Fortran array:



  subroutine process( p )

    use gridinfo ! provides gis, gie, ... etc.

    implicit none

#include "petsc_include.h"

    PetscScalar, dimension(gis:gie,gjs:gje,gks:gke) :: p
    PetscInt i,j,k

    do k = gks, gke
      do j = gjs, gje
        do i = gis, gie

          p(i,j,k) = ...

        enddo
      enddo
    enddo

  end subroutine process

I find this procedure a little quirky, but it has been working 
flawlessly for years.


However, I am now encountering difficulties when passing this 
variable/array p on to a Paraview Catalyst adaptor subroutine. Doing so 
I end up with very strange values there. When replacing p with an 
ordinary local Fortran array everything is fine.


Bastian



Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Manuel Colera Rico via petsc-users

After adding that line the problem gets fixed.

Regards,

Manuel

---

On 3/13/19 3:13 PM, Zhang, Junchao wrote:

Manuel,
  Could you try to add this line
     sbaij->free_imax_ilen = PETSC_TRUE;
 after line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c


 PS: Matt, this bug looks unrelated to my VecRestoreArrayRead_Nest fix.

--Junchao Zhang


On Wed, Mar 13, 2019 at 9:05 AM Matthew Knepley wrote:


On Wed, Mar 13, 2019 at 9:44 AM Manuel Colera Rico via petsc-users
<petsc-users@mcs.anl.gov> wrote:

Yes:

[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c


Junchao, do imax and ilen get missed in the Destroy when the user
provides arrays?


https://bitbucket.org/petsc/petsc/src/06a3e802b3873ffbfd04b71a0821522327dd9b04/src/mat/impls/sbaij/seq/sbaij.c#lines-2431

    Matt

I have checked that I have destroyed all the MatNest matrices
and all
the submatrices individually.

Manuel

---

On 3/13/19 2:28 PM, Jed Brown wrote:
> Is there any output if you run with -malloc_dump?
>
> Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> writes:
>
>> Hi, Junchao,
>>
>> I have installed the newest version of PETSc and it works
fine. I just
>> get the following memory leak warning:
>>
>> Direct leak of 28608 byte(s) in 12 object(s) allocated from:
>>   #0 0x7f1ddd5caa38 in __interceptor_memalign
>>
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
>>   #1 0x7f1ddbef1213 in PetscMallocAlign
>>

(/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)
>>
>> Thank you,
>>
>> Manuel
>>
>> ---
>>
>> On 3/12/19 7:08 PM, Zhang, Junchao wrote:
>>> Hi, Manuel,
>>>    I recently fixed a problem in VecRestoreArrayRead.
Basically, I
>>> added VecRestoreArrayRead_Nest. Could you try the master
branch of
>>> PETSc to see if it fixes your problem?
>>>    Thanks.
>>>
>>> --Junchao Zhang
>>>
>>>
>>> On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via
petsc-users
>>> <petsc-users@mcs.anl.gov> wrote:
>>>
>>>      Hello,
>>>
>>>      I need to solve a 2*2 block linear system. The
matrices A_00, A_01,
>>>      A_10, A_11 are constructed separately via
>>>      MatCreateSeqAIJWithArrays and
>>>      MatCreateSeqSBAIJWithArrays. Then, I construct the
full system matrix
>>>      with MatCreateNest, and use MatNestGetISs and
PCFieldSplitSetIS to
>>>      set
>>>      up the PC, trying to follow the procedure described here:
>>>

https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.
>>>
>>>      However, when I run the code with Leak Sanitizer, I
get the
>>>      following error:
>>>
>>>
=
>>>      ==54927==ERROR: AddressSanitizer: attempting free on
address which
>>>      was
>>>      not malloc()-ed: 0x62751ab8 in thread T0
>>>       #0 0x7fbd95c08f30 in __interceptor_free
>>>
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
>>>       #1 0x7fbd92b99dcd in PetscFreeAlign
>>>

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
>>>       #2 0x7fbd92ce0178 in VecRestoreArray_Nest
>>>

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
>>>       #3 0x7fbd92cd627d in VecRestoreArrayRead
>>>

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
>>>       #4 0x7fbd92d1189e in VecScatterBegin_SSToSS
>>>

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
>>>       #5 0x7fbd92d1a414 in VecScatterBegin
>>>

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
 

Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Zhang, Junchao via petsc-users
Manuel,
  Could you try to add this line
 sbaij->free_imax_ilen = PETSC_TRUE;
 after line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c

 PS: Matt, this bug looks unrelated to my VecRestoreArrayRead_Nest fix.

--Junchao Zhang


On Wed, Mar 13, 2019 at 9:05 AM Matthew Knepley <knep...@gmail.com> wrote:
On Wed, Mar 13, 2019 at 9:44 AM Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
Yes:

[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c

Junchao, do imax and ilen get missed in the Destroy when the user provides 
arrays?

  
https://bitbucket.org/petsc/petsc/src/06a3e802b3873ffbfd04b71a0821522327dd9b04/src/mat/impls/sbaij/seq/sbaij.c#lines-2431

Matt

I have checked that I have destroyed all the MatNest matrices and all
the submatrices individually.

Manuel

---

On 3/13/19 2:28 PM, Jed Brown wrote:
> Is there any output if you run with -malloc_dump?
>
> Manuel Colera Rico via petsc-users 
> <petsc-users@mcs.anl.gov> writes:
>
>> Hi, Junchao,
>>
>> I have installed the newest version of PETSc and it works fine. I just
>> get the following memory leak warning:
>>
>> Direct leak of 28608 byte(s) in 12 object(s) allocated from:
>>   #0 0x7f1ddd5caa38 in __interceptor_memalign
>> ../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
>>   #1 0x7f1ddbef1213 in PetscMallocAlign
>> (/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)
>>
>> Thank you,
>>
>> Manuel
>>
>> ---
>>
>> On 3/12/19 7:08 PM, Zhang, Junchao wrote:
>>> Hi, Manuel,
>>>I recently fixed a problem in VecRestoreArrayRead. Basically, I
>>> added VecRestoreArrayRead_Nest. Could you try the master branch of
>>> PETSc to see if it fixes your problem?
>>>Thanks.
>>>
>>> --Junchao Zhang
>>>
>>>
>>> On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users
>>> <petsc-users@mcs.anl.gov> wrote:
>>>
>>>  Hello,
>>>
>>>  I need to solve a 2*2 block linear system. The matrices A_00, A_01,
>>>  A_10, A_11 are constructed separately via
>>>  MatCreateSeqAIJWithArrays and
>>>  MatCreateSeqSBAIJWithArrays. Then, I construct the full system matrix
>>>  with MatCreateNest, and use MatNestGetISs and PCFieldSplitSetIS to
>>>  set
>>>  up the PC, trying to follow the procedure described here:
>>>  
>>> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.
>>>
>>>  However, when I run the code with Leak Sanitizer, I get the
>>>  following error:
>>>
>>>  =
>>>  ==54927==ERROR: AddressSanitizer: attempting free on address which
>>>  was
>>>  not malloc()-ed: 0x62751ab8 in thread T0
>>>   #0 0x7fbd95c08f30 in __interceptor_free
>>>  ../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
>>>   #1 0x7fbd92b99dcd in PetscFreeAlign
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
>>>   #2 0x7fbd92ce0178 in VecRestoreArray_Nest
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
>>>   #3 0x7fbd92cd627d in VecRestoreArrayRead
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
>>>   #4 0x7fbd92d1189e in VecScatterBegin_SSToSS
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
>>>   #5 0x7fbd92d1a414 in VecScatterBegin
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
>>>   #6 0x7fbd934a999c in PCApply_FieldSplit
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa5699c)
>>>   #7 0x7fbd93369071 in PCApply
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x916071)
>>>   #8 0x7fbd934efe77 in KSPInitialResidual
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa9ce77)
>>>   #9 0x7fbd9350272c in KSPSolve_GMRES
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xaaf72c)
>>>   #10 0x7fbd934e3c01 in KSPSolve
>>>  
>>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa90c01)
>>>
>>>  Disabling Leak Sanitizer also outputs an "invalid pointer" error.
>>>
>>>  Did I forget something when writing the code?
>>>
>>>  Thank you,

Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Lawrence Mitchell via petsc-users



> On 13 Mar 2019, at 14:04, Matthew Knepley via petsc-users wrote:
> 
> On Wed, Mar 13, 2019 at 9:44 AM Manuel Colera Rico via petsc-users wrote:
> Yes:
> 
> [ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
> /opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
> [ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
> /opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
> [ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
> /opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
> [ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
> /opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
> 
> Junchao, do imax and ilen get missed in the Destroy when the user provides 
> arrays?
> 
>   
> https://bitbucket.org/petsc/petsc/src/06a3e802b3873ffbfd04b71a0821522327dd9b04/src/mat/impls/sbaij/seq/sbaij.c#lines-2431

Looks like it. One probably needs something like:

diff --git a/src/mat/impls/sbaij/seq/sbaij.c b/src/mat/impls/sbaij/seq/sbaij.c
index 2b98394140..c39fc696d8 100644
--- a/src/mat/impls/sbaij/seq/sbaij.c
+++ b/src/mat/impls/sbaij/seq/sbaij.c
@@ -2442,6 +2442,7 @@ PetscErrorCode  MatCreateSeqSBAIJWithArrays(MPI_Comm comm,PetscInt bs,PetscInt m
   sbaij->nonew= -1; /*this indicates that inserting a new value in the matrix that generates a new nonzero is an error*/
   sbaij->free_a   = PETSC_FALSE;
   sbaij->free_ij  = PETSC_FALSE;
+  sbaij->free_imax_ilen = PETSC_TRUE;
 
   for (ii=0; ii<m; ii++) sbaij->ilen[ii] = sbaij->imax[ii] = i[ii+1] - i[ii];

Lawrence

Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Manuel Colera Rico via petsc-users
The warning that Leak Sanitizer gives me is not what I wrote two 
messages before (I apologize). It is:


Direct leak of 25920 byte(s) in 4 object(s) allocated from:
    #0 0x7fa97e35aa38 in __interceptor_memalign 
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
    #1 0x7fa97cc81213 in PetscMallocAlign 
(/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)


which seems to be in accordance (at least in number of leaked bytes) to 
-malloc_dump's output.


Manuel

---

On 3/13/19 2:44 PM, Manuel Colera Rico wrote:

Yes:

[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c


I have checked that I have destroyed all the MatNest matrices and all 
the submatrices individually.


Manuel

---

On 3/13/19 2:28 PM, Jed Brown wrote:

Is there any output if you run with -malloc_dump?

Manuel Colera Rico via petsc-users  writes:


Hi, Junchao,

I have installed the newest version of PETSc and it works fine. I just
get the following memory leak warning:

Direct leak of 28608 byte(s) in 12 object(s) allocated from:
      #0 0x7f1ddd5caa38 in __interceptor_memalign
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
      #1 0x7f1ddbef1213 in PetscMallocAlign
(/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213) 



Thank you,

Manuel

---

On 3/12/19 7:08 PM, Zhang, Junchao wrote:

Hi, Manuel,
   I recently fixed a problem in VecRestoreArrayRead. Basically, I
added VecRestoreArrayRead_Nest. Could you try the master branch of
PETSc to see if it fixes your problem?
   Thanks.

--Junchao Zhang


On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users
<petsc-users@mcs.anl.gov> wrote:

 Hello,

 I need to solve a 2*2 block linear system. The matrices A_00, 
A_01,

 A_10, A_11 are constructed separately via
 MatCreateSeqAIJWithArrays and
 MatCreateSeqSBAIJWithArrays. Then, I construct the full system 
matrix
 with MatCreateNest, and use MatNestGetISs and 
PCFieldSplitSetIS to

 set
 up the PC, trying to follow the procedure described here:
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.

 However, when I run the code with Leak Sanitizer, I get the
 following error:

=
 ==54927==ERROR: AddressSanitizer: attempting free on address 
which

 was
 not malloc()-ed: 0x62751ab8 in thread T0
  #0 0x7fbd95c08f30 in __interceptor_free
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
  #1 0x7fbd92b99dcd in PetscFreeAlign
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
  #2 0x7fbd92ce0178 in VecRestoreArray_Nest
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
  #3 0x7fbd92cd627d in VecRestoreArrayRead
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
  #4 0x7fbd92d1189e in VecScatterBegin_SSToSS
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
  #5 0x7fbd92d1a414 in VecScatterBegin
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
  #6 0x7fbd934a999c in PCApply_FieldSplit
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa5699c)
  #7 0x7fbd93369071 in PCApply
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x916071)
  #8 0x7fbd934efe77 in KSPInitialResidual
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa9ce77)
  #9 0x7fbd9350272c in KSPSolve_GMRES
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xaaf72c)
  #10 0x7fbd934e3c01 in KSPSolve
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa90c01)

 Disabling Leak Sanitizer also outputs an "invalid pointer" error.

 Did I forget something when writing the code?

 Thank you,

 Manuel

 ---



Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Manuel Colera Rico via petsc-users

Yes:

[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]8416 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c
[ 0]4544 bytes MatCreateSeqSBAIJWithArrays() line 2431 in 
/opt/PETSc_library/petsc-3.10.4/src/mat/impls/sbaij/seq/sbaij.c


I have checked that I have destroyed all the MatNest matrices and all 
the submatrices individually.


Manuel

---

On 3/13/19 2:28 PM, Jed Brown wrote:

Is there any output if you run with -malloc_dump?

Manuel Colera Rico via petsc-users  writes:


Hi, Junchao,

I have installed the newest version of PETSc and it works fine. I just
get the following memory leak warning:

Direct leak of 28608 byte(s) in 12 object(s) allocated from:
      #0 0x7f1ddd5caa38 in __interceptor_memalign
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
      #1 0x7f1ddbef1213 in PetscMallocAlign
(/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)

Thank you,

Manuel

---

On 3/12/19 7:08 PM, Zhang, Junchao wrote:

Hi, Manuel,
   I recently fixed a problem in VecRestoreArrayRead. Basically, I
added VecRestoreArrayRead_Nest. Could you try the master branch of
PETSc to see if it fixes your problem?
   Thanks.

--Junchao Zhang


On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users
<petsc-users@mcs.anl.gov> wrote:

 Hello,

 I need to solve a 2*2 block linear system. The matrices A_00, A_01,
 A_10, A_11 are constructed separately via
 MatCreateSeqAIJWithArrays and
 MatCreateSeqSBAIJWithArrays. Then, I construct the full system matrix
 with MatCreateNest, and use MatNestGetISs and PCFieldSplitSetIS to
 set
 up the PC, trying to follow the procedure described here:
 
https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.

 However, when I run the code with Leak Sanitizer, I get the
 following error:

 =
 ==54927==ERROR: AddressSanitizer: attempting free on address which
 was
 not malloc()-ed: 0x62751ab8 in thread T0
  #0 0x7fbd95c08f30 in __interceptor_free
 ../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
  #1 0x7fbd92b99dcd in PetscFreeAlign
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
  #2 0x7fbd92ce0178 in VecRestoreArray_Nest
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
  #3 0x7fbd92cd627d in VecRestoreArrayRead
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
  #4 0x7fbd92d1189e in VecScatterBegin_SSToSS
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
  #5 0x7fbd92d1a414 in VecScatterBegin
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
  #6 0x7fbd934a999c in PCApply_FieldSplit
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa5699c)
  #7 0x7fbd93369071 in PCApply
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x916071)
  #8 0x7fbd934efe77 in KSPInitialResidual
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa9ce77)
  #9 0x7fbd9350272c in KSPSolve_GMRES
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xaaf72c)
  #10 0x7fbd934e3c01 in KSPSolve
 
(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa90c01)

 Disabling Leak Sanitizer also outputs an "invalid pointer" error.

 Did I forget something when writing the code?

 Thank you,

 Manuel

 ---



Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Jed Brown via petsc-users
Is there any output if you run with -malloc_dump?

Manuel Colera Rico via petsc-users  writes:

> Hi, Junchao,
>
> I have installed the newest version of PETSc and it works fine. I just 
> get the following memory leak warning:
>
> Direct leak of 28608 byte(s) in 12 object(s) allocated from:
>      #0 0x7f1ddd5caa38 in __interceptor_memalign 
> ../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
>      #1 0x7f1ddbef1213 in PetscMallocAlign 
> (/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)
>
> Thank you,
>
> Manuel
>
> ---
>
> On 3/12/19 7:08 PM, Zhang, Junchao wrote:
>> Hi, Manuel,
>>   I recently fixed a problem in VecRestoreArrayRead. Basically, I 
>> added VecRestoreArrayRead_Nest. Could you try the master branch of 
>> PETSc to see if it fixes your problem?
>>   Thanks.
>>
>> --Junchao Zhang
>>
>>
>> On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users 
>> <petsc-users@mcs.anl.gov> wrote:
>>
>> Hello,
>>
>> I need to solve a 2*2 block linear system. The matrices A_00, A_01,
>> A_10, A_11 are constructed separately via
>> MatCreateSeqAIJWithArrays and
>> MatCreateSeqSBAIJWithArrays. Then, I construct the full system matrix
>> with MatCreateNest, and use MatNestGetISs and PCFieldSplitSetIS to
>> set
>> up the PC, trying to follow the procedure described here:
>> 
>> https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.
>>
>> However, when I run the code with Leak Sanitizer, I get the
>> following error:
>>
>> =
>> ==54927==ERROR: AddressSanitizer: attempting free on address which
>> was
>> not malloc()-ed: 0x62751ab8 in thread T0
>>  #0 0x7fbd95c08f30 in __interceptor_free
>> ../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
>>  #1 0x7fbd92b99dcd in PetscFreeAlign
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
>>  #2 0x7fbd92ce0178 in VecRestoreArray_Nest
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
>>  #3 0x7fbd92cd627d in VecRestoreArrayRead
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
>>  #4 0x7fbd92d1189e in VecScatterBegin_SSToSS
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
>>  #5 0x7fbd92d1a414 in VecScatterBegin
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
>>  #6 0x7fbd934a999c in PCApply_FieldSplit
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa5699c)
>>  #7 0x7fbd93369071 in PCApply
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x916071)
>>  #8 0x7fbd934efe77 in KSPInitialResidual
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa9ce77)
>>  #9 0x7fbd9350272c in KSPSolve_GMRES
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xaaf72c)
>>  #10 0x7fbd934e3c01 in KSPSolve
>> 
>> (/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa90c01)
>>
>> Disabling Leak Sanitizer also outputs an "invalid pointer" error.
>>
>> Did I forget something when writing the code?
>>
>> Thank you,
>>
>> Manuel
>>
>> ---
>>


Re: [petsc-users] PCFieldSplit with MatNest

2019-03-13 Thread Manuel Colera Rico via petsc-users

Hi, Junchao,

I have installed the newest version of PETSc and it works fine. I just 
get the following memory leak warning:


Direct leak of 28608 byte(s) in 12 object(s) allocated from:
    #0 0x7f1ddd5caa38 in __interceptor_memalign 
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:111
    #1 0x7f1ddbef1213 in PetscMallocAlign 
(/opt/PETSc_library/petsc-3.10.4/mcr_20190313/lib/libpetsc.so.3.10+0x150213)


Thank you,

Manuel

---

On 3/12/19 7:08 PM, Zhang, Junchao wrote:

Hi, Manuel,
  I recently fixed a problem in VecRestoreArrayRead. Basically, I 
added VecRestoreArrayRead_Nest. Could you try the master branch of 
PETSc to see if it fixes your problem?

  Thanks.

--Junchao Zhang


On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users 
<petsc-users@mcs.anl.gov> wrote:


Hello,

I need to solve a 2*2 block linear system. The matrices A_00, A_01,
A_10, A_11 are constructed separately via
MatCreateSeqAIJWithArrays and
MatCreateSeqSBAIJWithArrays. Then, I construct the full system matrix
with MatCreateNest, and use MatNestGetISs and PCFieldSplitSetIS to
set
up the PC, trying to follow the procedure described here:

https://www.mcs.anl.gov/petsc/petsc-current/src/snes/examples/tutorials/ex70.c.html.

However, when I run the code with Leak Sanitizer, I get the
following error:

=
==54927==ERROR: AddressSanitizer: attempting free on address which
was
not malloc()-ed: 0x62751ab8 in thread T0
 #0 0x7fbd95c08f30 in __interceptor_free
../../../../gcc-8.1.0/libsanitizer/asan/asan_malloc_linux.cc:66
 #1 0x7fbd92b99dcd in PetscFreeAlign

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x146dcd)
 #2 0x7fbd92ce0178 in VecRestoreArray_Nest

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28d178)
 #3 0x7fbd92cd627d in VecRestoreArrayRead

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x28327d)
 #4 0x7fbd92d1189e in VecScatterBegin_SSToSS

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2be89e)
 #5 0x7fbd92d1a414 in VecScatterBegin

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x2c7414)
 #6 0x7fbd934a999c in PCApply_FieldSplit

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa5699c)
 #7 0x7fbd93369071 in PCApply

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0x916071)
 #8 0x7fbd934efe77 in KSPInitialResidual

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa9ce77)
 #9 0x7fbd9350272c in KSPSolve_GMRES

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xaaf72c)
 #10 0x7fbd934e3c01 in KSPSolve

(/opt/PETSc_library/petsc/manuel_OpenBLAS_petsc/lib/libpetsc.so.3.8+0xa90c01)

Disabling Leak Sanitizer also outputs an "invalid pointer" error.

Did I forget something when writing the code?

Thank you,

Manuel

---



Re: [petsc-users] MatCompositeMerge + MatCreateRedundantMatrix

2019-03-13 Thread Marius Buerkle via petsc-users
Indeed, it was very easy to add. Are you going to include the Fortran interface for MatCreateSubMatricesMPI in future releases of PETSc?

Regarding my initial problem, thanks a lot. It works very well with MatCreateSubMatricesMPI and the solution can be implemented in a few lines.

Thanks and Best,

Marius

On Tue, Mar 12, 2019 at 4:50 AM Marius Buerkle  wrote:

I tried to follow your suggestions but it seems there is no MatCreateSubMatricesMPI for Fortran. Is this correct?

We just have to write the binding. It's almost identical to MatCreateSubMatrices() in src/mat/interface/ftn-custom/zmatrixf.c

   Matt

On Wed, Feb 20, 2019 at 6:57 PM Marius Buerkle  wrote:

ok, I think I understand now. I will give it a try and if there is some trouble come back to you. thanks.

Cool.

   Matt

marius

On Tue, Feb 19, 2019 at 8:42 PM Marius Buerkle  wrote:

ok, so it seems there is no straightforward way to transfer data between PETSc matrices on different subcomms. Probably doing it by "hand", extracting the matrices on the subcomms, creating an MPI_INTERCOMM, transferring the data to PETSC_COMM_WORLD and assembling them in a new PETSc matrix would be possible, right?

That sounds too complicated. Why not just reverse MatCreateSubMatricesMPI()? Meaning make it collective on the whole big communicator, so that you can swap out all the subcommunicators for the aggregation call, just like we do in that function.

Then it's really just a matter of reversing the communication call.

   Matt

On Tue, Feb 19, 2019 at 7:12 PM Marius Buerkle  wrote:

I see. This would work if the matrices are on different subcommunicators? Is it possible to add this functionality?

Hmm, no. That is specialized to serial matrices. You need the inverse of MatCreateSubMatricesMPI().

  Thanks,

     Matt

marius

You basically need the inverse of MatCreateSubMatrices(). I do not think we have that right now, but it could probably be done without too much trouble by looking at that code.

  Thanks,

     Matt

On Tue, Feb 19, 2019 at 6:15 AM Marius Buerkle via petsc-users  wrote:

Hi !

Is there some way to combine MatCompositeMerge with MatCreateRedundantMatrix? I basically want to create copies of a matrix from PETSC_COMM_WORLD to subcommunicators, do some work on each subcommunicator and then gather the results back to PETSC_COMM_WORLD, namely I want to sum the individual matrices from the subcommunicators component-wise and get the resulting matrix on PETSC_COMM_WORLD. Is this somehow possible without going through all the hassle of using MPI directly?

marius

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/