Re: [petsc-users] PetscSFReduceBegin can not handle MPI_CHAR?

2019-04-03 Thread Zhang, Junchao via petsc-users


On Wed, Apr 3, 2019, 10:29 PM Fande Kong <fdkong...@gmail.com> wrote:
Thanks for the reply. It is not necessary for me to use MPI_SUM. I think the
better choice is MPIU_REPLACE. Doesn't MPIU_REPLACE work for any MPI datatype?
Yes.
Fande
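
For reference, a minimal sketch of what the corrected call would look like
once PetscSF handles char types (assuming the same ptap->sf, rmtspace, and
space as in Fande's snippet quoted below; in PetscSF the replace operation
merely overwrites the root values during unpack, so the MPI restriction on
reducing MPI_CHAR never comes into play):

  /* sketch: replace instead of sum, valid for any datatype */
  ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr);
  ierr = PetscSFReduceEnd(ptap->sf,MPI_CHAR,rmtspace,space,MPIU_REPLACE);CHKERRQ(ierr);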


On Apr 3, 2019, at 9:15 PM, Zhang, Junchao <jczh...@mcs.anl.gov> wrote:


On Wed, Apr 3, 2019 at 3:41 AM Lisandro Dalcin via petsc-users
<petsc-users@mcs.anl.gov> wrote:
IIRC, MPI_CHAR is for ASCII text data. Also, remember that in C the signedness
of plain `char` is implementation (or platform?) dependent.
I'm not sure MPI_Reduce() is supposed to handle MPI_CHAR; you should
use MPI_{SIGNED|UNSIGNED}_CHAR for that. Note, however, that MPI_SIGNED_CHAR is
only available from MPI 2.0 on.
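
As a plain-MPI illustration of this point, a minimal sketch with hypothetical
one-element buffers:

  /* sketch: MPI_SIGNED_CHAR (or MPI_UNSIGNED_CHAR) is valid in a reduction;
     substituting MPI_CHAR here is invalid per the MPI standard */
  signed char local = 1, total = 0;
  MPI_Reduce(&local, &total, 1, MPI_SIGNED_CHAR, MPI_SUM, 0, MPI_COMM_WORLD);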

The MPI standard, chapter 5.9.3, says "MPI_CHAR, MPI_WCHAR, and MPI_CHARACTER
(which represent printable characters) cannot be used in reduction operations."
So Fande's code and Jed's branch have problems. To fix that, we have to add
support for signed char, unsigned char, and plain char in PetscSF. The first
two types support add, mult, and the logical and bitwise operations; the last
is a dumb type that supports only pack/unpack. With this fix, PetscSF/MPI would
raise an error on Fande's code. I can come up with a fix tomorrow.


On Wed, 3 Apr 2019 at 07:01, Fande Kong via petsc-users
<petsc-users@mcs.anl.gov> wrote:
Hi All,

There were some error messages when using PetscSFReduceBegin with MPI_CHAR.

ierr = PetscSFReduceBegin(ptap->sf,MPI_CHAR,rmtspace,space,MPI_SUM);CHKERRQ(ierr);


My question would be: Is PetscSFReduceBegin supposed to work with MPI_CHAR? If
not, should we document that somewhere?

Thanks

Fande,


[0]PETSC ERROR: - Error Message 
--
[0]PETSC ERROR: No support for this operation for this object type
[0]PETSC ERROR: No support for type size not divisible by 4
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e  GIT 
Date: 2019-04-02 17:37:18 -0600
[0]PETSC ERROR: [1]PETSC ERROR: - Error Message 
--
[1]PETSC ERROR: No support for this operation for this object type
[1]PETSC ERROR: No support for type size not divisible by 4
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.10.4-1989-gd816d1587e  GIT 
Date: 2019-04-02 17:37:18 -0600
[1]PETSC ERROR: ./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named 
fn605731.local by kongf Tue Apr  2 21:48:41 2019
[1]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes 
--with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 
--download-parmetis=1 --download-superlu_dist=1 
PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch 
--download-party --download-chaco --with-cxx-dialect=C++11
[1]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[1]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[1]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
./ex90 on a arch-linux2-c-dbg-feature-ptap-all-at-once named fn605731.local by 
kongf Tue Apr  2 21:48:41 2019
[0]PETSC ERROR: Configure options --download-hypre=1 --with-debugging=yes 
--with-shared-libraries=1 --download-fblaslapack=1 --download-metis=1 
--download-parmetis=1 --download-superlu_dist=1 
PETSC_ARCH=arch-linux2-c-dbg-feature-ptap-all-at-once --download-ptscotch 
--download-party --download-chaco --with-cxx-dialect=C++11
[0]PETSC ERROR: #1 PetscSFBasicPackTypeSetup() line 678 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #2 PetscSFBasicGetPack() line 804 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #3 PetscSFReduceBegin_Basic() line 1024 in 
/Users/kongf/projects/petsc/src/vec/is/sf/impls/basic/sfbasic.c
[0]PETSC ERROR: #4 PetscSFReduceBegin() line 1208 in 
/Users/kongf/projects/petsc/src/vec/is/sf/interface/sf.c
[0]PETSC ERROR: #5 MatPtAPNumeric_MPIAIJ_MPIAIJ_allatonce() line 850 in 
/Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[0]PETSC ERROR: #6 MatPtAP_MPIAIJ_MPIAIJ() line 202 in 
/Users/kongf/projects/petsc/src/mat/impls/aij/mpi/mpiptap.c
[0]PETSC ERROR: #7 MatPtAP() line 9429 in 
/Users/kongf/projects/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: #8 main() line 58 in 
/Users/kongf/projects/petsc/src/mat/examples/tests/ex90.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -matptap_via allatonce
[0]PETSC ERROR: End of Error Message ---send entire error
message to petsc-ma...@mcs.anl.gov--

Re: [petsc-users] PetscSFReduceBegin can not handle MPI_CHAR?

2019-04-03 Thread Smith, Barry F. via petsc-users


  It is probably not written down anywhere (where would we write it?), nor does
./configure check it, but the consensus is that PETSc now requires MPI 2.0 or
later. So developers should feel free to use any MPI 2.0 feature without
guards, and if you stumble across older code that still has guards/configure
tests for 2.0 features, those may be removed.

   Barry

  On the other hand we still require C89* code so no one would make the mistake 
of thinking PETSc is cutting edge ;)

  * One word -- Microsoft


> On Apr 3, 2019, at 3:39 AM, Lisandro Dalcin via petsc-users 
>  wrote:
> 
> IIRC, MPI_CHAR is for ASCII text data. Also, remember that in C the
> signedness of plain `char` is implementation (or platform?) dependent.
> I'm not sure MPI_Reduce() is supposed to handle MPI_CHAR; you should use
> MPI_{SIGNED|UNSIGNED}_CHAR for that. Note, however, that MPI_SIGNED_CHAR is
> only available from MPI 2.0 on.

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-04-03 Thread Jed Brown via petsc-users

Myriam Peyrounette via petsc-users writes:

> Hi all,
>
> for your information, you'll find attached the comparison of the weak
> memory scalings when using:
>
> - PETSc 3.6.4 (reference)
> - PETSc 3.10.4 without specific options
> - PETSc 3.10.4 with the three scalability options you mentioned
>
> Using the scalability options does improve the memory scaling. However,
> the 3.6 version still has a better one...

Yes, this still looks significant.  Is this an effect we can reproduce
with a PETSc example and/or using a memory profiler (such as massif or
gperftools)?  I think it's important for us to narrow down what causes
this difference (it looks like almost 2x on your 1e8 problem size) so we
can fix it.


Re: [petsc-users] Integrate command line arguments -bv_type vecs -bv_type mat to python code

2019-04-03 Thread Jose E. Roman via petsc-users

You could just do

  E.getBV().setType(SLEPc.BV.Type.VECS)

before E.setFromOptions()

Jose
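
For C users, a minimal sketch of the equivalent calls (assuming an EPS object
eps and the usual ierr error checking; BVVECS is the C-level name behind
-bv_type vecs):

  BV bv;
  ierr = EPSGetBV(eps,&bv);CHKERRQ(ierr);
  ierr = BVSetType(bv,BVVECS);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr); /* options can still override */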





[petsc-users] Integrate command line arguments -bv_type vecs -bv_type mat to python code

2019-04-03 Thread Jan Grießer via petsc-users

Hello, everybody,
I use slepc4py to solve for the lowest eigenvalues and corresponding
eigenvectors of systems up to 20 million x 20 million. My problem so far was
that the basis was too big. On advice here in the forum I always added
-bv_type vecs -bv_type mat on the command line, which solved my problems. I
don't want to keep passing these two options on the command line; I want to
set them directly in the Python code before E.solve() is called. However, I'm
not quite sure how to call these functions correctly in Python.
Currently I do it like this:

bv = SLEPc.BV().create()
bv.setType(VECS),
bv.setType(Mat), E.setBV(bv)

Could you tell me if that's correct?
Thank you very much in advance!


Re: [petsc-users] petsc4py - Convert c example code to Python

2019-04-03 Thread Nicola Creati via petsc-users

Hello,
Thanks to all. I used the wrong RHS and Jacobian functions in my
conversion. I modified the code and got the right solution. I'm
attaching the final conversion below; someone might be interested in a
Python version of ex13 from the TS tutorials.


"""
Python version of the ex13.c PETSc TS component at:
https://www.mcs.anl.gov/petsc/petsc-current/src/ts/examples/tutorials/ex13.c.html
"""

import sys, petsc4py
petsc4py.init(sys.argv)

from petsc4py import PETSc
from mpi4py import MPI
import numpy as np
import math

def RHS_func(ts, t, U, F, *args):
    da = ts.getDM()
    localU = da.getLocalVec()

    da.globalToLocal(U, localU)

    mx, my = da.getSizes()

    hx, hy = [1.0/(m-1) for m in [mx, my]]
    sx = 1.0/(hx*hx)
    sy = 1.0/(hy*hy)

    uarray = localU.getArray(readonly=1).reshape(8, 8, order='C')
    f = F.getArray(readonly=0).reshape(8, 8, order='C')

    (xs, xm), (ys, ym) = da.getRanges()
    for j in range(ys, ym):
        for i in range(xs, xm):
            if i == 0 or j == 0 or i == (mx-1) or j == (my-1):
                f[i, j] = uarray[i, j]
                continue
            u = uarray[i, j]
            uxx = (-2.0 * u + uarray[i, j-1] + uarray[i, j+1]) * sx
            uyy = (-2.0 * u + uarray[i-1, j] + uarray[i+1, j]) * sy
            f[i, j] = uxx + uyy
    F.assemble()

def Jacobian_func(ts, t, U, A, B, *args):
    mx, my = da.getSizes()
    hx = 1.0/(mx-1)
    hy = 1.0/(my-1)
    sx = 1.0/(hx*hx)
    sy = 1.0/(hy*hy)

    B.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False)
    B.zeroEntries()

    (i0, i1), (j0, j1) = da.getRanges()
    row = PETSc.Mat.Stencil()
    col = PETSc.Mat.Stencil()

    for i in range(j0, j1):
        for j in range(i0, i1):
            row.index = (i, j)
            row.field = 0
            if i == 0 or j == 0 or i == (my-1) or j == (mx-1):
                B.setValueStencil(row, row, 1.0)
            else:
                for index, value in [
                        ((i-1, j),   sx),
                        ((i+1, j),   sx),
                        ((i,   j-1), sy),
                        ((i,   j+1), sy),
                        ((i,   j),  -2*sx - 2*sy)]:
                    col.index = index
                    col.field = 0
                    B.setValueStencil(row, col, value)

    B.assemble()
    if A != B:
        A.assemble()  # assemble the operator too when it differs from the preconditioner

    return PETSc.Mat.Structure.SAME_NONZERO_PATTERN

def make_initial_solution(da, U, c):
    mx, my = da.getSizes()
    hx = 1.0/(mx-1)
    hy = 1.0/(my-1)
    (xs, xm), (ys, ym) = da.getRanges()

    u = U.getArray(readonly=0).reshape(8, 8, order='C')

    for j in range(ys, ym):
        y = j*hy
        for i in range(xs, xm):
            x = i*hx
            r = math.sqrt((x-0.5)**2 + (y-0.5)**2)
            if r < 0.125:
                u[i, j] = math.exp(c*r*r*r)
            else:
                u[i, j] = 0.0
    U.assemble()

nx = 8
ny = 8
da = PETSc.DMDA().create([nx, ny], stencil_type=PETSc.DA.StencilType.STAR,
                         stencil_width=1, dof=1)

da.setFromOptions()
da.setUp()

u = da.createGlobalVec()
f = u.duplicate()

c = -30.0

ts = PETSc.TS().create()
ts.setDM(da)
ts.setType(ts.Type.BEULER)

ts.setRHSFunction(RHS_func, f)

da.setMatType('aij')
J = da.createMat()
ts.setRHSJacobian(Jacobian_func, J, J)

ftime = 1.0
ts.setMaxTime(ftime)
ts.setExactFinalTime(PETSc.TS.ExactFinalTime.STEPOVER)

make_initial_solution(da, u, c)
dt = 0.01

ts.setTimeStep(dt)
ts.setFromOptions()
ts.solve(u)

ftime = ts.getSolveTime()
steps = ts.getStepNumber()

Cheers.

Nicola


On 02/04/2019 16:54, Zhang, Hong wrote:

Your Python implementation of the residual function and the Jacobian function
is intended for solving ODEs in the form x' = f(t,x), so you should use
ts.setRHSFunction() and ts.setRHSJacobian() instead of the implicit versions.

Hong


On Apr 2, 2019, at 3:18 AM, Nicola Creati via petsc-users 
 wrote:

Hello, I have run the two codes using: -pc_type lu -ts_monitor -snes_monitor 
-ksp_monitor -ts_max_steps 5. The codes start to diverge after the first time 
step:
Python run :

0 TS dt 0.01 time 0.
 0 SNES Function norm 2.327405179696e+02
   0 KSP Residual norm 1.939100379240e+00
   1 KSP Residual norm 1.013760235597e-15
 1 SNES Function norm 9.538686413612e-14
1 TS dt 0.01 time 0.01
 0 SNES Function norm 8.565210784140e-14
   0 KSP Residual norm 1.266678408555e-15
   1 KSP Residual norm 2.663404219322e-31
 1 SNES Function norm 5.528402779439e-29
2 TS dt 0.01 time 0.02
 0 SNES Function norm 5.829656072531e-29
   0 KSP Residual norm 3.805606380832e-31
   1 KSP Residual norm 3.283104975114e-46
 1 SNES Function norm 1.867771204856e-44
3 TS dt 0.01 time 0.03
 0 SNES Function norm 1.644541571708e-44
   0 KSP Residual norm 1.969573456719e-46
   1 KSP Residual norm 1.297976290541e-61
 1 SNES Function norm 1.780215821230e-59
4 TS dt 0.01 time 0.04
 0 SNES Function norm 1.700894451833e-59
5 TS dt 0.01 time 0.05

ex13.c run:

0 TS dt 0.01 time 0.
 0 SNES Function norm 

Re: [petsc-users] Unable to read in values thru namelist in Fortran after using PETSc 64bit in linux

2019-04-03 Thread Smith, Barry F. via petsc-users

   Based on the data below I am guessing you are passing PetscInt arguments
for all the arguments to MPI_ALLGATHERV. This won't work if PetscInt variables
are 64 bits. Let's look at the arguments from
https://www.mpich.org/static/docs/v3.1.x/www3/MPI_Allgather.html

int MPI_Allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype,
  void *recvbuf, int recvcount, MPI_Datatype recvtype,
  MPI_Comm comm)

All the arguments labeled void* can be passed PetscInt-declared arguments, but
the ones labeled int must be passed regular 32-bit integers (to help our own
development we label all these variables PetscMPIInt in PETSc).
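
In C terms, a minimal sketch of a correctly typed call (hypothetical buffers;
the C datatype for PetscInt data is MPIU_INT, whose Fortran counterpart is the
MPIU_INTEGER used in the calls below):

  PetscInt    *senddata,*recvdata;            /* void* arguments: PetscInt is fine */
  PetscMPIInt sendcount,*recvcounts,*displs;  /* int arguments: must be 32-bit     */
  MPI_Allgatherv(senddata,sendcount,MPIU_INT,
                 recvdata,recvcounts,displs,MPIU_INT,MPI_COMM_WORLD);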

   Good luck. If this does not resolve your problem, please send us a small
standalone example we can run that reproduces it.

   Barry




> On Apr 3, 2019, at 12:48 AM, TAY wee-beng  wrote:
> 
> Hi,
> 
> I just encounter a mpi_allgatherv problem when using 64bit PETSc:
> 
> call 
> MPI_ALLGATHERV(tmp_mpi_data,counter,MPIU_INTEGER,tmp_mpi_data2,counter_global,idisp,MPIU_INTEGER,MPI_COMM_WORLD,ierr)
> 
> The error is:
> 
> Fatal error in PMPI_Allgatherv: Message truncated, error stack:
> PMPI_Allgatherv(1452).: MPI_Allgatherv(sbuf=0x75cfdc0, scount=1119, 
> dtype=0x4c000831, rbuf=0x75d8600, rcounts=0x7ffdb880, 
> displs=0x7ffdb860, dtype=0x4c000831, MPI_COMM_WORLD) failed
> MPIR_Allgatherv_impl(1013): fail failed
> MPIR_Allgatherv(967)..: fail failed
> 
> MPIR_Allgatherv_intra(222): fail failed
> MPIR_Localcopy(107)...: Message truncated; 8952 bytes received but buffer 
> size is 8864
> 
> The variables are all defined as PetscInt. The strange thing is that I did
> 2-3 MPI_ALLGATHERV calls which are exactly the same, but only the last one
> has a problem. Should I change all these variables to PetscMPIInt?
> 
> Also, can I change all PetscInt in the code to PetscMPIInt, except if it's 
> labeled void *?
> 
> 
> Thank you very much.
> 
> Yours sincerely,
> 
> 
> TAY Wee-Beng (Zheng Weiming) 郑伟明
> Personal research webpage: http://tayweebeng.wixsite.com/website
> Youtube research showcase: 
> https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
> linkedin: www.linkedin.com/in/tay-weebeng
> 
> 
> On 21/9/2018 2:57 AM, Smith, Barry F. wrote:
>>Yes, you need to go through your code and check each MPI call and make 
>> sure you use PetscMPIInt for integer arguments and PetscInt for the void* 
>> arguments and also make sure that the data type you use in the MPI calls 
>> (when communicating PetscInt) is MPIU_INT.
>> 
>> You should not need a fancy debugger to find out the crash point. Just a 
>> basic debugger like gdb, lldb, or dbx will
>> 
>>Barry
>> 
>> 
>>> On Sep 20, 2018, at 8:57 AM, TAY wee-beng  wrote:
>>> 
>>> Hi,
>>> 
>>> Sorry, I'm still a bit confused. My 64-bit code still doesn't work once I
>>> use more than 1 proc. It just aborts at some point. I've been trying to use
>>> the ARM Forge MPI debugging tool to find the error, but it's a bit
>>> difficult to back trace.
>>> 
>>> So I should carefully inspect each mpi subroutine or function, is that 
>>> correct?
>>> 
>>> If it's INT, then I should use PetscMPIInt. If it's labeled void *, I 
>>> should use PetscInt. Is that so?
>>> 
>>> Thank you very much
>>> 
>>> Yours sincerely,
>>> 
>>> 
>>> TAY Wee-Beng 郑伟明 (Zheng Weiming)
>>> Personal research webpage: http://tayweebeng.wixsite.com/website
>>> Youtube research showcase: 
>>> https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
>>> linkedin: www.linkedin.com/in/tay-weebeng
>>> 
>>> 
>>> On 19/9/2018 12:49 AM, Smith, Barry F. wrote:
PetscMPIInt (or integer) is for all lengths passed to MPI functions;
 simply look at the prototypes for the MPI function you care about and it
 will tell you which arguments are integer.  The DATA you are passing into
 the MPI arrays (which are labeled void * in the manual pages) should be
 PetscInt.
 
 
 
> On Sep 18, 2018, at 1:39 AM, TAY wee-beng  wrote:
> 
> Hi,
> 
> In that case, does it apply to all MPI subroutines such as MPI_ALLGATHER?
> 
> In other words, must I assign local_array_length etc as PetscMPIInt?
> 
> call 
> MPI_ALLGATHER(local_array_length,1,MPIU_INTEGER,array_length,1,MPIU_INTEGER,MPI_COMM_WORLD,ierr)
> 
> Or is it ok to change all integers from PetscInt to PetscMPIInt?
> 
> With the exception of ierr - PetscErrorCode
> 
> 
> Thank you very much.
> 
> Yours sincerely,
> 
> 
> TAY Wee-Beng (Zheng Weiming) 郑伟明
> Personal research webpage: http://tayweebeng.wixsite.com/website
> Youtube research showcase: 
> https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA