Re: [petsc-dev] Is master broken?

2019-07-31 Thread Smith, Barry F. via petsc-dev
It is generated automatically and put in arch-linux2-c-debug/include/petscpkg_version.h. This include file is included at the top of the "bad" source file that crashes, so in theory everything is in order. Check that arch-linux2-c-debug/include/petscpkg_version.h contains PETSC_PKG_CUDA_VERSION_GE.
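For reference, configure-generated version macros of this kind commonly reduce a version triple to a single integer so the comparison can be done inside #if. A minimal sketch of what petscpkg_version.h plausibly contains follows; the macro names are taken from the thread, but the exact encoding is an assumption, since the real file is written by PETSc's configure:

    /* Sketch of a configure-generated version header; the encoding
       (major*10000 + minor*100 + subminor) is an assumption. */
    #define PETSC_PKG_CUDA_VERSION_MAJOR    10
    #define PETSC_PKG_CUDA_VERSION_MINOR     1
    #define PETSC_PKG_CUDA_VERSION_SUBMINOR  0

    #define PETSC_PKG_CUDA_VERSION_GE(MAJOR, MINOR, SUBMINOR)   \
      ((PETSC_PKG_CUDA_VERSION_MAJOR * 10000 +                  \
        PETSC_PKG_CUDA_VERSION_MINOR * 100 +                    \
        PETSC_PKG_CUDA_VERSION_SUBMINOR) >=                     \
       ((MAJOR) * 10000 + (MINOR) * 100 + (SUBMINOR)))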

[petsc-dev] Is master broken?

2019-07-31 Thread Mark Adams via petsc-dev
I am seeing this when I pull master into my branch: "/autofs/nccs-svm1_home1/adams/petsc/src/mat/impls/dense/seq/cuda/densecuda.cu", line 243: error: function call is not allowed in a constant expression #if PETSC_PKG_CUDA_VERSION_GE(10,1,0) and I see that this macro does
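That diagnostic is what EDG-based front ends such as nvcc emit when a function-like macro used inside #if is not defined: the preprocessor then sees a bare identifier followed by parentheses instead of a macro invocation. One quick way to confirm whether the generated header is actually being picked up, offered here as a sketch rather than anything from the thread:

    /* Sketch: verify the generated header defines the macro before use.
       Assumes arch-linux2-c-debug/include is on the include search path. */
    #include <petscpkg_version.h>

    #if !defined(PETSC_PKG_CUDA_VERSION_GE)
    #  error "petscpkg_version.h does not define PETSC_PKG_CUDA_VERSION_GE"
    #endif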

Re: [petsc-dev] DMDAGlobalToNatural errors with Ubuntu:latest; gcc 7 & Open MPI 2.1.1

2019-07-31 Thread Fabian.Jakub via petsc-dev
Awesome, many thanks for your efforts! On 7/31/19 9:17 PM, Zhang, Junchao wrote: > Hi, Fabian, I found it is an Open MPI bug w.r.t. self-to-self MPI_Send/Recv using MPI_ANY_SOURCE for message matching. Open MPI does not put the correct value in the recv buffer. I have a workaround

Re: [petsc-dev] DMDAGlobalToNatural errors with Ubuntu:latest; gcc 7 & Open MPI 2.1.1

2019-07-31 Thread Zhang, Junchao via petsc-dev
Hi, Fabian, I found it is an Open MPI bug w.r.t. self-to-self MPI_Send/Recv using MPI_ANY_SOURCE for message matching. Open MPI does not put the correct value in the recv buffer. I have a workaround
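A minimal reproducer for this class of bug would look roughly like the following. This is a sketch of the pattern described (a rank sending to itself, with the receive matched via MPI_ANY_SOURCE), not Junchao's actual test case:

    /* Sketch: each rank posts a nonblocking send to itself, then receives
       with MPI_ANY_SOURCE; the report says the affected Open MPI version
       can deliver wrong data for this self-to-self pattern. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
      int rank, sendval, recvval = -1;
      MPI_Request req;

      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      sendval = 42 + rank;

      MPI_Isend(&sendval, 1, MPI_INT, rank, 0, MPI_COMM_WORLD, &req);
      MPI_Recv(&recvval, 1, MPI_INT, MPI_ANY_SOURCE, 0, MPI_COMM_WORLD,
               MPI_STATUS_IGNORE);
      MPI_Wait(&req, MPI_STATUS_IGNORE);

      if (recvval != sendval)
        printf("[%d] wrong data: expected %d, got %d\n", rank, sendval, recvval);

      MPI_Finalize();
      return 0;
    }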

Re: [petsc-dev] [petsc-users] MatMultTranspose memory usage

2019-07-31 Thread Jed Brown via petsc-dev
https://bitbucket.org/petsc/petsc/issues/333/use-64-bit-indices-for-row-offsets-in "Smith, Barry F." writes: > Make an issue >> On Jul 30, 2019, at 7:00 PM, Jed Brown wrote: >> "Smith, Barry F. via petsc-users" writes: >>> The reason this worked for 4 processes is that the

Re: [petsc-dev] [petsc-users] MatMultTranspose memory usage

2019-07-31 Thread Smith, Barry F. via petsc-dev
Make an issue > On Jul 30, 2019, at 7:00 PM, Jed Brown wrote: > > "Smith, Barry F. via petsc-users" writes: > >> The reason this worked for 4 processes is that the largest count in that >> case was roughly 6,653,750,976/4 which does fit into an int. PETSc only >> needs to know the