Re: [petsc-users] With-batch (new) flags

2019-05-20 Thread Smith, Barry F. via petsc-users
Yes, this is totally my fault. By removing the help message I made configure treat the argument as a string, hence '0' was true and you got the error message. For fblaslapack one should use --known-64-bit-blas-indices=0 just as you did. I have pushed a fix to master. What kind of system is
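For reference, a batch configure invocation along the lines being discussed (other options and paths are illustrative; --with-batch is what makes the --known-* flags necessary, since configure cannot run test executables in that mode):

  ./configure --with-batch --download-fblaslapack --known-64-bit-blas-indices=0

fblaslapack uses 32-bit integer indices, so 0 is the right value for it.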

Re: [petsc-users] Question about TSComputeRHSJacobianConstant

2019-05-20 Thread Zhang, Hong via petsc-users
Sajid, I have also tested the simpler problem you provided. The branch hongzh/fix-computejacobian gives exactly the same numerical results as the master branch does, but runs much faster. So the solver seems to work correctly. To rule out possible compiler issues, you might want to try a
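For context, the usual pattern for a problem where TSComputeRHSJacobianConstant applies -- a linear ODE u_t = A u with constant A -- looks roughly like this sketch (error checking omitted; the assembled matrix A is assumed to come from elsewhere):

  TS  ts;
  Mat A;  /* assumed assembled elsewhere */
  TSCreate(PETSC_COMM_WORLD,&ts);
  TSSetProblemType(ts,TS_LINEAR);
  TSSetRHSFunction(ts,NULL,TSComputeRHSFunctionLinear,NULL);
  /* Telling TS the Jacobian is constant lets it reuse the matrix
     (and any factorization) across all time steps. */
  TSSetRHSJacobian(ts,A,A,TSComputeRHSJacobianConstant,NULL);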

Re: [petsc-users] DMNetwork in petsc4py

2019-05-20 Thread Smith, Barry F. via petsc-users
Justin, That would be great. No one is working on it that I know of. Barry > On May 20, 2019, at 1:48 PM, Justin Chang via petsc-users > wrote: > > Hi all, > > Is there any current effort or plan to make the DMNetwork calls available in > petsc4py? I don't see anything DMNetwork

Re: [petsc-users] With-batch (new) flags

2019-05-20 Thread Balay, Satish via petsc-users
I'm not yet sure what the correct fix is - but the following change should get this going.. diff --git a/config/BuildSystem/config/packages/BlasLapack.py b/config/BuildSystem/config/packages/BlasLapack.py index e0310da4b0..7355f1a369 100644 --- a/config/BuildSystem/config/packages/BlasLapack.py

Re: [petsc-users] With-batch (new) flags

2019-05-20 Thread Mark Adams via petsc-users
On Mon, May 20, 2019 at 3:55 PM Balay, Satish wrote: > for ex: the ILP64 version of MKL is --known-64-bit-blas-indices=1 while LP64 MKL > is --known-64-bit-blas-indices=0 > > The default BLAS we normally use is --known-64-bit-blas-indices=0 [it doesn't > use 64-bit indices] > Humm, that is what Dylan (in

Re: [petsc-users] With-batch (new) flags

2019-05-20 Thread Balay, Satish via petsc-users
for ex: the ILP64 version of MKL is --known-64-bit-blas-indices=1 while LP64 MKL is --known-64-bit-blas-indices=0. The default BLAS we normally use is --known-64-bit-blas-indices=0 [it doesn't use 64-bit indices] Satish On Mon, 20 May 2019, Mark Adams via petsc-users wrote: > We are getting this failure.
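Concretely, the pairing would look something like this (the link lines are illustrative, not complete):

  # ILP64 MKL (64-bit BLAS/LAPACK integers)
  ./configure --with-batch --with-blaslapack-lib="-lmkl_intel_ilp64 ..." --known-64-bit-blas-indices=1

  # LP64 MKL, or a stock 32-bit-integer BLAS (the common case)
  ./configure --with-batch --with-blaslapack-lib="-lmkl_intel_lp64 ..." --known-64-bit-blas-indices=0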

Re: [petsc-users] Creating a DMNetwork from a DMPlex

2019-05-20 Thread Swarnava Ghosh via petsc-users
Hi Barry and Matt, Maybe try building by hand a DMNetwork using a hand-drawn mesh with just a few vertices and edges and see if what you want to do makes sense > Okay, will try to do that. Do you have any DMNetwork example which I could follow? I think DMNetwork is not buying you anything

Re: [petsc-users] problem with generating a simplex mesh

2019-05-20 Thread Stefano Zampini via petsc-users
Matt, The code is actually for 2D. > On May 20, 2019, at 12:54 PM, Matthew Knepley via petsc-users > wrote: > > On Sun, May 19, 2019 at 9:22 AM 陳鳴諭 via petsc-users > wrote: > I have a problem with generating a simplex mesh. > I do as the description in

Re: [petsc-users] problem with generating a simplex mesh

2019-05-20 Thread Matthew Knepley via petsc-users
On Sun, May 19, 2019 at 9:22 AM 陳鳴諭 via petsc-users wrote: > I have a problem with generating a simplex mesh. > I do as the description in DMPlexCreateBoxMesh says, but still get an error. > Stefano is right that you will need a mesh generator for a simplex mesh. However, you are asking for a 1D
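For anyone reproducing this, a minimal sketch of requesting a 2D simplex box mesh (signature as of the PETSc 3.11 era; it has changed across releases, and interior simplex meshing is delegated to an external generator such as Triangle, e.g. configure with --download-triangle):

  DM       dm;
  PetscInt faces[2] = {4,4};  /* cells per direction (illustrative) */
  DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2 /* dim */, PETSC_TRUE /* simplex */,
                      faces, NULL /* lower */, NULL /* upper */,
                      NULL /* periodicity */, PETSC_TRUE /* interpolate */,
                      &dm);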

Re: [petsc-users] Calling LAPACK routines from PETSc

2019-05-20 Thread Dave Lee via petsc-users
Thanks Barry, I found some helpful examples on the Intel LAPACK site - moral of the story: using C ordering for the input matrix, but transposed output matrices, leads to a consistent solution. Cheers, Dave. On Mon, May 20, 2019 at 6:07 PM Smith, Barry F. wrote: > > > > On May 20, 2019, at 2:28
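The trick Dave alludes to works because LAPACK assumes column-major (Fortran) storage: a C-ordered m x n array A is, bit for bit, the column-major n x m matrix A^T. For a factorization such as the SVD this costs nothing, since A = U*S*V^T implies A^T = V*S*U^T -- pass the C-ordered buffer unchanged with the dimensions swapped, and read LAPACK's "U" as V and its "V^T" as U^T. (This is the standard row-major idiom, not necessarily the exact code Dave wrote.)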

Re: [petsc-users] Calling LAPACK routines from PETSc

2019-05-20 Thread Smith, Barry F. via petsc-users
> On May 20, 2019, at 2:28 AM, Dave Lee wrote: > > Thanks Jed and Barry, > > So, just to confirm, > > -- From the KSP_GMRES structure, if I call *HH(a,b), that will return the row > a, column b entry of the Hessenberg matrix (while the back-end > *hh_origin array is ordered using

Re: [petsc-users] Creating a DMNetwork from a DMPlex

2019-05-20 Thread Smith, Barry F. via petsc-users
Maybe try building by hand a DMNetwork using a hand-drawn mesh with just a few vertices and edges and see if what you want to do makes sense > On May 20, 2019, at 2:04 AM, Swarnava Ghosh wrote: > > Hi Barry, > > Thank you for your email. My planned discretization is based on the fact
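A rough sketch of what "building by hand" might look like, for a network of two vertices joined by one edge (this follows the older pre-subnetwork DMNetwork API; DMNetworkSetSizes and DMNetworkSetEdgeList have changed signatures across PETSc releases, so check the docs for your version):

  DM       network;
  PetscInt edgelist[2] = {0, 1};  /* edge 0 connects vertex 0 and vertex 1 */
  DMNetworkCreate(PETSC_COMM_WORLD,&network);
  DMNetworkSetSizes(network,2,1,PETSC_DETERMINE,PETSC_DETERMINE);
  DMNetworkSetEdgeList(network,edgelist);
  DMNetworkLayoutSetUp(network);
  /* DMNetworkRegisterComponent/DMNetworkAddComponent would attach data
     to vertices and edges before the final DMSetUp(network); */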

Re: [petsc-users] Calling LAPACK routines from PETSc

2019-05-20 Thread Smith, Barry F. via petsc-users
The little work arrays in GMRES tend to be stored in Fortran ordering; there is no C-style p[][] indexing into such arrays. Thus the arrays can safely be sent to LAPACK. The only trick is knowing the two dimensions and, as Jed says, the "leading dimension" parameter. He gave you a place to
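To make "Fortran ordering plus leading dimension" concrete: a column-major m x n array a with leading dimension lda stores entry (i,j) at a[j*lda + i]. A sketch (the IDX macro here is illustrative; PETSc's own GMRES macros such as HH resolve to this same pattern over the hh_origin buffer -- see src/ksp/ksp/impls/gmres/gmresimpl.h for the authoritative definitions):

  /* entry (i,j) of a column-major array with leading dimension lda */
  #define IDX(a,i,j,lda) ((a)[(j)*(lda) + (i)])

Because the buffer already has this layout, it can be handed to LAPACK directly, with lda passed as the routine's leading-dimension argument.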

Re: [petsc-users] Calling LAPACK routines from PETSc

2019-05-20 Thread Jed Brown via petsc-users
Dave Lee via petsc-users writes: > Hi Petsc, > > I'm attempting to implement a "hookstep" for the SNES trust region solver. > Essentially what I'm trying to do is replace the solution of the least > squares problem at the end of each GMRES solve with a modified solution > with a norm that is
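For context, the standard hookstep formulation (as in Viswanath's Newton-hookstep work; not necessarily exactly what Dave is implementing): instead of GMRES's unconstrained least-squares problem min_y || beta*e_1 - Hbar*y ||, solve it subject to the trust-region constraint ||y|| <= delta. With the SVD Hbar = U*S*V^T and b = U^T*(beta*e_1), the constrained minimizer is

  y(mu) = sum_i  s_i * b_i / (s_i^2 + mu) * v_i

where mu >= 0 is increased from zero until ||y(mu)|| <= delta; mu = 0 recovers the ordinary GMRES solution.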