[petsc-dev] config/PETSc/FEM.py

2012-06-05 Thread Barry Smith
Matt, why is config/PETSc/FEM.py where it is? It isn't configure stuff, is it? Can it be removed or moved somewhere proper? What about ./config/BuildSystem/install.old? Can that be removed? Thanks, Barry

[petsc-dev] misleading error message in VecWAXPY

2012-06-05 Thread Jed Brown

[petsc-dev] misleading error message in VecWAXPY

2012-06-05 Thread Barry Smith
Suggestions for improvements for the checks?

  #define PetscValidLogicalCollectiveScalar(a,b,c) \
    do { \
      PetscErrorCode _7_ierr; \
      PetscReal b1[2],b2[2]; \
      ...
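The snippet stops short of the comparison that actually fires: the macro goes on to reduce -b and b with MPI_MAX over the object's communicator and errors if the two results are not negatives of each other (from memory of the PETSc headers; check petscsys.h). Since NaN compares unequal to everything, including itself, that test fires even when every process holds the same NaN value. A self-contained sketch of just that comparison, independent of PETSc:

  #include <math.h>
  #include <stdio.h>

  /* Sketch: the collective check compares the max of b and the max of -b
     across processes and requires them to be negatives of each other.
     On one process the reduction is the identity, so the check boils
     down to the comparison below. */
  int main(void)
  {
    double vals[2] = {2.0, NAN};
    for (int i = 0; i < 2; i++) {
      double b  = vals[i];
      double b1 = -b;   /* stands in for MPI_MAX of -b over one rank */
      double b2 = b;    /* stands in for MPI_MAX of  b over one rank */
      /* NaN != NaN is true, so the check reports a (spurious) mismatch. */
      printf("b = %g: check %s\n", b, (-b1 != b2) ? "fires" : "passes");
    }
    return 0;
  }

This is why a NaN argument produces the "must be same on all processes" message even on a single process.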

[petsc-dev] API changes in MatIS

2012-06-05 Thread Stefano Zampini
... the matrix on preselected vertices during MatAssemblyBegin/End? Note that this will imply that standard Neumann-Neumann methods will not work (they need the unassembled matrix to solve for the local Schur complements). -- Stefano
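For readers following the MatIS discussion: a MATIS matrix stores each process's contribution unassembled, and that local block is what Neumann-Neumann methods factor to form their local Schur complements. A minimal sketch of reaching the local block (assuming A is already a MATIS matrix; the creation call varies between PETSc versions, so it is omitted here):

  #include <petscmat.h>

  /* Sketch: extract and view the unassembled local block of a MATIS
     matrix.  Neumann-Neumann methods factor this block, which is why
     they need the matrix kept unassembled. */
  PetscErrorCode DumpLocalBlock(Mat A)
  {
    Mat            Aloc;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = MatISGetLocalMat(A, &Aloc);CHKERRQ(ierr);
    ierr = MatView(Aloc, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }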

[petsc-dev] misleading error message in VecWAXPY

2012-06-05 Thread Blaise Bourdin
Hi, It looks like when VecWAXPY is called with alpha = NaN, PetscValidLogicalCollectiveScalar causes the message "Scalar value must be same on all processes, argument # 2" to be printed. This is a bit misleading, and confusing when running on only 1 processor. Is this something worth fixing? Blaise
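A minimal reproduction sketch of what Blaise describes (my construction, not code from the thread; assumes a standard PETSc build and a single-process run):

  #include <petscvec.h>
  #include <math.h>

  int main(int argc, char **argv)
  {
    Vec            x, y, w;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = VecCreateSeq(PETSC_COMM_SELF, 4, &x);CHKERRQ(ierr);
    ierr = VecDuplicate(x, &y);CHKERRQ(ierr);
    ierr = VecDuplicate(x, &w);CHKERRQ(ierr);
    ierr = VecSet(x, 1.0);CHKERRQ(ierr);
    ierr = VecSet(y, 2.0);CHKERRQ(ierr);
    /* The NaN alpha trips PetscValidLogicalCollectiveScalar here, giving
       the "Scalar value must be same on all processes" message even on
       one process. */
    ierr = VecWAXPY(w, (PetscScalar)NAN, x, y);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&y);CHKERRQ(ierr);
    ierr = VecDestroy(&w);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }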

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Mark F. Adams
>> ... so I should have that info tomorrow.
> Can you instruct one process (or a few) to dump core?

I suppose so. How does one do that?
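One common answer, sketched here rather than taken from the thread: enable core files in the job environment (e.g. an unlimited core-file size limit) and have only the chosen ranks call abort(). The helper below is illustrative:

  #include <mpi.h>
  #include <stdlib.h>

  /* Illustrative helper (not from the thread): make only a few ranks
     dump core.  abort() raises SIGABRT, which produces a core file
     whenever the core-size limit allows it. */
  static void DumpCoreOnRanks(MPI_Comm comm, int max_rank)
  {
    int rank;
    MPI_Comm_rank(comm, &rank);
    if (rank <= max_rank) abort();   /* ranks 0..max_rank dump core */
  }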

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Mark F. Adams
I added printfs to this and submitted the job so I should have that info tomorrow.

> I can keep diving in, but anyone have any ideas on this?
> Mark

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Jed Brown
... the Cray.

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Mark F. Adams
I've got a consistent segv on the Cray at NERSC with 64K cores. No problems with smaller jobs. It seems to happen in here:

  /* Done after init due to a bug in MPICH-GM? */
  ierr = PetscErrorPrintfInitialize();CHKERRQ(ierr);

I can keep diving in, but anyone have any ideas on this?

Mark

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Jed Brown
Can you instruct one process (or a few) to dump core?

[petsc-dev] segv on Cray with 64K processes

2012-06-05 Thread Jed Brown

[petsc-dev] [petsc-users] use mpic++ instead of mpicc

2012-06-05 Thread Jed Brown

[petsc-dev] API changes in MatIS

2012-06-05 Thread Jed Brown
> Note that this will imply that standard Neumann-Neumann methods will not work (they need the unassembled matrix to solve for the local Schur complements).

I'm not too concerned about that, since I consider the classic N-N and original FETI methods to be rather special-purpose compared to the newer generation. I would like to limit the number of copies of a matrix to control peak memory usage.