[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

2018-04-04 Thread Rob Falgout hypre Tracker
Rob Falgout added the comment: 1. Yes. AMG coarsens recursively based on matrix entries, so you can't know a priori which ranks will be active by the time you get to the coarsest grid. 2. If you use the other direct solver option, it will be a little less performant. Ulrike can probably com
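
A minimal sketch of point 1, assuming "active" simply means "still owns rows after coarsening" (this is not hypre's code; nlocal_coarse is a hypothetical placeholder): each level has to build its own subcommunicator at setup time, because which ranks drop out is only known once the coarse matrix exists.

  #include <mpi.h>

  /* Build a communicator containing only the ranks that still own rows of
     the coarse-level matrix; inactive ranks get MPI_COMM_NULL back. */
  MPI_Comm active_subcomm(MPI_Comm parent, int nlocal_coarse)
  {
    MPI_Comm sub;
    int      rank, color = (nlocal_coarse > 0) ? 0 : MPI_UNDEFINED;

    MPI_Comm_rank(parent, &rank);
    MPI_Comm_split(parent, color, rank, &sub);  /* collective over parent */
    return sub;
  }

(MPI_Comm_split is used here only for brevity; the MPI_Comm_create calls discussed in this thread achieve the same effect, and either way one communicator context is consumed per level.)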

Re: [petsc-dev] upcoming release and testing [v2]

2018-04-04 Thread Satish Balay
On Wed, 4 Apr 2018, Satish Balay wrote: > I have the following branches in queue for master [via next-tmp - this build > will start now] > > origin/dalcinl/fix-scatter-destroy > origin/dalcinl/fix-vecscatter-initpkg > origin/knepley/fix-cylinder-mesh > origin/rmills/feature-aijmkl-add-ma

Re: [petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

2018-04-04 Thread Jed Brown
1. Are you saying that the ranks involved in the coarse solve depends on the matrix entries? 2. Okay, but this might degrade performance, right? 4. I think you are right that *if* all sends and receives have been posted (may be hard to guarantee if the user is using threads) and MPI_ANY_SOURCE i
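
For context on why wildcard receives matter here, a simplified illustration of the usual dup-and-isolate pattern (not PETSc's or hypre's actual code; LibSolver is hypothetical): a library duplicates the user's communicator so that messages it sends internally can never be matched by a user receive posted with MPI_ANY_SOURCE on the original communicator.

  #include <mpi.h>

  typedef struct { MPI_Comm comm; } LibSolver;   /* hypothetical library object */

  int lib_solver_create(MPI_Comm user_comm, LibSolver *s)
  {
    /* Internal traffic stays on the dup, isolated from user_comm. */
    return MPI_Comm_dup(user_comm, &s->comm);
  }

  int lib_solver_destroy(LibSolver *s)
  {
    return MPI_Comm_free(&s->comm);
  }

MPI matching is per-communicator, so traffic on the dup can never satisfy a receive posted on user_comm, wildcards or not.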

Re: [petsc-dev] Nullspace with right preconditioning

2018-04-04 Thread Jed Brown
Matthew Knepley writes: > On Wed, Apr 4, 2018 at 3:23 PM, Smith, Barry F. wrote: > >> >> It may not be broken; it may just be misunderstood or an incorrect approach. >> >> You'll need to explain why and where exactly you think it is broken. > > > I am solving Stokes. I will push the example for

Re: [petsc-dev] Nullspace with right preconditioning

2018-04-04 Thread Matthew Knepley
On Wed, Apr 4, 2018 at 3:23 PM, Smith, Barry F. wrote: > > It may not be broken; it may just be misunderstood or an incorrect approach. > > You'll need to explain why and where exactly you think it is broken. I am solving Stokes. I will push the example for you to run tonight. Basically, I put
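
For readers following along, a hedged sketch of the kind of setup being questioned (not Matt's actual example, which he offers to push): attach a constant null space to the operator and request right preconditioning; whether KSP handles the null space correctly on the right-preconditioned side is the question of this thread. Error checking is omitted for brevity.

  #include <petscksp.h>

  PetscErrorCode solve_with_nullspace(Mat A, Vec b, Vec x)
  {
    KSP          ksp;
    MatNullSpace nsp;

    MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_TRUE, 0, NULL, &nsp);
    MatSetNullSpace(A, nsp);            /* KSP is expected to project accordingly */
    MatNullSpaceDestroy(&nsp);          /* Mat keeps its own reference */

    KSPCreate(PetscObjectComm((PetscObject)A), &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPGMRES);
    KSPSetPCSide(ksp, PC_RIGHT);        /* same as -ksp_pc_side right */
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);
    KSPDestroy(&ksp);
    return 0;
  }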

Re: [petsc-dev] Nullspace with right preconditioning

2018-04-04 Thread Smith, Barry F.
It may not be broken; it may just be misunderstood or an incorrect approach. You'll need to explain why and where exactly you think it is broken. Barry > On Apr 4, 2018, at 1:13 PM, Matthew Knepley wrote: > > This is broken. Did we know this was broken? > > Thanks, > > Matt >

[petsc-dev] Nullspace with right preconditioning

2018-04-04 Thread Matthew Knepley
This is broken. Did we know this was broken? Thanks, Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener https://www.cse.buffalo.edu/~knepley/

[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

2018-04-04 Thread Rob Falgout hypre Tracker
Rob Falgout added the comment: FYI, I meant to say "corresponding user recv requests" in 4 below. -Rob hypre Issue Tracker

Re: [petsc-dev] upcoming release and testing [v2]

2018-04-04 Thread Satish Balay
FYI - I have the release-related strings in https://bitbucket.org/petsc/petsc/branch/balay/release-3.9 This change should be similar to what was done for the 3.8 release https://bitbucket.org/petsc/petsc/commits/0e50f9e530a7b78427514d3e384f6941d4a9cc62?at=v3.8 For now - I'm using tomorrow's date [b

[petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

2018-04-04 Thread Rob Falgout hypre Tracker
Rob Falgout added the comment: Hi All, Some comments and questions: 1. The Comm_create is used to create a subcommunicator that involves only the currently active MPI tasks so that the Allgather() will happen only over that subset. I don't think we can create this once, attach it to a paren
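
A sketch of the pattern described in point 1, under stated assumptions (active_ranks, n_active, and the int payload are hypothetical; this is not hypre's code): create a subcommunicator over the currently active tasks and run the Allgather on it. Each such call consumes a communicator context, which is where the limit in this issue's title comes from.

  #include <mpi.h>

  int allgather_over_active(MPI_Comm parent, const int *active_ranks, int n_active,
                            int my_value, int *gathered /* length n_active */)
  {
    MPI_Group world_grp, active_grp;
    MPI_Comm  subcomm;

    MPI_Comm_group(parent, &world_grp);
    MPI_Group_incl(world_grp, n_active, active_ranks, &active_grp);
    MPI_Comm_create(parent, active_grp, &subcomm);   /* collective over parent */

    if (subcomm != MPI_COMM_NULL) {                  /* only active ranks take part */
      MPI_Allgather(&my_value, 1, MPI_INT, gathered, 1, MPI_INT, subcomm);
      MPI_Comm_free(&subcomm);
    }
    MPI_Group_free(&active_grp);
    MPI_Group_free(&world_grp);
    return 0;
  }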

[petsc-dev] Add PetscFree1() ?

2018-04-04 Thread Lisandro Dalcin
Jed, just for the sake of consistency, don't we need PetscFree1()? Also, PetscFree() is a macro and evaluates its first arg twice; I just had to fix a bug in my code that was doing while (ctx->size > 0) PetscFree(ctx->arrayOfPointers[--ctx->size]); PetscFree(ctx->arrayOfPointers) I guess w
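
To make the double-evaluation hazard concrete, here is a simplified free-and-zero macro standing in for PetscFree() (this is not PETSc's exact definition), together with the buggy loop and a fixed version:

  #include <stdlib.h>

  #define FREE_AND_NULL(p) (free(p), (p) = NULL)    /* expands p twice */

  typedef struct { size_t size; void **arrayOfPointers; } Ctx;

  static void cleanup_buggy(Ctx *ctx)
  {
    /* The macro expands --ctx->size twice, so each pass decrements the
       counter by two: only every other entry is actually freed and, with an
       odd count, the index runs past the start of the array. */
    while (ctx->size > 0) FREE_AND_NULL(ctx->arrayOfPointers[--ctx->size]);
    FREE_AND_NULL(ctx->arrayOfPointers);
  }

  static void cleanup_fixed(Ctx *ctx)
  {
    while (ctx->size > 0) {
      ctx->size--;                                   /* side effect exactly once */
      FREE_AND_NULL(ctx->arrayOfPointers[ctx->size]);
    }
    FREE_AND_NULL(ctx->arrayOfPointers);
  }

The general rule the bug illustrates: never pass an expression with side effects to a macro that may expand its argument more than once.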

Re: [petsc-dev] FAS indentation

2018-04-04 Thread Satish Balay
On Wed, 4 Apr 2018, Dener, Alp wrote: > (I didn’t wanna push directly to next without approval). Just a note: we would never commit/push directly to next. All fixes should go to the corresponding feature branch - this way they will get to master when ready [and next is never directly merged

[petsc-dev] upcoming release and testing [v2]

2018-04-04 Thread Satish Balay
[starting a new thread] As of now master builds are clean! http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/04/03/master.html next is improved but still has issues that need fixing. http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2018/04/04/next.html most-likely these are related t

Re: [petsc-dev] [issue1595] Issues of limited number of MPI communicators when having many instances of hypre boomerAMG with Moose

2018-04-04 Thread Jed Brown
Yes, this is a real issue for MOOSE which sometimes has thousands of active single-field solvers. PETSc can limit the number of fine-level communicators by retaining the dup'd communicator so the same communicator can be passed to hypre for each solver, but cannot control the MPI_Comm_create fo
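
A sketch of the fine-level half of that, in the spirit of (but not identical to) PETSc's PetscCommDuplicate: dup the user's communicator once, cache the dup as an attribute on the user communicator, and hand the same inner communicator to every solver built on it. The per-level MPI_Comm_create calls inside BoomerAMG remain outside PETSc's control, which is the point above.

  #include <mpi.h>
  #include <stdlib.h>

  static int inner_keyval = MPI_KEYVAL_INVALID;

  /* Return a cached duplicate of 'user'; the first call on a given
     communicator pays for the MPI_Comm_dup, later calls reuse it.  A real
     implementation installs a delete callback that frees the dup and the
     heap cell when 'user' is destroyed; omitted here for brevity. */
  int get_inner_comm(MPI_Comm user, MPI_Comm *inner)
  {
    MPI_Comm *cached;
    int       found;

    if (inner_keyval == MPI_KEYVAL_INVALID)
      MPI_Comm_create_keyval(MPI_COMM_NULL_COPY_FN, MPI_COMM_NULL_DELETE_FN,
                             &inner_keyval, NULL);

    MPI_Comm_get_attr(user, inner_keyval, &cached, &found);
    if (!found) {
      cached = (MPI_Comm *)malloc(sizeof(*cached));
      MPI_Comm_dup(user, cached);                  /* one dup per user comm */
      MPI_Comm_set_attr(user, inner_keyval, cached);
    }
    *inner = *cached;                              /* shared by every solver */
    return 0;
  }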

[petsc-dev] Fwd: In March, you had 482 users visit your website (Google Analytics)

2018-04-04 Thread Matthew Knepley
Good month. Matt -- Forwarded message -- From: Google Analytics Date: Wed, Apr 4, 2018 at 5:21 AM Subject: In March, you had 482 users visit your website (Google Analytics) To: knep...@gmail.com Your Snapshot for March