Re: [petsc-users] matrix assembly

2014-11-16 Thread Barry Smith
Yes, a flush assembly moves all the values you have accumulated so far to the correct process, so if you set any off-process values there will be communication. But this also means that the final assembly will have correspondingly less communication. Barry > On Nov 17, 2014, at 12
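
An illustrative sketch of the pattern described above, assuming a 100 x 100 matrix built with MatSetValues(); the size and values are placeholders, not taken from the thread:

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat         A;
      PetscInt    i, rstart, rend;
      PetscScalar v = 1.0;

      PetscInitialize(&argc, &argv, NULL, NULL);
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100);
      MatSetFromOptions(A);
      MatSetUp(A);

      MatGetOwnershipRange(A, &rstart, &rend);
      for (i = rstart; i < rend; i++) {
        MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);
      }
      /* Flush assembly: any stashed off-process values are sent to their
         owning processes now, emptying the stash. */
      MatAssemblyBegin(A, MAT_FLUSH_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FLUSH_ASSEMBLY);

      /* ... set further values, possibly off-process ... */

      /* Final assembly: only values stashed since the flush still need to
         be communicated. */
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

      MatDestroy(&A);
      PetscFinalize();
      return 0;
    }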

[petsc-users] matrix assembly

2014-11-16 Thread Fande Kong
Hi all, If I do a matrix assembly with the flag MAT_FLUSH_ASSEMBLY, will it involve any communication across processors? Fande,

Re: [petsc-users] KSPSolve() gets slower when preconditioner or Cholesky factor is re-used with multiple RHS.

2014-11-16 Thread Matthew Knepley
On Sun, Nov 16, 2014 at 10:11 AM, Evan Um wrote: > Dear Matthew, > > Thanks for your comments. I will prepare a log summary. To activate a > separate log stage for each iteration (i.e. each RHS vector), I tried the > code below, but got an error. Could you give me a comment? Does > PetscLogView()

Re: [petsc-users] KSPSolve() gets slower when preconditioner or Cholesky factor is re-used with multiple RHS.

2014-11-16 Thread Evan Um
Dear Matthew, Thanks for your comments. I will prepare a log summary. To activate a separate log stage for each iteration (i.e. each RHS vector), I tried the code below, but got an error. Could you give me a comment? Does PetscLogView() allow users to use different output file names such that each
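
A sketch of one way to get a separate log stage per right-hand side, assuming repeated KSPSolve() calls with the same KSP and that logging has already been enabled; the stage names, the loop count nrhs, and the output file name "log_rhs.txt" are illustrative, not the code from this thread. PetscLogView() writes to a PetscViewer, so opening an ASCII file viewer gives a user-chosen output file name:

    #include <petscksp.h>

    PetscErrorCode solve_many(KSP ksp, Vec b, Vec x, PetscInt nrhs)
    {
      PetscLogStage stage;
      PetscViewer   viewer;
      char          name[64];
      PetscInt      i;

      for (i = 0; i < nrhs; i++) {
        /* One named stage per RHS so events are reported per solve */
        PetscSNPrintf(name, sizeof(name), "RHS %D", i);
        PetscLogStageRegister(name, &stage);
        PetscLogStagePush(stage);
        /* ... fill b for this right-hand side ... */
        KSPSolve(ksp, b, x);
        PetscLogStagePop();
      }

      /* Write the log to a file with a custom name via an ASCII viewer */
      PetscViewerASCIIOpen(PETSC_COMM_WORLD, "log_rhs.txt", &viewer);
      PetscLogView(viewer);
      PetscViewerDestroy(&viewer);
      return 0;
    }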

Re: [petsc-users] KSPSolve() gets slower when preconditioner or Cholesky factor is re-used with multiple RHS.

2014-11-16 Thread Matthew Knepley
On Sat, Nov 15, 2014 at 9:24 PM, Evan Um wrote: > Dear PETSc users, > > I would like to show you a performance issue when a Cholesky factor is > re-used as a direct solver or preconditioner many times with many > right-hand side vectors. Can anyone suggest a solution to this issue? > In advanc
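
For reference, a minimal sketch of reusing one Cholesky factorization for many right-hand sides: with KSPPREONLY and PCCHOLESKY, KSPSetUp() performs the (symbolic and numeric) factorization once and every later KSPSolve() does only the triangular solves. The loop count and solver choices are assumptions for illustration, not necessarily the poster's exact setup:

    #include <petscksp.h>

    PetscErrorCode direct_solve_many(Mat A, Vec b, Vec x, PetscInt nrhs)
    {
      KSP      ksp;
      PC       pc;
      PetscInt i;

      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, A, A);
      KSPSetType(ksp, KSPPREONLY);   /* apply the factorization as a direct solver */
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCCHOLESKY);
      KSPSetUp(ksp);                 /* factor once, up front */

      for (i = 0; i < nrhs; i++) {
        /* ... fill b for this right-hand side ... */
        KSPSolve(ksp, b, x);         /* forward/backward substitution only */
      }

      KSPDestroy(&ksp);
      return 0;
    }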