Re: [petsc-users] [EXT]Re: reuse Mumps factorization for multiple RHS

2020-08-05 Thread Anthony Paul Haas
...Matthew Knepley > *Sent:* Tuesday, August 4, 2020 7:29 PM > *To:* Anthony Paul Haas > *Cc:* petsc-users > *Subject:* Re: [petsc-users] reuse Mumps factorization for multiple RHS > > On Tue, Aug 4, 2020 at 7:57 PM Anthony Paul Haas wrote: > > Hello, > > When using Mumps to solve a linear system of equations (see below), can I reuse the factorization to solve for multiple RHS? ...

Re: [petsc-users] reuse Mumps factorization for multiple RHS

2020-08-04 Thread Zhang, Hong via petsc-users
See the case "-num_rhs" in petsc/src/ksp/ksp/tests/ex30.c. Hong From: petsc-users on behalf of Matthew Knepley Sent: Tuesday, August 4, 2020 7:29 PM To: Anthony Paul Haas Cc: petsc-users Subject: Re: [petsc-users] reuse Mumps factorization for multiple RHS ...

Re: [petsc-users] reuse Mumps factorization for multiple RHS

2020-08-04 Thread Matthew Knepley
On Tue, Aug 4, 2020 at 7:57 PM Anthony Paul Haas wrote: > Hello, > > When using Mumps to solve a linear system of equations (see below), can I reuse the factorization to solve for multiple RHS, i.e., can I use KSPSolve multiple times while only building a different RHS in between the calls to KSPSolve? ...

Re: [petsc-users] reuse Mumps factorization for multiple RHS

2020-08-04 Thread Barry Smith
Yes, it automatically uses the same factorization. > On Aug 4, 2020, at 6:55 PM, Anthony Paul Haas wrote: > > Hello, > > When using Mumps to solve a linear system of equations (see below), can I reuse the factorization to solve for multiple RHS, i.e., can I use KSPSolve multiple times while only building a different RHS in between the calls to KSPSolve? ...

[petsc-users] reuse Mumps factorization for multiple RHS

2020-08-04 Thread Anthony Paul Haas
Hello, When using Mumps to solve a linear system of equations (see below), can I reuse the factorization to solve for multiple RHS, i.e., can I use KSPSolve multiple times while only building a different RHS in between the calls to KSPSolve? Thanks, Anthony call KSPSetType(self%ksp,KSPPREONLY, ...
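
For illustration only, a minimal C sketch of the pattern Anthony describes (the original code is Fortran; the names SolveManyRHS, A, b[], x, nrhs are hypothetical, and the calls follow current PETSc conventions such as PetscCall and PCFactorSetMatSolverType, not the poster's code): set up KSPPREONLY with an LU preconditioner backed by MUMPS, let the first KSPSolve build the factorization, and reuse it for every further right-hand side, which is what Barry's reply above confirms happens automatically.

  #include <petscksp.h>

  /* Sketch only, not the poster's code: solve A x = b_i for several
     right-hand sides while reusing one MUMPS LU factorization.
     A, b[], x, nrhs are hypothetical and assumed assembled elsewhere. */
  PetscErrorCode SolveManyRHS(Mat A, Vec *b, Vec x, PetscInt nrhs)
  {
    KSP ksp;
    PC  pc;

    PetscFunctionBeginUser;
    PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetType(ksp, KSPPREONLY));                  /* one PC application = the direct solve */
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCLU));
    PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS)); /* older releases: PCFactorSetMatSolverPackage */
    PetscCall(KSPSetFromOptions(ksp));
    for (PetscInt i = 0; i < nrhs; i++) {
      /* The factorization is computed on the first KSPSolve and reused
         afterwards, because the operator set above never changes. */
      PetscCall(KSPSolve(ksp, b[i], x));
    }
    PetscCall(KSPDestroy(&ksp));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

Older releases spell the solver selection PCFactorSetMatSolverPackage; the behaviour is the same.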

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-24 Thread Barry Smith
...have many right hand sides it obviously pays to spend more time building the preconditioner so that each solve is faster. If you provide more information on your linear system we might have suggestions. CFD, so is your linear system a Poisson problem? Are you using geometric or algebraic multigrid with PETSc? ...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-24 Thread Harshad Ranadive
...preconditioner so that each solve is faster. > If you provide more information on your linear system we might have suggestions. CFD, so is your linear system a Poisson problem? Are you using geometric or algebraic multigrid with PETSc? If it is not a Poisson problem, how can you describe the linear system? ...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-11 Thread Barry Smith
...suggestions. > CFD, so is your linear system a Poisson problem? Are you using geometric or algebraic multigrid with PETSc? If it is not a Poisson problem, how can you describe the linear system? > Barry > > On Aug 10, 2016, at ...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-11 Thread Harshad Ranadive
...the linear system? > Barry > > On Aug 10, 2016, at 9:54 PM, Harshad Ranadive <harshadranad...@gmail.com> wrote: > > Hi All, > > I have currently added the PETSc library to our CFD solver. ...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-11 Thread Barry Smith
Barry > > On Aug 10, 2016, at 9:54 PM, Harshad Ranadive wrote: > > Hi All, > > I have currently added the PETSc library to our CFD solver. > > In this I need to use KSPSolve(...) multiple times for the same matrix A. ...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-10 Thread Barry Smith
...In this I need to use KSPSolve(...) multiple times for the same matrix A. I have read that PETSc does not support passing multiple RHS vectors in the form of a matrix and the only solution to this is calling KSPSolve multiple times as in the example given here: http://www...

Re: [petsc-users] Code performance for solving multiple RHS

2016-08-10 Thread Matthew Knepley
On Wed, Aug 10, 2016 at 9:54 PM, Harshad Ranadive wrote: > Hi All, > > I have currently added the PETSc library to our CFD solver. > > In this I need to use KSPSolve(...) multiple times for the same matrix A. I have read that PETSc does not support passing multiple RHS vectors in the form of a matrix ...

[petsc-users] Code performance for solving multiple RHS

2016-08-10 Thread Harshad Ranadive
Hi All, I have currently added the PETSc library to our CFD solver. In this I need to use KSPSolve(...) multiple times for the same matrix A. I have read that PETSc does not support passing multiple RHS vectors in the form of a matrix and the only solution to this is calling KSPSolve multiple times as in the example given here: http://www...
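
As a companion to the MUMPS sketch earlier in this digest, a hedged sketch of the loop Harshad describes for an iterative solver (the names SolvePerStep and nsteps are hypothetical; this is not his code): the KSP and its preconditioner are set up once for the fixed matrix A, and each new right-hand side then only costs a KSPSolve.

  #include <petscksp.h>

  /* Sketch only (hypothetical names, not Harshad's code): one KSP and one
     preconditioner setup for a fixed matrix A, then one KSPSolve per
     right-hand side, e.g. per time step of a CFD run. */
  PetscErrorCode SolvePerStep(Mat A, Vec b, Vec x, PetscInt nsteps)
  {
    KSP ksp;

    PetscFunctionBeginUser;
    PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
    PetscCall(KSPSetOperators(ksp, A, A)); /* same A for every solve */
    PetscCall(KSPSetFromOptions(ksp));     /* choose ksp/pc type at run time */
    PetscCall(KSPSetUp(ksp));              /* preconditioner is built once, here */
    for (PetscInt step = 0; step < nsteps; step++) {
      /* ... refill b for this step (VecSetValues + assembly) ... */
      PetscCall(KSPSolve(ksp, b, x));      /* reuses the existing preconditioner */
    }
    PetscCall(KSPDestroy(&ksp));
    PetscFunctionReturn(PETSC_SUCCESS);
  }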

Re: [petsc-users] KSPSolve() gets slower when a preconditioner or Cholesky factor is re-used with many RHS vectors.

2014-11-16 Thread Matthew Knepley
On Sun, Nov 16, 2014 at 10:11 AM, Evan Um wrote: > Dear Matthew, > > Thanks for your comments. I will prepare a log summary. To activate a separate log stage for each iteration (i.e. each RHS vector), I tried the code below, but got an error. Could you give me a comment? Does PetscLogView() allow users to use different output file names such that each ...

Re: [petsc-users] KSPSolve() gets slower when a preconditioner or Cholesky factor is re-used with many RHS vectors.

2014-11-16 Thread Evan Um
Dear Matthew, Thanks for your comments. I will prepare a log summary. To activate a separate log stage for each iteration (i.e. each RHS vector), I tried the code below but got an error. Could you give me a comment? Does PetscLogView() allow users to use different output file names such that each ...
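
A hedged sketch of one way to do what Evan asks (the function name SolveWithStages and the file name log_all_rhs.txt are hypothetical, and the calls use current PETSc conventions such as PetscCall): register a log stage per right-hand side and push/pop it around each KSPSolve. Note that PetscLogView() writes the whole accumulated log to the viewer it is given, so the stages appear as separate sections of one report rather than as separate files.

  #include <petscksp.h>

  /* Sketch only (hypothetical names): one log stage per right-hand side so
     each KSPSolve is reported separately. Logging must be active, e.g. run
     with -log_view or call PetscLogDefaultBegin() after PetscInitialize().
     With very many right-hand sides, a handful of stages may be preferable,
     since every registered stage adds bookkeeping. */
  PetscErrorCode SolveWithStages(KSP ksp, Vec *b, Vec x, PetscInt nrhs)
  {
    PetscViewer viewer;

    PetscFunctionBeginUser;
    for (PetscInt i = 0; i < nrhs; i++) {
      PetscLogStage stage;
      char          name[64];

      PetscCall(PetscSNPrintf(name, sizeof(name), "rhs_%d", (int)i));
      PetscCall(PetscLogStageRegister(name, &stage));
      PetscCall(PetscLogStagePush(stage));
      PetscCall(KSPSolve(ksp, b[i], x));
      PetscCall(PetscLogStagePop());
    }
    /* One combined report: PetscLogView() writes the accumulated log,
       with one section per stage, to the given viewer. */
    PetscCall(PetscViewerASCIIOpen(PETSC_COMM_WORLD, "log_all_rhs.txt", &viewer));
    PetscCall(PetscLogView(viewer));
    PetscCall(PetscViewerDestroy(&viewer));
    PetscFunctionReturn(PETSC_SUCCESS);
  }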

Re: [petsc-users] KSPSolve() gets slower when a preconditioner or Cholesky factor is re-used with many RHS vectors.

2014-11-16 Thread Matthew Knepley
On Sat, Nov 15, 2014 at 9:24 PM, Evan Um wrote: > Dear PETSc users, > > I would like to show you a performance issue when a Cholesky factor is re-used as a direct solver or preconditioner many times with many right-hand side vectors. Can anyone suggest a solution to this issue? > In advance, thanks for your help. ...

[petsc-users] KSPSolve() gets slower when a preconditioner or Cholesky factor is re-used with many RHS vectors.

2014-11-15 Thread Evan Um
Dear PETSc users, I would like to show you a performance issue when a Cholesky factor is re-used as a direct solver or preconditioner many times with many right-hand side vectors. Can anyone suggest a solution to this issue? In advance, thanks for your help. Regards, Evan Example 1: I used MU...

[petsc-users] Using Petsc with multiple RHS

2011-12-22 Thread Barry Smith
...I will use thousands of them; moreover it will require a lot of memory to store them as a dense matrix. > On 11.12.2011 18:50, Barry Smith wrote: >> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: >> One by one > I'm wondering why? All main direct packages like MUMPS, SuperLU_DIST, PaSTiX support multiple RHS. ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Alexander Grayver
...common to have sparse RHS. I am wondering how it is in other disciplines? I don't know about other solvers, but when I use multiple RHS in MUMPS it works several times faster than solving for them sequentially. It is just my experience. And I also can imagine that using sparsity is another obvious ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Xiangdong Liang
...solve for many RHS very quickly. > Surely I don't expect my solution to be sparse, and it is not, but at least in electromagnetics it is pretty common to have sparse RHS. > I am wondering how it is in other disciplines? > > I don't know about other solvers, but when I use multiple RHS in MUMPS ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Alexander Grayver
On 11.12.2011 18:50, Barry Smith wrote: > On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: > One by one. I'm wondering why? All main direct packages like MUMPS, SuperLU_DIST, PaSTiX support multiple RHS. > We do not handle a sparse right hand side. Since I already transfer ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Hong Zhang
...dense matrix. >> On 11.12.2011 18:50, Barry Smith wrote: >>> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: >>> One by one >> I'm wondering why? All main direct packages like MUMPS, SuperLU_DIST, PaSTiX support multiple RHS. ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Barry Smith
...lot of memory to store them as a dense matrix. > On 11.12.2011 18:50, Barry Smith wrote: >> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: >> One by one > I'm wondering why? All main direct packages like MUMPS, SuperLU_DIST, PaSTiX support multiple RHS. ...

[petsc-users] Using Petsc with multiple RHS

2011-12-12 Thread Matthew Knepley
...them; moreover it will require a lot of memory to store them as a dense matrix. > On 11.12.2011 18:50, Barry Smith wrote: >> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: >> One by one > I'm wondering why? All main direct packages like MUMPS, SuperLU_DIST, PaSTiX support multiple RHS. ...

[petsc-users] Using Petsc with multiple RHS

2011-12-11 Thread Alexander Grayver
Hello, I used to use MUMPS directly with sparse multiple RHS. Now I use MUMPS through the PETSc interface and the solution for multiple RHS takes 1.5-2 times longer (MatMatSolve). My first question is whether you use multiple RHS internally or solve one by one? My second question concerns the option ...
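
For reference, a minimal sketch of the MatMatSolve path Alexander is timing, written against the current C API (all names here, e.g. BlockSolve, F, B, X, are hypothetical, and the constant spellings such as MATSOLVERMUMPS postdate this 2011 thread): factor the sparse A once with MUMPS, then hand a dense block of right-hand sides to MatMatSolve.

  #include <petscksp.h>

  /* Sketch only (hypothetical names): factor sparse A once with MUMPS and
     solve a whole block of right-hand sides via MatMatSolve. B and X are
     dense (MATDENSE) matrices with one column per RHS / solution. */
  PetscErrorCode BlockSolve(Mat A, Mat B, Mat X)
  {
    Mat           F;
    IS            isrow, iscol;
    MatFactorInfo info;

    PetscFunctionBeginUser;
    PetscCall(MatGetFactor(A, MATSOLVERMUMPS, MAT_FACTOR_LU, &F));
    PetscCall(MatGetOrdering(A, MATORDERINGND, &isrow, &iscol));
    PetscCall(MatFactorInfoInitialize(&info));
    PetscCall(MatLUFactorSymbolic(F, A, isrow, iscol, &info));
    PetscCall(MatLUFactorNumeric(F, A, &info));
    PetscCall(MatMatSolve(F, B, X)); /* every column of B solved with the one factorization */
    PetscCall(ISDestroy(&isrow));
    PetscCall(ISDestroy(&iscol));
    PetscCall(MatDestroy(&F));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

Whether this is faster than looping over single solves depends on how the installed PETSc/MUMPS combination handles the block, which is what Alexander is measuring.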

[petsc-users] Using Petsc with multiple RHS

2011-12-11 Thread Barry Smith
On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote: > Hello, > > I used to use MUMPS directly with sparse multiple RHS. Now I use MUMPS through the PETSc interface and the solution for multiple RHS takes 1.5-2 times longer (MatMatSolve). > My first question is whether you use multiple RHS internally or solve one by one? ...

[petsc-users] Question about KSPSolve for multiple rhs

2011-10-26 Thread Bogdan Dita
Hello, First of all I'm new to PETSc, so please be patient with me. I'm trying to solve 2 linear systems with the same A matrix using superlu_dist, and so I'm using the same KSP context for both systems. The matrix is a square matrix of order 84719 with 351289 nonzero elements. The time for the ...

[petsc-users] Question about KSPSolve for multiple rhs

2011-10-26 Thread Barry Smith
On Oct 26, 2011, at 10:34 AM, Bogdan Dita wrote: > > Hello, > > First of all I'm new to PETSc, so please be patient with me. > I'm trying to solve 2 linear systems with the same A matrix using superlu_dist, and so I'm using the same KSP context for both systems. > The matrix is a square matrix of order 84719 with 351289 nonzero elements. ...

KSPSolve(), multiple rhs and preconditioner

2009-03-26 Thread Matthew Knepley
On Thu, Mar 26, 2009 at 1:03 AM, Yujie wrote: > Hi, PETSc developers > > I am wondering what the difference is when iterative preconditioners (such as ILU, sparse approximate inverse, and so on) are used with single and multiple RHS in KSPSolve(). > > In the multiple RHS case, is the preconditioner built for each RHS? ...

KSPSolve(), multiple rhs and preconditioner

2009-03-25 Thread Yujie
Hi, PETSc developers, I am wondering what the difference is when iterative preconditioners (such as ILU, sparse approximate inverse, and so on) are used with single and multiple RHS in KSPSolve(). In the multiple RHS case, is the preconditioner built for each RHS? Thanks a lot. Regards, Yujie

multiple rhs

2009-03-16 Thread Hong Zhang
...>>> The PLAPACK website looks like it hasn't been updated since 2007. Maybe PLAPACK is in need of some maintenance? You said "nonsquare"; is PLAPACK working for you for square matrices? >> Yes, it works for square matrices. >> See ~petsc/src/mat/examples/tests/ex103.c and ex107.c >> Hong

multiple rhs

2009-03-15 Thread David Fuentes
On Sat, 14 Mar 2009, Hong Zhang wrote: >> Very many thanks for your efforts on this, Barry. The PLAPACK website looks like it hasn't been updated since 2007. Maybe PLAPACK is in need of some maintenance? You said "nonsquare"; is PLAPACK working for you for square matrices? > Yes, it works for square matrices. ...

multiple rhs

2009-03-14 Thread Hong Zhang
> Very many thanks for your efforts on this, Barry. The PLAPACK website looks like it hasn't been updated since 2007. Maybe PLAPACK is in need of some maintenance? You said "nonsquare"; is PLAPACK working for you for square matrices? Yes, it works for square matrices. See ~petsc/src/mat/examples/tests/ex103.c and ex107.c. ...

multiple rhs

2009-03-14 Thread David Fuentes
Very many thanks for your efforts on this, Barry. The PLAPACK website looks like it hasn't been updated since 2007. Maybe PLAPACK is in need of some maintenance? You said "nonsquare"; is PLAPACK working for you for square matrices? Thanks again, df >> [0]PETSC ERROR: ...

multiple rhs

2009-03-13 Thread Barry Smith
On Mar 12, 2009, at 8:11 PM, David Fuentes wrote: > I'm getting PLAPACK errors in "external library" with MatMatMult_MPIDense_MPIDense. How is memory handled for a matrix of type MATMPIDENSE? Are all NxN entries allocated and ready for use at time of creation? Yes ...

multiple rhs

2009-03-12 Thread Hong Zhang
> What solver would I use to do a factorization of a dense parallel matrix with PLAPACK? MAT_SOLVER_PLAPACK. See ~petsc-3.0.0/src/mat/examples/tests/ex103.c Hong > I don't see an MPI_SOLVER_PLAPACK? > On Thu, 12 Mar 2009, Hong Zhang wrote: >> Is MatCreateMPIDense the recommended matrix type to interface with MUMPS? ...

multiple rhs

2009-03-12 Thread David Fuentes
I'm getting PLAPACK errors in "external library" with MatMatMult_MPIDense_MPIDense. How is memory handled for a matrix of type MATMPIDENSE? Are all NxN entries allocated and ready for use at time of creation? Or do I have to MatInsertValues then Assemble to be ready to use a matrix ...
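
Barry's answer above (2009-03-13) is that the full storage is allocated at creation. A minimal sketch using the current constructor MatCreateDense (the size N and the function name MakeDense are arbitrary illustration values, not from the thread): pass NULL for the data pointer so PETSc allocates the local block, then MatSetValues/assembly only fill entries.

  #include <petscmat.h>

  /* Sketch only (size N and names are arbitrary): create a parallel dense
     matrix with the current constructor MatCreateDense. Passing NULL for
     the data pointer lets PETSc allocate the full local block at creation;
     MatSetValues plus assembly only fill entries, they do not change the
     storage. */
  PetscErrorCode MakeDense(Mat *A)
  {
    PetscInt    N = 1000; /* hypothetical global size */
    PetscMPIInt rank;

    PetscFunctionBeginUser;
    PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N, NULL, A));
    if (rank == 0) PetscCall(MatSetValue(*A, 0, 0, 1.0, INSERT_VALUES)); /* example entry */
    PetscCall(MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY));
    PetscFunctionReturn(PETSC_SUCCESS);
  }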

multiple rhs

2009-03-12 Thread David Fuentes
Hi Hong, What solver would I use to do a factorization of a dense parallel matrix with PLAPACK? I don't see an MPI_SOLVER_PLAPACK? On Thu, 12 Mar 2009, Hong Zhang wrote: >>> Is MatCreateMPIDense the recommended matrix type to interface with MUMPS? >>> Does it use sparse direct storage or allocate the full n x n matrix? ...

multiple rhs

2009-03-12 Thread Hong Zhang
>> Is MatCreateMPIDense the recommended matrix type to interface with MUMPS? >> Does it use sparse direct storage or allocate the full n x n matrix? > No, MUMPS is "sparse direct" so it uses MPIAIJ. For an MPI dense matrix, you can use PLAPACK. Hong >> df >> On Thu, 12 Mar 2009, Matthew Knepley wrote: ...

multiple rhs

2009-03-12 Thread Matthew Knepley
On Thu, Mar 12, 2009 at 1:50 PM, David Fuentes wrote: > Thanks Matt, > Is MatCreateMPIDense the recommended matrix type to interface with MUMPS? > Does it use sparse direct storage or allocate the full n x n matrix? No, MUMPS is "sparse direct" so it uses MPIAIJ. Matt > df > On Thu, ...

multiple rhs

2009-03-12 Thread David Fuentes
Thanks Matt, Is MatCreateMPIDense the recommended matrix type to interface with MUMPS? Does it use sparse direct storage or allocate the full n x n matrix? df On Thu, 12 Mar 2009, Matthew Knepley wrote: > You can try using a sparse direct solver like MUMPS instead of PETSc LU. > Matt

multiple rhs

2009-03-12 Thread Matthew Knepley
You can try using a sparse direct solver like MUMPS instead of PETSc LU. Matt On Thu, Mar 12, 2009 at 9:17 AM, David Fuentes wrote: > Thanks Hong, > The complete error message is attached. I think I just had too big of a matrix. The matrix I'm trying to factor is 327680 x 327680. > [0]PETSC ERROR: ...

multiple rhs

2009-03-12 Thread David Fuentes
Thanks Hong, The complete error message is attached. I think I just had too big of a matrix. The matrix I'm trying to factor is 327680 x 327680. [0]PETSC ERROR: --------------------- Error Message --------------------- [0]PETSC ERROR: Out of memory. This could be due to allocating ...

multiple rhs

2009-03-12 Thread Hong Zhang
David, I do not see any problem with the calling sequence. The memory is determined in MatLUFactorSymbolic(). Does your code crash within MatLUFactorSymbolic()? Please send us the complete error message. Hong On Wed, 11 Mar 2009, David Fuentes wrote: > Hello, > I have a sparse matrix, A, with which I want to solve for multiple right-hand sides with a direct solver. ...

multiple rhs

2009-03-11 Thread David Fuentes
Hello, I have a sparse matrix, A, with which I want to solve for multiple right-hand sides with a direct solver. Is this the correct call sequence? MatGetFactor(A,MAT_SOLVER_PETSC,MAT_FACTOR_LU,&Afact); IS isrow,iscol; MatGetOrdering(A,MATORDERING_ND,&isrow,&iscol); MatLUFactorSymbolic( ...
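
Since the preview cuts off mid-call, here is a hedged reconstruction of how such a sequence is typically completed (this is not David's exact code; names like DirectSolveManyRHS and nrhs are hypothetical, and the constants use current spellings, MATSOLVERPETSC and MATORDERINGND, where the original uses MAT_SOLVER_PETSC and MATORDERING_ND): finish the symbolic and numeric factorization, then call MatSolve() once per right-hand side.

  #include <petscmat.h>

  /* Hedged reconstruction, not David's exact code (hypothetical names;
     current constant spellings MATSOLVERPETSC / MATORDERINGND instead of
     the 2009 MAT_SOLVER_PETSC / MATORDERING_ND): symbolic factorization,
     numeric factorization, then one MatSolve per right-hand side. */
  PetscErrorCode DirectSolveManyRHS(Mat A, Vec *b, Vec *x, PetscInt nrhs)
  {
    Mat           Afact;
    IS            isrow, iscol;
    MatFactorInfo info;

    PetscFunctionBeginUser;
    PetscCall(MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &Afact));
    PetscCall(MatGetOrdering(A, MATORDERINGND, &isrow, &iscol));
    PetscCall(MatFactorInfoInitialize(&info));
    PetscCall(MatLUFactorSymbolic(Afact, A, isrow, iscol, &info));
    PetscCall(MatLUFactorNumeric(Afact, A, &info));
    for (PetscInt i = 0; i < nrhs; i++) PetscCall(MatSolve(Afact, b[i], x[i]));
    PetscCall(ISDestroy(&isrow));
    PetscCall(ISDestroy(&iscol));
    PetscCall(MatDestroy(&Afact));
    PetscFunctionReturn(PETSC_SUCCESS);
  }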