Performance of MatMatSolve

2009-03-13 Thread Hong Zhang
David, You may run with the option '-log_summary' and check which function dominates the time. I suspect the symbolic factorization, because it is implemented sequentially in MUMPS. If this is the case, you may switch to superlu_dist, which supports parallel symbolic factorization in the latest release.
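
For illustration, a run combining both suggestions might look like the following (an editor's sketch, not from the original message; assumes an executable named ./app and 2009-era PETSc option names):

    mpiexec -n 4 ./app -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist -log_summary

-log_summary prints a per-function timing table at the end of the run; comparing the MatLUFactorSym and MatLUFactorNum events there shows whether the symbolic factorization really is the bottleneck.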

multiple rhs

2009-03-13 Thread Barry Smith
On Mar 12, 2009, at 8:11 PM, David Fuentes wrote: > I'm getting PLAPACK errors in "external library" with > MatMatMult_MPIDense_MPIDense. How is memory handled for a matrix > of type MATMPIDENSE? Are all NxN entries allocated and ready for > use at time of creation? Yes.
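
As a concrete sketch (editor's illustration, assuming the 2009-era constructor MatCreateMPIDense; check the manual page for your version): passing PETSC_NULL for the data pointer makes PETSc allocate each process's full local block of the dense matrix at creation time.

    Mat      A;
    PetscInt N = 1000;   /* hypothetical global size, for illustration */
    /* Each process allocates all entries of its local rows up front;
       dense storage, so there is no sparsity structure to preallocate. */
    MatCreateMPIDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                      N, N, PETSC_NULL, &A);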

matrix assembling time

2009-03-13 Thread Barry Smith
On Mar 13, 2009, at 12:48 PM, Ravi Kannan wrote: > Hi, > This is Ravi Kannan from CFD Research Corporation. One basic question on > the ordering of linear solvers in PETSc: If my A matrix (in AX=B) is a > sparse matrix and the bandwidth of A (i.e. the distance between non-zero > elements) is high, does PETSc reorder the matrix/matrix-equations so as to reduce the bandwidth?

Performance of MatMatSolve

2009-03-13 Thread David Fuentes
The majority of time in my code is spent in MatMatSolve. I'm running MatMatSolve in parallel using MUMPS for the factored matrix. Using top, I've noticed that during the MatMatSolve the majority of the load seems to be on the root process. Is this expected? Or do I most likely have a problem?
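
(Editor's note, not from the original thread: load concentrating on rank 0 is at least consistent with MUMPS's default behavior of taking the dense right-hand-side matrix centralized on the host process and returning the solution there.) For reference, a sketch of one way to drive MatMatSolve through a MUMPS factorization, using 2009-era calls and with error checking omitted; A is the assembled system matrix, B and X are dense right-hand-side and solution matrices:

    KSP ksp;  PC pc;  Mat F;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPSetType(ksp, KSPPREONLY);              /* factor-and-solve only */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    PCFactorSetMatSolverPackage(pc, "mumps"); /* use MUMPS for the LU */
    KSPSetUp(ksp);                            /* performs the factorization */
    PCFactorGetMatrix(pc, &F);                /* the factored matrix */
    MatMatSolve(F, B, X);                     /* solve for all columns of B */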

Mumps, BoomerAMG, or PETSc?

2009-03-13 Thread REN
Hi Matthew and Hong, I will use PETSc for my four-year project. :) Thanks, Zhengyong Ren. On Fri, 2009-03-13 at 08:46 -0500, Hong Zhang wrote: > Zhengyong, > > > I have known of your excellent codes for several months. I want to use MUMPS as > > a direct solver for multi-source problems and BoomerAMG for a real-value-based

matrix assembling time

2009-03-13 Thread Ravi Kannan
Hi, This is Ravi Kannan from CFD Research Corporation. One basic question on the ordering of linear solvers in PETSc: if my A matrix (in AX=B) is a sparse matrix and the bandwidth of A (i.e. the distance between non-zero elements) is high, does PETSc reorder the matrix/matrix-equations so as to reduce the bandwidth?
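
(Editor's note, not part of the original message.) PETSc does not reorder the assembled matrix itself, but its factorization-based preconditioners can apply a fill-reducing ordering at factorization time. Assuming 2009-era option names, a sketch:

    # Reverse Cuthill-McKee, a bandwidth-reducing ordering, with LU
    -pc_type lu -pc_factor_mat_ordering_type rcm
    # nested dissection with ILU
    -pc_type ilu -pc_factor_mat_ordering_type nd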

Mumps, BoomerAMG, or PETSc?

2009-03-13 Thread Hong Zhang
Zhengyong, > I have known of your excellent codes for several months. I want to use MUMPS as > a direct solver for multi-source problems and BoomerAMG as a real-value-based > iterative solver. I went through the PETSc docs; they showed that PETSc > offers an easy, high-level interface to these two packages.
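
Both packages are indeed reachable from the command line once PETSc is configured with them (editor's sketch, assuming 2009-era option names):

    # direct solve through MUMPS
    -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps
    # BoomerAMG from hypre as the preconditioner
    -ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg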
