On Sun, Mar 1, 2015 at 12:30 AM, Sun, Hui <hus...@ucsd.edu> wrote:

> Thank you Barry. I have two more questions:
>
> 1) If I have a DMDA and I use KSPSetComputeOperators and KSPSetComputeRHS
> to set up the matrices and rhs, and I use geometric mg, what if I want to
> change my rhs many times? Should I write many KSPSetComputeRHS callbacks
> and register them with the ksp? Or is there a simple way to just register
> the rhs with the ksp as a vector?
>

If you just give a vector, you can pass it straight to KSPSolve() each time
instead of registering a callback. The same goes for the matrix: give only
the fine-grid operator and use -pc_mg_galerkin to have the coarse levels
projected from it.
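
A minimal sketch (error checking omitted; da is your DMDA, and ComputeMatrix,
FillRHS, and nrhs are hypothetical names) of reusing one KSP for many right
hand sides:

  KSP ksp;
  Vec x, b;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, da);
  KSPSetComputeOperators(ksp, ComputeMatrix, NULL);
  KSPSetFromOptions(ksp);   /* picks up -pc_type mg -pc_mg_galerkin etc. */
  DMCreateGlobalVector(da, &x);
  VecDuplicate(x, &b);
  for (PetscInt i = 0; i < nrhs; i++) {
    FillRHS(da, b, i);      /* hypothetical helper that sets the entries of b */
    KSPSolve(ksp, b, x);    /* operators are built once and reused */
  }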


> 2) How do I create a Mat whose cols follow the DMDA parallelization and
> whose rows are serial?
>

Sparse matrices are always divided by rows across processors; the column
layout only describes the layout of the input vector, so you cannot make the
rows serial while the columns follow the DMDA distribution.
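
For illustration, a small sketch (assuming a DMDA da) that shows each rank's
contiguous block of rows for a matrix created from it:

  Mat      A;
  PetscInt rstart, rend;

  DMCreateMatrix(da, &A);                   /* matrix with the DMDA's layout */
  MatGetOwnershipRange(A, &rstart, &rend);  /* this rank owns rows [rstart, rend) */
  PetscPrintf(PETSC_COMM_SELF, "local rows: [%d, %d)\n", (int)rstart, (int)rend);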

  Thanks,

    Matt


> By the way, I've figured out and fixed the bugs in my code concerning
> using mg with a DMDA having 4 dof. They had to do with the interpolations.
> Now I can see that mg works well with a 4-dof DMDA.
>
> Best,
> Hui
>
> ________________________________________
> From: Barry Smith [bsm...@mcs.anl.gov]
> Sent: Saturday, February 28, 2015 9:35 AM
> To: Sun, Hui
> Cc: petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] DMDA with dof=4, multigrid solver
>
> > On Feb 27, 2015, at 7:25 PM, Sun, Hui <hus...@ucsd.edu> wrote:
> >
> > Thank you Barry. Another question: I observe that in those ksp examples,
> whenever multigrid is used, DMDA is also used; moreover,
> KSPSetComputeOperators and KSPSetComputeRHS are also used.
> >
> > Is it true that
> > 1) Only DMDA can use mg?
>
> >    No, this is not true.
>
> > 2) We have to set up matrices and rhs using KSPSetComputeOperators and
> KSPSetComputeRHS?
>
> >    No, you do not have to.
>
> > We cannot create a matrix and add it to KSP if we want to use mg?
>
> >     Yes, you can.
>
>    There are many, many variants of multigrid one can do with PETSc; we
> don't have the time to provide examples of all the possibilities.
>
> More details
>
> > 1) Only DMDA can use mg?
>
>    Because DMDA provides structured grids with easy interpolation between
> levels, and because it is easy for users to write Jacobians on it, we have
> many examples that use the DMDA. However, as long as YOU (or something)
> can provide the interpolation between the multigrid levels, you can use
> multigrid. For example, PCGAMG uses algebraic multigrid to generate the
> interpolations. If you have your own interpolations, you can provide them
> with PCMGSetInterpolation(), as in the sketch below (when you use PCMG
> with a DMDA, PETSc essentially handles those details automatically for
> you).
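>
>    A minimal sketch (nlevels and the interpolation matrices P[l] are
> hypothetical; error checking omitted):
>
>   PC pc;
>
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCMG);
>   PCMGSetLevels(pc, nlevels, NULL);
>   for (PetscInt l = 1; l < nlevels; l++) {
>     /* P[l] interpolates from level l-1 (coarser) to level l (finer) */
>     PCMGSetInterpolation(pc, l, P[l]);
>   }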
>
> > 2) We have to set up matrices and rhs using KSPSetComputeOperators and
> KSPSetComputeRHS?
>
>    Normally with geometric multigrid one discretizes the operator on each
> level of the grid, so the user has to provide several matrices (one for
> each level). KSPSetComputeOperators() is ONE way the user can provide
> them. You can also provide them by calling PCMGGetSmoother(pc,level,&ksp)
> and then KSPSetOperators(ksp,...) for each of the levels;
> KSPSetComputeOperators() essentially does the bookkeeping for you. A
> sketch of the by-hand approach follows.
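>
>    A minimal sketch (the per-level matrices A[l] and nlevels are
> hypothetical; level 0 is the coarsest):
>
>   PC  pc;
>   KSP smoother;
>
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCMG);
>   PCMGSetLevels(pc, nlevels, NULL);
>   for (PetscInt l = 0; l < nlevels; l++) {
>     PCMGGetSmoother(pc, l, &smoother);
>     KSPSetOperators(smoother, A[l], A[l]);  /* operator discretized on level l */
>   }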
>
> > We cannot create a matrix and add it to KSP if we want to use mg?
>
>     As I said in 2, normally multigrid requires you to provide a
> discretized operator at each level. But with Galerkin coarse grids (which
> is what algebraic multigrid uses, and which can also be used by geometric
> multigrid) the user does not provide the coarser grid operators; instead
> the code computes them automatically from the formula R*A*P, where P is
> the interpolation operator used in multigrid and R is the restriction
> operator (usually the transpose of P).
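>
>    A one-line sketch of turning this on from code (equivalent to passing
> -pc_mg_galerkin on the command line):
>
>   PCMGSetGalerkin(pc, PETSC_TRUE);  /* coarse operators computed as R*A*P */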
>
>    If you are looking for a simple automatic multigrid then you want to
> use PCGAMG in PETSc; it does algebraic multigrid and doesn't require you
> to provide interpolations or coarser operators. However, algebraic
> multigrid doesn't work for all problems, though it does work for many. Try
> it with -pc_type gamg.
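>
>    The same choice made in code (assuming ksp already exists) would look
> like:
>
>   PC pc;
>
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCGAMG);  /* same effect as -pc_type gamg */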
>
>   Barry
>
> >
> > Best,
> > Hui
> >
> > ________________________________________
> > From: Barry Smith [bsm...@mcs.anl.gov]
> > Sent: Friday, February 27, 2015 5:11 PM
> > To: Sun, Hui
> > Cc: petsc-users@mcs.anl.gov
> > Subject: Re: [petsc-users] DMDA with dof=4, multigrid solver
> >
> >> On Feb 27, 2015, at 6:36 PM, Sun, Hui <hus...@ucsd.edu> wrote:
> >>
> >> I'm trying to work on 4 Poisson equations defined on a DMDA grid, so
> the parameter dof in DMDACreate3d should be 4, and I've set the stencil
> width to be 4 and the stencil type to be star.
> >
> >  Use a stencil width of 1, not 4. The stencil width is measured in grid
> points, and each grid point automatically carries all of its dof.
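> >
> >  A minimal sketch (nx, ny, nz are hypothetical) of the corresponding
> > DMDACreate3d() call:
> >
> >   DM da;
> >
> >   DMDACreate3d(PETSC_COMM_WORLD,
> >                DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
> >                DMDA_STENCIL_STAR,
> >                nx, ny, nz,                                /* global grid size */
> >                PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,  /* procs per direction */
> >                4,    /* dof: four coupled fields per grid point */
> >                1,    /* stencil width: one grid point, carrying all 4 dof */
> >                NULL, NULL, NULL, &da);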
> >>
> >> If I run the code with -pc_type ilu and -ksp_type gmres, it works
> alright.
> >>
> >> However, if I run with -pc_type mg, it gives me an error saying that
> when it is doing MatSetValues, the argument is out of range, and there is a
> new nonzero at (60,64) in the matrix. However, that new nonzero is expected
> to be there: row 60 corresponds to i=15 and c=0 in the x direction, and
> column 64 corresponds to i=16 and c=0 in the x direction. So they are next
> to each other, and the star stencil with width 1 should include that. I
> have also checked the memory allocations and found no problem.
> >>
> >> So I'm wondering if there is any problem with using multigrid on a DMDA
> with dof greater than 1?
> >
> >  No, it handles dof > 1 fine.
> >
> >  Send your code.
> >
> >  Barry
> >
> >>
> >> Thank you!
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
