> non-overlapping subdomains.
>
> How will you ensure that the coarse basis functions are a partition of unity?
Yes, I do some weighting.
more natural because we avoid factorization on the
non-overlapping subdomains.
>>>>>>>>>>> On Mon, Feb 20, 2012 at 12:59 AM, Hui Zhang <mike.hui.zhang at hotmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>> On Feb 20, 2012, at 12:41 AM, Dmitry Karpeev wrote:
>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, Feb 19, 2012 at 3:08 PM, Hui Zhang <mike.hui.zhang at hotmail.com> wrote:
>>>>>>>>>>>> I have a new problem: the results from ASM and GASM are different
>>>>>>>>>>>> and it seems
>>>>>>>>>>>> GASM has something wrong with SetModifySubMatrices. Numerical
>>>>>>>>>>>> tests are with
>>>>>>>>>>>> each subdomain supported only by one processor. There are no
>>>>>>>>>>>> problems when
>>>>>>>>>>>> I did not modify submatrices. But when I modify submatrices,
>>>>>>>>>>>> there are problems
>>>>>>>>>>>> with GASM but no problems with ASM.
>>>>>>>>>>>>
>>>>>>>>>>>> For example, I use two subdomains. In the first case each
>>>>>>>>>>>> subdomain is supported by
>>>>>>>>>>>> one processor and there seems to be no problem with GASM. But when I
>>>>>>>>>>>> run my program
>>>>>>>>>>>> with only one proc. so that it supports both of the two
>>>>>>>>>>>> subdomains, the iteration
>>>>>>>>>>>> number is different from the first case and is much larger. On
>>>>>>>>>>>> the other hand
>>>>>>>>>>>> ASM has no such problem.
>>>>>>>>>>>>
>>>>>>>>>>>> Are the solutions the same?
>>>>>>>>>>>> What problem are you solving?
>>>>>>>>>>>
>>>>>>>>>>> Yes, the solutions are the same. That's why ASM gives the same
>>>>>>>>>>> results with one or
>>>>>>>>>>> two processors. But GASM did not.
>>>>>>>>>>> Sorry, I wasn't clear: ASM and GASM produced different solutions in
>>>>>>>>>>> the case of two domains per processor?
>>>>>>>>>>> I'm solving the Helmholtz equation. Maybe
>>>>>>>>>>> I can prepare a simpler example to show this difference.
>>>>>>>>>>> That would be helpful.
>>>>>>>>>>> Thanks.
>>>>>>>>>>>
>>>>>>>>>>> Dmitry.
>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Dmitry.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Feb 15, 2012, at 6:46 PM, Dmitry Karpeev wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> You should be able to.
>>>>>>>>>>>>> This behavior is the same as in PCASM,
>>>>>>>>>>>>> except in GASM the matrices live on subcommunicators.
>>>>>>>>>>>>> I am in transit right now, but I can take a closer look on Friday.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Dmitry
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Feb 15, 2012, at 8:07, Hui Zhang <mike.hui.zhang at hotmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Feb 15, 2012, at 11:19 AM, Hui Zhang wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi Dmitry,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks a lot! Currently, I'm not using ISColoring. Just comes
>>>>>>>>>>>>>>> another question
>>>>>>>>>>>>>>> on PCGASMSetModifySubMatrices(). The user provided function has
>>>>>>>>>>>>>>> the prototype
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> func (PC pc,PetscInt nsub,IS *row,IS *col,Mat *submat,void
>>>>>>>>>>>>>>> *ctx);
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I think the columns from the parameter 'col' are always the
>>>>>>>>>>>>>>> same as the rows
>>>>>>>>>>>>>>> from the parameter 'row'. Because PCGASMSetLocalSubdomains()
>>>>>>>>>>>>>>> only accepts
>>>>>>>>>>>>>>> index sets but not rows and columns. Have I misunderstood
>>>>>>>>>>>>>>> something?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> As I tested, the row and col are always the same.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I have a new question. Am I allowed to SetLocalToGlobalMapping()
>>>>>>>>>>>>>> for the submat's
>>>>>>>>>>>>>> in the above func()?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks,
>>>>>>>>>>>>>> Hui
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks,
>>>>>>>>>>>>>>> Hui
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Feb 11, 2012, at 3:36 PM, Dmitry Karpeev wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Yes, that's right.
>>>>>>>>>>>>>>>> There is no good way to help the user assemble the subdomains
>>>>>>>>>>>>>>>> at the moment beyond the 2D stuff.
>>>>>>>>>>>>>>>> It is expected that they are generated from mesh subdomains.
>>>>>>>>>>>>>>>> Each IS does carry the subdomain's subcomm.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> There is ISColoringToList() that is supposed to convert a
>>>>>>>>>>>>>>>> "coloring" of indices to an array of ISs,
>>>>>>>>>>>>>>>> each having the indices with the same color and the subcomm
>>>>>>>>>>>>>>>> that supports that color. It is
>>>>>>>>>>>>>>>> largely untested, though. You could try using it and give us
>>>>>>>>>>>>>>>> feedback on any problems you encounter.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Dmitry.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Sat, Feb 11, 2012 at 6:06 AM, Hui Zhang <mike.hui.zhang at hotmail.com> wrote:
>>>>>>>>>>>>>>>> About PCGASMSetLocalSubdomains(), in the case of one subdomain
>>>>>>>>>>>>>>>> supported by
>>>>>>>>>>>>>>>> multiple processors, shall I always create the arguments
>>>>>>>>>>>>>>>> 'is[s]' and 'is_local[s]'
>>>>>>>>>>>>>>>> in a subcommunicator consisting of processors supporting the
>>>>>>>>>>>>>>>> subdomain 's'?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> The source code of PCGASMCreateSubdomains2D() seemingly does
>>>>>>>>>>>>>>>> so.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>>> Hui
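The callback prototype quoted in the thread can be fleshed out as a minimal sketch. Everything beyond the prototype itself is an illustrative placeholder (the diagonal shift, the unused context), not code from the discussion:

```c
#include <petscpc.h>

/* Hypothetical user callback for PCGASMSetModifySubMatrices().
 * The prototype is the one quoted above; the body is a placeholder
 * that merely shifts the diagonal of each subdomain matrix. */
PetscErrorCode ModifySubMatrices(PC pc, PetscInt nsub, IS *row, IS *col,
                                 Mat *submat, void *ctx)
{
  PetscErrorCode ierr;
  PetscInt       i;

  for (i = 0; i < nsub; i++) {
    /* As noted in the thread, row[i] and col[i] index the same set. */
    ierr = MatShift(submat[i], 1.0);CHKERRQ(ierr); /* illustrative value */
  }
  return 0;
}
```

Such a callback would presumably be registered with PCGASMSetModifySubMatrices(pc, ModifySubMatrices, ctx) before the solve.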
> but there is -snes_linesearch_type
> cp (critical point) that might be useful.
>
On May 15, 2012, at 4:32 PM, Avery Bingham wrote:
> I have also seen this behavior, and I think this might be related to the
> scaling of the variables in the nonlinear system. I am using PETSc through
> an application of MOOSE which allows for scaling of the variables. This
> scaling reduce
ly post-release.
- Peter
take place before a
steady state can be reached.
cubic model
cannot.
Please help me understand what conditions may lead to the above issue in the line search.
Regards,
BehZad
> values from the local range of one processor? that processor wouldn't be
> utilized unless you redistribute the matrix)
>
> What sizes and method are we talking about? Usually additional (compact)
> basis functions only make sense to add to one of a small number of processes.
>
>
>>> you need to manually determine the local row
>>> distribution after you do that. (for example, say you completely remove
>>> all the values from the local range of one processor? that processor
>>> wouldn't be utilized unless you redistribute the matrix)
>>>
Hi Barry.
On Mon, 14 May 2012, Barry Smith wrote:
> src/snes/examples/tutorials/ex22.c is an example of how one can set
> options in the program and then have the command line options override them.
Thanks! I am looking at that right now.
> PetscInitialize(&argc,&argv,PETSC_NULL,help);
>
>
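Barry's suggestion (program-set defaults that the command line can still override) might look roughly like the sketch below. This is a guess at the mechanism, not code copied from ex22.c; the option name and value are placeholders, and the key assumption is that re-inserting the command-line arguments after setting defaults lets them take precedence:

```c
#include <petsc.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, "help text");CHKERRQ(ierr);

  /* Program-chosen default (placeholder option name/value). */
  ierr = PetscOptionsSetValue("-ksp_type", "gmres");CHKERRQ(ierr);

  /* Re-insert the command line so anything given there overrides
     the default set above (assumed behavior; see ex22.c for the
     actual pattern). */
  ierr = PetscOptionsInsert(&argc, &argv, PETSC_NULL);CHKERRQ(ierr);

  /* ... create solvers, call *SetFromOptions(), solve ... */

  ierr = PetscFinalize();
  return 0;
}
```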
On 5/15/12 9:25 AM, Thomas Witkowski wrote:
> I made some comparisons of using umfpack, superlu, superlu_dist and
> mumps to solve systems with sparse matrices arising from finite
> element method. The sizes of the matrices range from around 5 to
> more than 3 million unknowns. I used 1, 2, 4
which are
easy enough to construct on your own.
PCGASM doesn't have the individual scatters, so that's of even less help
for you.
Dmitry.
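For comparison, PCASM does expose its subdomain solvers, which is one way to get at the subdomain problems. A hedged sketch, assuming `pc` is a PCASM preconditioner that has already been set up:

```c
/* Sketch: accessing the local subdomain solvers of a set-up PCASM.
 * Assumes `pc` is a PC of type PCASM and PCSetUp() has been called. */
KSP      *subksp;          /* one solver per local subdomain */
PetscInt  nlocal, first;   /* local count, global index of first subdomain */
ierr = PCASMGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
/* subksp[0..nlocal-1] now hold the subdomain problems on this process. */
```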
Alan,
Sorry, we don't have exactly that example. But you can start with
src/ksp/ksp/examples/tutorials/ex45.c, change the 2nd, 3rd, or 4th argument to
DMDA_BOUNDARY_PERIODIC, and then in ComputeMatrix() remove the appropriate cases
from if (i==0 || j==0 || k==0 || i==mx-1 || j==my-1 || k==mz-1)
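The change described above would look roughly like this in the DMDACreate3d() call; a sketch only, with the grid sizes and the non-boundary arguments as illustrative placeholders rather than values copied from ex45.c:

```c
/* Sketch: making the x direction periodic by changing the 2nd argument
 * of DMDACreate3d() to DMDA_BOUNDARY_PERIODIC (the 3rd and 4th arguments
 * control y and z).  Sizes below are placeholders. */
DM da;
ierr = DMDACreate3d(PETSC_COMM_WORLD,
                    DMDA_BOUNDARY_PERIODIC, /* x: now periodic */
                    DMDA_BOUNDARY_NONE,     /* y */
                    DMDA_BOUNDARY_NONE,     /* z */
                    DMDA_STENCIL_STAR,
                    -7, -7, -7,             /* global grid size (placeholder) */
                    PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                    1, 1,                   /* dof, stencil width */
                    PETSC_NULL, PETSC_NULL, PETSC_NULL,
                    &da);CHKERRQ(ierr);
```

In ComputeMatrix() the boundary test would then drop the condition for the periodic direction, as described above.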
extract the subdomain
problems from PC(G)ASM.
t;>>>> results with one or
>>>>>>>> two processors. But GASM did not.
>>>>>>>>
>>>>>>> Sorry, I wasn't clear: ASM and GASM produced different solutions in
>>>>>>> the case of two domains per processor?
>>>>>>>
>>>>>>>> I'm solving the Helmholtz equation. Maybe
>>>>>>>> I can prepare a simpler example to show this difference.
>>>>>>>>
>>>>>>> That would be helpful.
>>>>>>> Thanks.
>>>>>>>
>>>>>>> Dmitry.
>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Dmitry.
>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Feb 15, 2012, at 6:46 PM, Dmitry Karpeev wrote:
>>>>>>>>>
>>>>>>>>> You should be able to.
>>>>>>>>> This behavior is the same as in PCASM,
>>>>>>>>> except in GASM the matrices live on subcommunicators.
>>>>>>>>> I am in transit right now, but I can take a closer look in Friday.
>>>>>>>>>
>>>>>>>>> Dmitry
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Feb 15, 2012, at 8:07, Hui Zhang
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> On Feb 15, 2012, at 11:19 AM, Hui Zhang wrote:
>>>>>>>>>
>>>>>>>>> Hi Dmitry,
>>>>>>>>>
>>>>>>>>> thanks a lot! Currently, I'm not using ISColoring. Just comes
>>>>>>>>> another question
>>>>>>>>> on PCGASMSetModifySubMatrices(). The user provided function has
>>>>>>>>> the prototype
>>>>>>>>>
>>>>>>>>> func (PC pc,PetscInt nsub,IS *row,IS *col,Mat *submat,void *ctx);
>>>>>>>>>
>>>>>>>>> I think the coloumns from the parameter 'col' are always the same as
>>>>>>>>> the rows
>>>>>>>>> from the parameter 'row'. Because PCGASMSetLocalSubdomains() only
>>>>>>>>> accepts
>>>>>>>>> index sets but not rows and columns. Has I misunderstood something?
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> As I tested, the row and col are always the same.
>>>>>>>>>
>>>>>>>>> I have a new question. Am I allowed to SetLocalToGlobalMapping()
>>>>>>>>> for the submat's
>>>>>>>>> in the above func()?
>>>>>>>>>
>>>>>>>>> thanks,
>>>>>>>>> Hui
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> thanks,
>>>>>>>>> Hui
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Feb 11, 2012, at 3:36 PM, Dmitry Karpeev wrote:
>>>>>>>>>
>>>>>>>>> Yes, that's right.
>>>>>>>>> There is no good way to help the user assemble the subdomains at
>>>>>>>>> the moment beyond the 2D stuff.
>>>>>>>>> It is expected that they are generated from mesh subdomains.
>>>>>>>>> Each IS does carry the subdomains subcomm.
>>>>>>>>>
>>>>>>>>> There is ISColoringToList() that is supposed to convert a
>>>>>>>>> "coloring" of indices to an array of ISs,
>>>>>>>>> each having the indices with the same color and the subcomm that
>>>>>>>>> supports that color. It is
>>>>>>>>> largely untested, though. You could try using it and give us
>>>>>>>>> feedback on any problems you encounter.
>>>>>>>>>
>>>>>>>>> Dmitry.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Sat, Feb 11, 2012 at 6:06 AM, Hui Zhang
>>>>>>>>> <mike.hui.zhang at hotmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> About PCGASMSetLocalSubdomains(), in the case of one subdomain
>>>>>>>>>> supported by
>>>>>>>>>> multiple processors, shall I always create the arguments 'is[s]'
>>>>>>>>>> and 'is_local[s]'
>>>>>>>>>> in a subcommunicator consisting of processors supporting the
>>>>>>>>>> subdomain 's'?
>>>>>>>>>>
>>>>>>>>>> The source code of PCGASMCreateSubdomains2D() seemingly does so.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Hui
>>>>>>>>>>
>>>>>>>>>>
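The modify-submatrices callback discussed in the thread above can be sketched as follows. This is a minimal illustration, not code from the thread: the function name, the diagonal-shift modification, and the registration via the generic PCSetModifySubMatrices() (the PCGASM-specific name mentioned above may vary between PETSc versions) are all assumptions.

```c
#include <petscpc.h>

/* Hypothetical callback with the prototype quoted in the thread.  As
 * observed above, 'row' and 'col' coincide because subdomains are given
 * to PCGASMSetLocalSubdomains() as plain index sets.  Here each
 * subdomain matrix simply gets a diagonal shift, e.g. to mimic a
 * Robin-type transmission condition. */
static PetscErrorCode ModifySubMatrices(PC pc, PetscInt nsub, IS *row, IS *col,
                                        Mat *submat, void *ctx)
{
  PetscScalar    shift = *(PetscScalar*)ctx;
  PetscInt       i;
  PetscErrorCode ierr;

  for (i = 0; i < nsub; i++) {
    ierr = MatShift(submat[i], shift);CHKERRQ(ierr);
  }
  return 0;
}

/* Registration, somewhere after PCSetType(pc, PCGASM):
 *   ierr = PCSetModifySubMatrices(pc, ModifySubMatrices, &shift);CHKERRQ(ierr);
 */
```

The callback runs each time the subdomain matrices are (re)extracted, so any modification is reapplied automatically when the operator changes.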
) which is a basic implementation of a two-level DD method of this type.
Am 15.05.2012 10:00, schrieb Dave May:
> Ah okay. Thanks for the timings.
>
> Have you monitored the CPU usage when you use umfpack?
> On my machine, it's definitely not running on a single process,
> so I wouldn't consider it a sequential solver.
>
>
>
Yes, the CPU usage is 100% and not more. If
There is another sparse direct solver that PETSc supports: PaStix. You
can try it by configuring with --download-pastix.
Xiangdong
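For reference, a hedged sketch of how PaStix would be enabled and selected. Here `./myapp` is a placeholder for any PETSc program, and the runtime option name follows the PETSc 3.2-era convention (`-pc_factor_mat_solver_package`; later releases renamed it `-pc_factor_mat_solver_type`):

```shell
# Build PETSc with PaStix support:
./configure --download-pastix

# Select it at run time (no code changes needed):
mpiexec -n 4 ./myapp -ksp_type preonly -pc_type lu \
    -pc_factor_mat_solver_package pastix
```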
On Tue, May 15, 2012 at 5:34 AM, Anton Popov wrote:
> On 5/15/12 9:25 AM, Thomas Witkowski wrote:
>>
>> I made some comparisons of using umfpack, superlu, superlu_dist and mumps
>>
...
Am 15.05.2012 09:36, schrieb Dave May:
> I have seen similar behaviour comparing umfpack and superlu_dist,
> however the difference wasn't enormous, possibly umfpack was a factor
> of 1.2-1.4 times faster on 1 - 4 cores.
> What sort of time differences are you observing? Can you post the
> numbers
2, 4, 8 and 16 nodes to make the benchmark. Now, I
>>> wonder that in all cases the sequential umfpack was the fastest one. So
>>> even
>>> with 16 cores, superlu_dist and mumps are slower. Can anybody of you
>>> confirm
>>> this observation? Are there any other parallel direct solvers around
>>> which
>>> are more efficient?
>>>
>>> Thomas
I have seen similar behaviour comparing umfpack and superlu_dist,
however the difference wasn't enormous, possibly umfpack was a factor
of 1.2-1.4 times faster on 1 - 4 cores.
What sort of time differences are you observing? Can you post the
numbers somewhere?
However, umfpack will not work on a di
I made some comparisons of using umfpack, superlu, superlu_dist and
mumps to solve systems with sparse matrices arising from the finite
element method. The sizes of the matrices range from around 5 to more
than 3 million unknowns. I used 1, 2, 4, 8 and 16 nodes to make the
benchmark. Now, I wond
On May 15, 2012, at 5:30 AM, Giacomo Mulas wrote:
> Hi Barry.
>
> On Mon, 14 May 2012, Barry Smith wrote:
>
>> src/snes/examples/tutorials/ex22.c is an example of how one can set
>> options in the program and then have the command line options override them.
>
> Thanks! I am looking at that
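The pattern Barry points to in ex22.c can be sketched roughly as below, using 3.2-era signatures. The option names chosen here are only illustrative; the key point is re-inserting argc/argv after setting programmatic defaults, so that command-line values take precedence:

```c
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Defaults chosen by the program: */
  ierr = PetscOptionsSetValue("-ksp_type", "gmres");CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-pc_type", "asm");CHKERRQ(ierr);

  /* Re-insert the command line so user-supplied options override
     the defaults set above: */
  ierr = PetscOptionsInsert(&argc, &argv, NULL);CHKERRQ(ierr);

  /* ... create the solver, call KSPSetFromOptions(), solve ... */

  ierr = PetscFinalize();
  return ierr;
}
```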
the transient or steady problem.