Tim,
Thanks. But what I want is a routine that bundles it all up: it takes in the
Fortran array of character strings, then allocates and copies them over into the
C pointer form. I could likely figure it out myself, but Blaise is such a
Fortran wizard he can do it optimally in his sleep :-)
For what it's worth, I answered a question about how to pass string arrays
a while ago on StackOverflow:
http://stackoverflow.com/questions/9686532/arrays-of-strings-in-fortran-c-bridges-using-iso-c-binding/9686741#9686741
Tim
- Original Message -
From: "Barry Smith"
To: "For users of t
Blaise
If you provide a Fortran function to convert an array of Fortran strings to
an array of C strings, and tell us how to delete the result, then we'll provide a
PetscOptionsGetEnum() for Fortran.
Barry
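[Editor's note: for reference, a rough sketch of what such a pair of routines
could look like, using iso_c_binding with C's malloc/free so the resulting
char** can be freed cleanly. The module and routine names (fc_strings,
FStringsToCStrings, FreeCStrings) are made up for illustration and are not
PETSc API.]

  module fc_strings
    use, intrinsic :: iso_c_binding
    implicit none
    interface
       function c_malloc(sz) bind(C, name="malloc") result(p)
         import :: c_ptr, c_size_t
         integer(c_size_t), value :: sz
         type(c_ptr) :: p
       end function c_malloc
       subroutine c_free(p) bind(C, name="free")
         import :: c_ptr
         type(c_ptr), value :: p
       end subroutine c_free
    end interface
  contains
    ! Copy each Fortran string (trailing blanks trimmed) into a freshly
    ! malloc'd, null-terminated C string; return the array of char* pointers.
    subroutine FStringsToCStrings(fstrings, cptrs)
      character(len=*), intent(in)          :: fstrings(:)
      type(c_ptr), allocatable, intent(out) :: cptrs(:)
      character(kind=c_char), pointer       :: buf(:)
      integer :: i, j, n
      allocate(cptrs(size(fstrings)))
      do i = 1, size(fstrings)
         n = len_trim(fstrings(i))
         cptrs(i) = c_malloc(int(n + 1, c_size_t))
         call c_f_pointer(cptrs(i), buf, [n + 1])
         do j = 1, n
            buf(j) = achar(iachar(fstrings(i)(j:j)), kind=c_char)
         end do
         buf(n + 1) = c_null_char
      end do
    end subroutine FStringsToCStrings

    ! Release everything FStringsToCStrings allocated.
    subroutine FreeCStrings(cptrs)
      type(c_ptr), allocatable, intent(inout) :: cptrs(:)
      integer :: i
      do i = 1, size(cptrs)
         call c_free(cptrs(i))
      end do
      deallocate(cptrs)
    end subroutine FreeCStrings
  end module fc_strings

A caller would hand the resulting type(c_ptr) array to C as a char** and call
FreeCStrings afterwards; whether the allocation should go through malloc (as in
this sketch) or PetscMalloc is exactly the kind of detail the request above
leaves open.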
On May 2, 2012, at 12:02 PM, Blaise Bourdin wrote:
> Barry,
>
>>> Also, fortran
>>> rval and we build eigenvalue estimates using K = w D A, so we'll produce
>>> exactly the same polynomial as w=1.
>>>
>>> We need better visualization for modes, but if the preconditioned operator
>>> K = P^{-1}A has maximum eigenvalue of 1, the second order Chebyshev
>>> polynomial targeting [0.1, 1.1] is about (1 - 0.25 K) (1 - 0.95 K). Thus,
>>> if P^{-1} perfectly corrects the high energy mode, we will use more than
>>> 0.95 of that correction.
>>>
>>>
>>> Please correct the above reasoning if I've messed up.
>>
>>
>
>
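[Editor's note: a quick check of those numbers, assuming the standard
shifted-and-scaled Chebyshev residual polynomial on the target interval
[a, b] = [0.1, 1.1], normalized so that p_2(0) = 1:

  p_2(\lambda) = \frac{T_2\!\left(\frac{a+b-2\lambda}{b-a}\right)}
                      {T_2\!\left(\frac{a+b}{b-a}\right)},
  \qquad T_2(x) = 2x^2 - 1 .

With a = 0.1, b = 1.1 the numerator vanishes when (1.2 - 2\lambda)^2 = 1/2,
i.e. at \lambda \approx 0.25 and \lambda \approx 0.95, so up to normalization
p_2(K) \propto (K - 0.25\,I)(K - 0.95\,I). The roots are consistent with the
0.25 and 0.95 factors quoted above.]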
, but they also care
about accuracy, which doesn't matter at all to us.
On May 25, 2012, at 4:02 PM, Matthew Knepley wrote:
> On Fri, May 25, 2012 at 4:03 PM, Barry Smith wrote:
>
> Because of the tetgen license we cannot include ctetgen directly in the
> PETSc tarball. Thus we have forked it off into its own repository and it is
> available for users as --download-ctetgen. Developers may choose to hg clone
> the repository directly into petsc-dev/externalpackages if they
> >>> The eigenvalue estimate is obtained by applying the preconditioned
> >>> operator to a few random vectors, then orthogonalizing. The basic
> >>> algorithm is to generate a random matrix X (say 5 or 10 columns), compute
> >>>
> >>> Y = (P^{-1} A)^q X
> >>>
> >>> where q is 1 or 2 or 3, then compute
> >>>
> >>> Q R = Y
> >>>
> >>> and compute the largest singular value of the small matrix R. The
> >>> orthogonalization can be done in one reduction and all the MatMults can
> >>> be done together. Whenever we manage to implement a MatMMult and PCMApply
> >>> or whatever (names inspired by VecMDot), this will provide a very low
> >>> communication way to get the eigenvalue estimates.
> >>>
> >>>
> >>> I want to turn off norms in Chebyshev by default (they are very
> >>> wasteful), but how should I make -mg_levels_ksp_monitor turn them back
> >>> on? I'm already tired of typing -mg_levels_ksp_norm_type unpreconditioned.
> >>
> >>
> >
> >
>
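[Editor's note: for concreteness, a rough serial sketch of that estimator.
This is not PETSc code; the routine name and the apply_K callback are made up,
and LAPACK's dgeqrf/dgesvd stand in for the one-reduction QR and the small SVD.

  subroutine estimate_sigma_max(apply_K, m, p, q, sigma_max)
    implicit none
    interface
       subroutine apply_K(x, y, m, p)   ! y := (P^{-1} A) x, column by column
         integer, intent(in)           :: m, p
         double precision, intent(in)  :: x(m, p)
         double precision, intent(out) :: y(m, p)
       end subroutine apply_K
    end interface
    integer, intent(in)           :: m          ! problem size
    integer, intent(in)           :: p          ! number of random columns (say 5-10)
    integer, intent(in)           :: q          ! operator applications (1, 2 or 3)
    double precision, intent(out) :: sigma_max  ! largest singular value of R (= of Y)
    double precision, allocatable :: X(:,:), Y(:,:), R(:,:), tau(:), work(:), s(:)
    double precision :: u(1,1), vt(1,1)
    integer :: k, i, j, lwork, info

    allocate(X(m,p), Y(m,p), R(p,p), tau(p), s(p))
    lwork = 64 * max(m, p)
    allocate(work(lwork))

    call random_number(X)            ! random block X
    do k = 1, q                      ! Y = (P^{-1} A)^q X; each sweep batches the MatMults
       call apply_K(X, Y, m, p)
       X = Y
    end do

    call dgeqrf(m, p, Y, m, tau, work, lwork, info)  ! Q R = Y
    R = 0.0d0
    do j = 1, p                      ! extract the small p x p triangle R
       do i = 1, j
          R(i, j) = Y(i, j)
       end do
    end do
    call dgesvd('N', 'N', p, p, R, p, s, u, 1, vt, 1, work, lwork, info)
    sigma_max = s(1)                 ! singular values are returned in descending order
    deallocate(X, Y, R, tau, work, s)
  end subroutine estimate_sigma_max

For q > 1 one would take the q-th root of sigma_max to recover an estimate of
the largest eigenvalue of P^{-1} A, and a parallel version would replace dgeqrf
with a TSQR-style factorization, which is presumably where the one-reduction
claim above comes from.]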
Barry,
The trick with Cheby is that it exploits special knowledge: the spectrum of the
error reduction with an additive (diagonal) preconditioner and Richardson -- on
the 5-point stencil of the Laplacian -- is worst (in the part of the spectrum
that the smoother has to worry about) at the high end. 2) it
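[Editor's note: that claim can be made concrete with the usual model problem,
assuming the 5-point Laplacian on a unit square with mesh size h and Jacobi
(diagonal) preconditioning:

  \lambda_{jk}\bigl(D^{-1}A\bigr)
    = 1 - \tfrac{1}{2}\bigl(\cos j\pi h + \cos k\pi h\bigr) \in (0, 2),

so the high-frequency modes sit near the top of the spectrum, where the plain
Richardson factor 1 - \lambda has magnitude close to 1, i.e. almost no damping.
Chebyshev can target exactly that upper part of the spectrum because the
eigenvalue estimate tells it where the high end lies.]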
>>> a different can of worms. So if A is symmetric then maybe try SSOR.
>>>
>>
>> The default SOR is local_symmetric.
>>
>>
>> So it does a forward and backward pass. So it is really two smoothing
>> steps. One should compare it with two additive smoothers.
>>
>>
>
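[Editor's note: to make the "two smoothing steps" remark concrete, in standard
notation (assumed here, not from the thread) with A = D + L + U, forward sweep
M_F = D + L and backward sweep M_B = D + U, one symmetric sweep has error
propagator

  E_{\mathrm{sym}} = \bigl(I - M_B^{-1}A\bigr)\bigl(I - M_F^{-1}A\bigr),

i.e. two multiplicative corrections per application, which is why the fair
comparison is against two steps of an additive (Jacobi-type) smoother. The
local_symmetric variant does this on each process's diagonal block.]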
On May 24, 2012, at 10:35 PM, Jed Brown wrote:
> On Thu, May 24, 2012 at 10:19 PM, Barry Smith wrote:
>
> I think I've fixed PCPre/PostSolve_Eisenstat() to work properly with kspest,
> but with ex19 it seems to be a much worse smoother than SOR (both with Cheby).
> I'm not sure why. With SOR we