On Wed, Sep 13, 2017 at 6:14 PM, David Gross wrote:
Hi Matt,
Thank you for getting back to me. Your answer confirms what I thought in
terms of existing functionality. I think it shouldn't be too hard to adapt a
copy of MatAXPY into a MatAXY that performs Xij = A*Xij*Yij (or without the
A). I could then do the MatNorm of the resulting matrix to get
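The proposed MatAXY does not exist in PETSc; as a dense, pure-Python sketch of the arithmetic being proposed (entrywise scaled product followed by a Frobenius norm, as MatNorm with NORM_FROBENIUS would compute), under the assumption that A is a scalar as in AXPY:

```python
# Dense illustration of the hypothetical MatAXY: X[i][j] = a * X[i][j] * Y[i][j],
# followed by the Frobenius norm. PETSc would store X and Y as Mat objects and
# work on the local rows; this only shows the entrywise arithmetic.
import math

def mat_axy(a, X, Y):
    """Overwrite X entrywise with a * X[i][j] * Y[i][j]."""
    for i, (xrow, yrow) in enumerate(zip(X, Y)):
        X[i] = [a * x * y for x, y in zip(xrow, yrow)]
    return X

def frobenius_norm(X):
    """Square root of the sum of squares of all entries."""
    return math.sqrt(sum(x * x for row in X for x in row))

X = [[1.0, 2.0], [3.0, 4.0]]
Y = [[2.0, 0.5], [1.0, 0.25]]
mat_axy(2.0, X, Y)           # X is now [[4.0, 2.0], [6.0, 2.0]]
print(frobenius_norm(X))     # sqrt(16 + 4 + 36 + 4) = sqrt(60)
```

For sparse PETSc matrices the same loop would only visit the stored nonzeros, which is why reusing the MatAXPY machinery (which already walks matching nonzero patterns) is a plausible starting point.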
Barry Smith writes:
>PETSc folks,
>
> Argonne National Laboratory has recently set up a system that may make it
> possible for the PETSc group at ANL to subcontract particular PETSc
> contribution projects to developers in most of the world. These could be from
>
> On Sep 13, 2017, at 10:56 AM, Federico Golfrè Andreasi
> wrote:
Hi Barry,
I understand and perfectly agree with you that the behavior can change after
a release due to better tuning.
In my case, the difference in the solution is negligible, but the runtime
increases up to +70% (with the same number of ksp_iterations).
So I was wondering if maybe there were
Two iterations for the eigenvalue estimate is too low, and gmres converges
slowly. I'm surprised this does not diverge, or just die, for a Laplacian,
because you need an upper bound on the spectrum. Chebyshev will scale the
estimate up by some safety factor (is it really large now?). Try: -mg_levels_esteig_ksp_max_it
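The point about needing an upper bound can be seen with the textbook Chebyshev iteration (this is an illustration on a diagonal SPD system, not PETSc's implementation; the matrix and bounds are made up):

```python
# Why Chebyshev needs an *upper* bound on the spectrum: with correct eigenvalue
# bounds the iteration converges; with an underestimated lmax the polynomial is
# larger than 1 on the missed part of the spectrum and the iteration blows up.
def chebyshev(diag, b, lmin, lmax, iters):
    """Chebyshev iteration for diag(diag) x = b, assuming spectrum in [lmin, lmax]."""
    theta = (lmax + lmin) / 2.0          # center of the assumed spectrum
    delta = (lmax - lmin) / 2.0          # half-width of the assumed spectrum
    sigma1 = theta / delta
    rho = 1.0 / sigma1
    x = [0.0] * len(b)
    r = list(b)                          # residual b - A*x for x = 0
    d = [ri / theta for ri in r]
    for _ in range(iters):
        x = [xi + di for xi, di in zip(x, d)]
        r = [ri - ai * di for ri, ai, di in zip(r, diag, d)]
        rho_next = 1.0 / (2.0 * sigma1 - rho)
        d = [rho_next * rho * di + (2.0 * rho_next / delta) * ri
             for di, ri in zip(d, r)]
        rho = rho_next
    return max(abs(ri) for ri in r)      # infinity norm of the final residual

diag = [1.0, 2.0, 3.0]                   # true spectrum: {1, 2, 3}
b = [1.0, 1.0, 1.0]
good = chebyshev(diag, b, 1.0, 3.0, 20)  # correct upper bound: residual is tiny
bad = chebyshev(diag, b, 1.0, 1.5, 20)   # lmax underestimated: residual is huge
print(good, bad)
```

This is why the estimator's safety factor scales the estimate up rather than down: overestimating lmax only slows convergence, while underestimating it can make the smoother diverge.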
Federico :
>
> Coarse grid solver -- level ---
>   KSP Object: (mg_levels_0_) 128 MPI processes
>     type: chebyshev
>       Chebyshev: eigenvalue estimates: min = 0.223549, max = 2.45903
>       Chebyshev: eigenvalues estimated using gmres with
There will likely always be slight differences in convergence over that many
releases. Lots of little defaults etc get changed over time as we learn from
users and increase the robustness of the defaults.
So in your case do the differences matter?
1) What is the time to solution in
Dear PETSc users/developers,
I recently switched from PETSc-3.4 to PETSc-3.7 and found that some of the
default settings for the "mg" (multigrid) preconditioner have changed.
We were solving a linear system passing, through the command line, the following
options:
-ksp_type fgmres
-ksp_max_it 10
On 12. Sep 2017, at 21:26, Matthew Knepley wrote:
On Tue, Sep 12, 2017 at 12:52 PM, Maximilian Hartig wrote:
Hello,
in my attempt to use an incremental formulation for elastoplasticity I wanted to create an auxiliary field in which the stresses from