Thanks for your quick answer.
Yes, the Vec I passed to KSPSolve had the wrong block size. Only the rhs
Vec was blocked; the solution Vec was not.
Adding VecSetBlockSize before KSPSolve solved the problem. Thank you very
much.
On Tue, Apr 29, 2014 at 4:06 PM, Jed Brown wrote:
> Song Gao
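For later readers of the archive, a minimal sketch of the fix described above, assuming a blocked rhs Vec b, a solution Vec x created without a block size, and an already configured KSP (names and error handling are illustrative only):

/* Give the solution Vec the same block size as the rhs before KSPSolve. */
PetscInt bs;
VecGetBlockSize(b, &bs);   /* block size of the (blocked) rhs             */
VecSetBlockSize(x, bs);    /* make the solution Vec match                 */
KSPSolve(ksp, b, x);       /* both Vecs now have a consistent block size  */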
Song Gao writes:
> Dear PETSc users,
>
> We are working on a matrix-free solver for the NS equations. We first
> developed the solver in the SNES framework, but because we want to take
> control of the Newton iterations and reuse some of our existing code, we
> stopped using SNES and began developing the …
Dear PETSc users,
We are working on a matrix-free solver for the NS equations. We first developed
the solver in the SNES framework, but because we want to take control of the
Newton iterations and reuse some of our existing code, we stopped using SNES
and began developing the matrix-free solver in the KSP framework …
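The message is cut off here; purely as a generic illustration of "matrix-free in the KSP framework", a sketch using a shell matrix (UserMult, appctx, nlocal, N, rhs and du are placeholder names, not taken from the original post):

/* Matrix-free operator for KSP via a shell matrix. */
extern PetscErrorCode UserMult(Mat A, Vec x, Vec y); /* y = J*x, e.g. by finite differencing the residual */

Mat J;
KSP ksp;
MatCreateShell(PETSC_COMM_WORLD, nlocal, nlocal, N, N, &appctx, &J);
MatShellSetOperation(J, MATOP_MULT, (void (*)(void))UserMult);
KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, J, J);   /* recent PETSc; older releases take an extra MatStructure flag */
KSPSetFromOptions(ksp);
KSPSolve(ksp, rhs, du);       /* one linear solve inside the user's own Newton loop */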
On Tue, Apr 29, 2014 at 2:09 PM, Xiangdong wrote:
> It turns out to be a bug in my FormFunctionLocal(DMDALocalInfo
> *info,PetscScalar **x,PetscScalar **f,AppCtx *user). I forgot to initialize
> the array f. Zeroing the array f solved the problem and gave consistent results.
>
> Just curious, why …
It turns out to be a bug in my FormFunctionLocal(DMDALocalInfo
*info,PetscScalar **x,PetscScalar **f,AppCtx *user). I forgot to initialize
the array f. Zeroing the array f solved the problem and gave consistent results.
Just curious, why doesn't PETSc initialize the array f to zero by default …
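For reference, a minimal sketch (not the poster's code) of a 2-D residual routine that assigns every owned entry of f, which is why PETSc does not need to pre-zero the array; a routine that only accumulates into some entries has to zero them first, as described above. AppCtx stands for the user's own context type.

PetscErrorCode FormFunctionLocal(DMDALocalInfo *info, PetscScalar **x,
                                 PetscScalar **f, AppCtx *user)
{
  PetscInt i, j;
  for (j = info->ys; j < info->ys + info->ym; j++) {
    for (i = info->xs; i < info->xs + info->xm; i++) {
      f[j][i] = 0.0;        /* every owned entry gets a defined value ...        */
      f[j][i] += x[j][i];   /* ... placeholder for the actual residual terms     */
    }
  }
  return 0;
}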
Barry, thank you for the tips. Besides the trust-region method, I have also
tested line search methods. -snes_linesearch_type cp worked robustly; the other
line search types didn't converge, except for basic. I'll spend some more
time checking whether my Jacobian is wrong or whether -snes_mf_operator has
some problem …
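For completeness, the same choice can be made in code instead of on the command line; a small sketch assuming an existing SNES named snes:

SNESLineSearch ls;
SNESGetLineSearch(snes, &ls);
SNESLineSearchSetType(ls, SNESLINESEARCHCP);  /* or SNESLINESEARCHBASIC, SNESLINESEARCHBT, ... */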
On Apr 29, 2014, at 9:19 AM, Norihiro Watanabe wrote:
> Hi Barry,
>
> Is it possible that -snes_mf_operator makes convergence of the linear solves
> slower if the unknowns are poorly scaled in multiphysics problems? I gave up
> checking the Jacobian for the large problem because it takes too long. …
Hi Barry,
Is it possible that -snes_mf_operator makes convergence of the linear solves
slower if the unknowns are poorly scaled in multiphysics problems? I gave up
checking the Jacobian for the large problem because it takes too long.
Instead, I tested it with several different small problems, and no …
On Tue, Apr 29, 2014 at 8:57 AM, Garth N. Wells wrote:
> I’m using VecGhostGetLocalForm to test whether or not a vector has ghost
> values. Is there any overhead associated with calling VecGhostGetLocalForm
> that I should be concerned about?
No, it just checks the vector state (no communication …
I’m using VecGhostGetLocalForm to test whether or not a vector has ghost
values. Is there any overhead associated with calling VecGhostGetLocalForm that
I should be concerned about?
Garth
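A sketch of the kind of test being discussed, assuming (per the man page) that the local form comes back NULL when the Vec is not ghosted; worth verifying against your PETSc version:

Vec       vlocal;
PetscBool has_ghosts;
VecGhostGetLocalForm(v, &vlocal);
has_ghosts = vlocal ? PETSC_TRUE : PETSC_FALSE;  /* no communication, just a state query */
VecGhostRestoreLocalForm(v, &vlocal);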
Hi Anton,
You can do the whole thing much more easily (in my opinion). Since you created
two DMDAs anyway, just do:
- find the first index on every processor using MPI_Scan
- create two global vectors (no ghosts)
- put the proper global indices into the global vectors
- create two local vectors (with ghosts) …
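A rough sketch of the first two steps of this recipe (the function and variable names are illustrative, and the ghosted vectors and scatters are not shown):

PetscErrorCode BuildGlobalIndexVec(MPI_Comm comm, PetscInt nlocal, Vec *gidx)
{
  PetscInt    first = 0, i;
  PetscScalar *a;

  /* first global index owned by this rank, via an exclusive prefix sum */
  MPI_Scan(&nlocal, &first, 1, MPIU_INT, MPI_SUM, comm);
  first -= nlocal;

  /* global (non-ghosted) vector holding each node's global index */
  VecCreateMPI(comm, nlocal, PETSC_DETERMINE, gidx);
  VecGetArray(*gidx, &a);
  for (i = 0; i < nlocal; i++) a[i] = (PetscScalar)(first + i);
  VecRestoreArray(*gidx, &a);

  /* next: create the ghosted local vectors and scatter *gidx into them so
     each rank can read off the global indices of its ghost points */
  return 0;
}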
On 4/29/14 7:07 AM, Anush Krishnan wrote:
Hi all,
I created a DMComposite using two DMDAs (representing the x and y
components of velocity on a 2-D staggered Cartesian grid used for CFD
simulations). DMCompositeGetISLocalToGlobalMappings gives me the
global indices of the elements and ghost cells …
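The question is cut off here; for context only, a sketch of the retrieval call itself, assuming a two-field DMComposite named pack and that the caller is responsible for freeing the returned array:

ISLocalToGlobalMapping *ltog;
PetscInt                i;
DMCompositeGetISLocalToGlobalMappings(pack, &ltog);
/* ltog[0] maps local (ghosted) indices of the first DMDA into the composite
   global numbering; ltog[1] does the same for the second DMDA */
for (i = 0; i < 2; i++) ISLocalToGlobalMappingDestroy(&ltog[i]);
PetscFree(ltog);  /* check the man page for your version on who frees this */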