It does not compromise the checkpointing ability since TSAdjoint does not rely
on TSHistory.
diff --git a/src/ts/trajectory/interface/traj.c b/src/ts/trajectory/interface/traj.c
index 465a52f5cf..8267fd73a3 100644
--- a/src/ts/trajectory/interface/traj.c
+++ b/src/ts/trajectory/interface/traj.c
Setting
export PETSC_OPTIONS="-ts_trajectory_use_history 0"
before starting Python, or
os.environ['PETSC_OPTIONS'] = '-ts_trajectory_use_history 0'
inside your script, is the easiest.
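A minimal sketch of the environment-variable approach: the options string must be in the environment before petsc4py initializes PETSc, so it has to run before the first petsc4py import (which is left commented out here so the snippet stands alone without PETSc installed).

```python
import os

# PETSc reads the PETSC_OPTIONS environment variable at initialization,
# so this must be set before petsc4py is first imported.
os.environ['PETSC_OPTIONS'] = '-ts_trajectory_use_history 0'

# from petsc4py import PETSc  # PETSc would now pick up the option

print(os.environ['PETSC_OPTIONS'])
```

Putting the assignment at the very top of the driver script (above all other imports) is the safest way to guarantee the ordering.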
> On Aug 18, 2020, at 6:12 PM, Salazar De Troya, Miguel via petsc-users wrote:
>
> Would it be possible for you to share a patch, PR or to point where in the
> ts.c file to add those lines?
Would it be possible for you to share a patch, PR or to point where in the ts.c
file to add those lines? I am working from petsc4py; it would be easier for me
to modify my local PETSc repo than to create the petsc4py interface functions
for those two calls.
Thanks
Miguel
From: "Zhang, Hong"
Date
Thank you,
Does this compromise the ability of TSAdjoint to use checkpointing schemes to
save memory?
Miguel
From: "Zhang, Hong"
Date: Tuesday, August 18, 2020 at 3:46 PM
To: "Salazar De Troya, Miguel"
Cc: "Zhang, Hong via petsc-users"
Subject: Re: [petsc-users] Error calling TSSolve more than once
To get rid of this error, you can disable TSHistory with the command line
option -ts_trajectory_use_history 0
or set up your TS with
TSGetTrajectory(ts, &tj);
TSTrajectorySetUseHistory(tj, PETSC_FALSE);
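In context, the two calls above would look something like the sketch below. This is only a sketch against the PETSc C API: the wrapper function name `DisableTrajectoryHistory` is hypothetical, `ts` is assumed to have been created and configured elsewhere, and error checking follows the `ierr`/`CHKERRQ` convention of PETSc releases of that time.

```c
#include <petscts.h>

/* Hypothetical helper: disable TSHistory on the trajectory so that
   calling TSSolve() repeatedly does not trip over the history
   bookkeeping. Assumes `ts` was created and set up elsewhere. */
PetscErrorCode DisableTrajectoryHistory(TS ts)
{
  TSTrajectory   tj;
  PetscErrorCode ierr;

  ierr = TSSetSaveTrajectory(ts);CHKERRQ(ierr);
  ierr = TSGetTrajectory(ts, &tj);CHKERRQ(ierr);
  ierr = TSTrajectorySetUseHistory(tj, PETSC_FALSE);CHKERRQ(ierr);
  return 0;
}
```

Calling this once after the TS is set up, and before the first TSSolve(), mirrors the command-line option `-ts_trajectory_use_history 0`.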
It is a known issue, but not the intended behavior for TSSetSaveTrajectory. I
think TSHisto
Hello,
If I set up my TS with TSSetSaveTrajectory() and then call TSSolve() more than
once, the second time I get this error:
[0] TSSolve() line 4102 in
/Users/salazardetro1/scicomp_libraries/firedrake-debug/firedrake/src/petsc/src/ts/interface/ts.c
[0] TSTrajectorySet() line 73 in
/Users/sala
> On Aug 18, 2020, at 12:14 PM, Adolfo Rodriguez wrote:
>
> I am suspecting that there is a memory leak in the implementation of
> non-linear preconditioners in PETSc.
It is possible; in the configurations we test there are no memory leaks, but
perhaps your configuration does something w
I am suspecting that there is a memory leak in the implementation of
non-linear preconditioners in PETSc.
When I use the following options I see the memory usage increase (using the
windows process monitor) during consecutive time steps.
PetscOptionsSetValue(NULL, "-snes_type", "qn");
PetscOpti
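One way to narrow down whether the growth is a PETSc-side leak is PETSc's built-in allocation tracing, which reports any memory PETSc allocated but never freed when PetscFinalize() runs. This is only a command-line fragment; `./your_app` is a placeholder for the actual executable.

```shell
# -malloc_dump: at PetscFinalize(), print all PETSc allocations
# that were never freed, with their allocation call stacks.
./your_app -snes_type qn -malloc_dump
```

If `-malloc_dump` reports nothing while the process monitor still shows growth, the leak is more likely outside PETSc's allocator.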
On Mon, Aug 17, 2020 at 7:05 PM Fande Kong wrote:
> IIRC, Chaco does not produce an arbitrary number of subdomains. The number
> needs to be like 2^n.
>
No, Chaco can do an arbitrary number.
Thanks,
Matt
> ParMETIS and PTScotch are much better, and they are production-level code.
> If
I thought these were the Laplacian (Poisson and Ampere's law).
Anyway, the coarse grids are very messed up, or at least the eigenvalue
estimates are very messed up. A bad QR solver, used in GAMG's coarse grid
construction, could do that. I've never seen that happen before, but it
would explain this.
Dear Mark, the matrices are not symmetric and not positive definite. I did
try to add:
-ampere_mg_levels_esteig_ksp_monitor_singular_value
-ampere_mg_levels_esteig_ksp_max_it 50
-ampere_mg_levels_esteig_ksp_type gmres
but it still fails to converge.
For the time being it seems that hypre on