On Tue, Jun 18, 2013 at 12:41 PM, Jens Lohne Eftang <[email protected]> wrote:

> Hi all,
>
>
> I'm solving a large 3D linear elasticity (steady) problem. My mesh has
> 8.8 million nodes, so since this is a vector-valued problem there are
> around 26 million unknowns.
>
> The code is basically systems_of_equations_ex6 (except for the mesh
> generation part, and I also do not compute stresses).
>
> I have configured petsc with --download-ml, and I am running my program
> with
>
> mpirun -np 20 ./elasticity-opt -ksp_type cg -pc_type gamg
> -pc_gamg_agg_nsmooths 1 -ksp_monitor -ksp_converged_reason -log_summary
>
> that is, CG as the iterative solver and AMG as preconditioner.
>
> The problem is the huge amount of memory that this solve requires --- I
> have 128 GB of memory, and I run out! Given the large problem size I was
> of course expecting significant memory consumption, but not this much. I
> did try ParallelMesh, but that did not change things.
>
> Am I doing something obviously wrong here?
>

Do you still run out of memory if you run without GAMG?

There may be some GAMG options that control memory consumption; I don't
know too much about it myself.
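As a starting point (these are suggestions, not from the original thread), here is a sketch of how one might test both ideas: first a baseline run without GAMG to isolate the preconditioner's memory footprint, then a GAMG run with knobs that influence coarse-grid size. Check the PETSc manual for your version, since option behavior varies across releases.

```shell
# Baseline: same CG solve, but with a cheap block-Jacobi/ILU preconditioner
# instead of GAMG, to see whether the memory blow-up comes from the AMG setup.
mpirun -np 20 ./elasticity-opt -ksp_type cg \
    -pc_type bjacobi -sub_pc_type ilu \
    -ksp_monitor -ksp_converged_reason -log_summary

# GAMG run with options that can shrink the coarse-grid hierarchy:
#   -pc_gamg_threshold drops weak graph connections (larger values give
#   smaller, cheaper coarse grids at the cost of convergence rate), and
#   -pc_gamg_coarse_eq_limit caps the size of the coarsest problem.
mpirun -np 20 ./elasticity-opt -ksp_type cg \
    -pc_type gamg -pc_gamg_agg_nsmooths 1 \
    -pc_gamg_threshold 0.02 \
    -pc_gamg_coarse_eq_limit 1000 \
    -ksp_view -ksp_monitor -log_summary

# Since PETSc was configured with --download-ml, ML's smoothed aggregation
# is also available as an alternative AMG preconditioner:
mpirun -np 20 ./elasticity-opt -ksp_type cg -pc_type ml \
    -ksp_monitor -ksp_converged_reason -log_summary
```

The `-ksp_view` output reports the number of levels and operator sizes in the AMG hierarchy, which helps pinpoint where the memory is going; the threshold value of 0.02 above is only an illustrative guess to tune from.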

-- 
John
_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users