On 06/18/2013 02:48 PM, Cody Permann wrote:
>
>
>
> On Tue, Jun 18, 2013 at 12:41 PM, Jens Lohne Eftang
> <[email protected]> wrote:
>
>     Hi all,
>
>
>     I'm solving a large, steady 3D linear elasticity problem. My mesh has
>     8.8 million nodes, and since this is a vector-valued problem there are
>     around 26 million unknowns.
>
>     The code is basically systems_of_equations_ex6 (except for the mesh
>     generation part, and I also do not compute stresses).
>
>     I have configured petsc with --download-ml, and I am running my
>     program with
>
>     mpirun -np 20 ./elasticity-opt -ksp_type cg -pc_type gamg
>     -pc_gamg_agg_nsmooths 1 -ksp_monitor -ksp_converged_reason
>     -log_summary
>
>     that is, CG as the iterative solver and AMG as preconditioner.
>
>     The problem is the huge amount of memory that this solve requires:
>     I have 128 GB of memory, and I run out! Due to the large problem size
>     I was of course expecting significant memory consumption, but not this
>     bad. I did try ParallelMesh, but that did not change things.
>
>
>
> Is that total memory across all 20 processes, per node, or per
> process? 128 GB total is not unreasonable for a problem of this size.
> Typically you would spread this out over several nodes, though, so it
> could run. One way to drastically reduce total memory consumption
> would be to implement threading for your Jacobian and residual callbacks.

Hmm, OK. The memory consumption I reported was the total across all 20
processes. I'll look into some solver options and see if I can get it
under my 128 GB threshold.
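For what it's worth, here is a rough back-of-envelope estimate of the
assembled matrix size alone. The nonzeros-per-row figure is an assumption
(plausible for a first-order 3D vector-valued discretization, but
mesh-dependent), as is the 12-bytes-per-nonzero AIJ storage cost:

```python
# Rough memory estimate for the assembled stiffness matrix.
# nnz_per_row is an assumed average for a first-order 3D three-component
# discretization; the real number depends on mesh connectivity.
nodes = 8.8e6
components = 3
dofs = nodes * components                 # ~26.4 million unknowns

nnz_per_row = 80                          # assumed average nonzeros per row
bytes_per_nnz = 12                        # 8-byte value + 4-byte column index (AIJ)
matrix_gb = dofs * nnz_per_row * bytes_per_nnz / 1e9

print(f"unknowns: {dofs:.3g}")
print(f"assembled matrix alone: ~{matrix_gb:.0f} GB")
```

On top of the fine-grid matrix, GAMG builds Galerkin coarse-level
operators and CG keeps several work vectors, so a few multiples of this
figure is expected; pushing toward 128 GB in total across 20 ranks is
therefore not surprising under these assumptions.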

Thanks!

>
> Cody
>
>
>     Am I doing something obviously wrong here?
>
>
>     Thanks,
>     Jens
>

_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users
