On Tue, Jun 18, 2013 at 12:58 PM, Jens Lohne Eftang <[email protected]>
wrote:
>
> On 06/18/2013 02:45 PM, John Peterson wrote:
>
> On Tue, Jun 18, 2013 at 12:41 PM, Jens Lohne Eftang <[email protected]>
> wrote:
>>
>> Hi all,
>>
>>
>> I'm solving a large 3D linear elasticity (steady) problem. My mesh has
>> 8.8 million nodes, so since this is a vector-valued problem there are
>> around 26 million unknowns.
>>
>> The code is basically systems_of_equations_ex6 (except for the mesh
>> generation part, and I also do not compute stresses).
>>
>> I have configured PETSc with --download-ml, and I am running my program
>> with
>>
>> mpirun -np 20 ./elasticity-opt -ksp_type cg -pc_type gamg
>> -pc_gamg_agg_nsmooths 1 -ksp_monitor -ksp_converged_reason -log_summary
>>
>> that is, CG as the iterative solver and AMG as preconditioner.
>>
>> The problem is the huge amount of memory that this solve requires --- I
>> have 128 GB of memory, and I run out! Due to the large problem size I was
>> of course expecting significant memory consumption, but not this bad. I
>> did try ParallelMesh, but that did not change things.
>>
>> Am I doing something obviously wrong here?
>
>
> Do you still run out of memory if you run without GAMG?
>
> There could be some GAMG options that control memory consumption; I don't
> know too much about it.
>
> I am able to solve the problem with -pc_type bjacobi and -sub_pc_type
> icc, but that still uses a lot of memory, around 60 GB. And it also
> required more than 5000 CG iterations, which is why I moved to AMG.


OK, switching from bjacobi -> GAMG caused memory consumption to more than
double?!  I'd definitely look into the GAMG options...
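For what it's worth, a sketch of the sort of GAMG knobs I'd start with (flag
names can differ between PETSc versions, so verify them against the -help
output of your own build; the specific values here are just guesses, not
recommendations):

```shell
# Sketch: GAMG options that may reduce memory by producing sparser
# and smaller coarse-level operators.
#   -pc_gamg_threshold drops weak graph couplings before aggregation
#   -pc_gamg_coarse_eq_limit stops coarsening once levels are small
# Check the exact flag names with: ./elasticity-opt -help | grep gamg
mpirun -np 20 ./elasticity-opt -ksp_type cg -pc_type gamg \
  -pc_gamg_agg_nsmooths 1 \
  -pc_gamg_threshold 0.05 \
  -pc_gamg_coarse_eq_limit 1000 \
  -ksp_monitor -ksp_converged_reason -log_summary
```

-log_summary's memory section should then tell you which stage (setup vs.
solve) and which object class is eating the memory.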

Another possibility is that you could build PETSc with Hypre and run with:

-pc_type hypre -pc_hypre_type boomeramg
-pc_hypre_boomeramg_strong_threshold 0.7
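i.e., something along these lines (a sketch, assuming you build PETSc from
source; keep whatever configure options you already use):

```shell
# Add hypre at configure time, then rebuild PETSc and relink the app.
./configure --download-hypre [...your existing configure options...]
make all

# Run with BoomerAMG; a higher strong threshold (~0.7 is commonly
# suggested for 3D problems) gives sparser coarse grids, which
# usually means lower memory use.
mpirun -np 20 ./elasticity-opt -ksp_type cg \
  -pc_type hypre -pc_hypre_type boomeramg \
  -pc_hypre_boomeramg_strong_threshold 0.7 \
  -ksp_monitor -ksp_converged_reason -log_summary
```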

--
John
_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users
