On Apr 18, 2013, at 12:14 AM, Jed Brown <[email protected]> wrote:

>>   My application is a GLS solver for the compressible Euler equations,
>>   and I am running 2D and 3D simulations at transonic Mach
>>   numbers. The initial mesh in the 3D analysis has ~2 million tetrahedra,
>>   and the h-refinement process increases this to ~8 million
>>   tetrahedra. I am using the standard PETSc GMRES solver with the block
>>   Jacobi preconditioner (the default).
> 
> Are these steady-state computations or finite time step?
> 

I am time marching to steady state. 

>> So, a few questions: 
>> --    I am curious about the experience that others have with such 
>> applications? 
>> --    What is the typical Krylov subspace dimension size that people use for 
>> such problems? 
>> --    Are there other built-in preconditioners in PETSc that could do a 
>> better job? 
> 
> Try increasing the overlap slightly '-pc_type asm -pc_asm_overlap 1'
> (and 2).  Perhaps also combine with '-sub_pc_factor_levels 1' (and 2).
> Let us know if these make a useful difference.
> 
> It's usually preferable to order your unknowns so that the fields are
> interlaced, with all values at a node contiguous.

I will certainly give these a shot tomorrow. Do you know if these require any 
other modification in my code/libMesh, or whether providing the command-line 
options would be enough? 
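For the record, these are PETSc runtime options, so no source changes should be needed: libMesh normally passes its command line through to PetscInitialize. A minimal sketch (the executable name `./my_solver` and the process count are placeholders):

```shell
# Additive Schwarz with one layer of subdomain overlap and ILU(1) on each
# subdomain, as suggested above. -ksp_monitor prints the residual history
# so the two settings can be compared run to run.
mpirun -np 16 ./my_solver -pc_type asm -pc_asm_overlap 1 \
    -sub_pc_factor_levels 1 -ksp_monitor

# Second variant to try: overlap 2 and ILU(2).
mpirun -np 16 ./my_solver -pc_type asm -pc_asm_overlap 2 \
    -sub_pc_factor_levels 2 -ksp_monitor
```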

> 
>> --    Are other custom preconditioners expected to perform better? If so, 
>> which ones? 
> 
> It's hard to beat nonlinear multigrid (FAS) for steady-state problems of
> this variety, but those depend on having an accurate coarse grid
> discretization.  The favorite for that has always been to use
> cell-centered (finite volume) discretizations and use agglomeration to
> produce a coarse operator.  A nonlinear relaxation process is of less
> value for finite element methods because it's relatively more expensive
> than with FV/FD discretizations.
> 
> Field-split preconditioners can also be useful, but most will only work
> properly for low Mach.

I have been thinking of doing something with multigrid, but I am not sure 
whether something along those lines is currently feasible through the 
libMesh<->PETSc interface. I don't have much experience with multigrid yet, 
so I am curious to see what is possible. Do you think that somehow going 
between the original mesh and the refined mesh would provide a reasonable 
form of multigrid? 
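One option that avoids writing any mesh-transfer code at all is PETSc's algebraic multigrid, which builds its own coarse operators by aggregation from the fine-grid matrix. This is only a hedged sketch, not a recommendation from the thread; the executable name is a placeholder and option availability depends on the PETSc version:

```shell
# Smoothed-aggregation AMG (GAMG) as the preconditioner for GMRES.
# No coarse meshes are supplied by the application; PETSc constructs the
# hierarchy algebraically, so the libMesh side needs no changes.
mpirun -np 16 ./my_solver -ksp_type gmres -pc_type gamg -ksp_monitor
```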

Manav





_______________________________________________
Libmesh-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/libmesh-users
