For structured grids you can use geometric MG, which is very good.

Mark
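P.S. A minimal sketch of the options I have in mind, assuming your code builds its 2d grid with a DMDA and hands it to the solver with KSPSetDM so PCMG can construct the grid hierarchy itself (the executable name "myapp" and the refinement/level counts are placeholders, not anything from your setup):

    ./myapp -da_refine 4 -pc_type mg -pc_mg_levels 5 \
        -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor -ksp_view

With -da_refine 4 you end up with 5 grids, so -pc_mg_levels 5 uses all of them, and -ksp_view will show the hierarchy and smoothers you actually got.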
On May 2, 2012, at 12:44 AM, Dave Nystrom wrote:

> Barry Smith writes:
>> On May 1, 2012, at 7:22 PM, Dave Nystrom wrote:
>>>
>>>>> 2. Is anyone on this list sufficiently familiar with agmg and the other
>>>>> PETSc mg solvers to know how to configure the PETSc mg solvers to work
>>>>> more like agmg? It seems that agmg gives better performance than the
>>>>> PETSc mg solvers but I also have issues with agmg including fragility.
>>>>
>>>> Please send a link for information on agmg; I've never heard of that.
>>>
>>> http://homepages.ulb.ac.be/~ynotay/AGMG
>>
>> Ok, this general approach is what the Trilinos ML solver uses (which is
>> fairly good) and what Mark Adams is adding to PETSc in PCGAMG.
>
> Thanks for this info.
>
>> You can start by ./configure PETSc with --download-ml and then run the
>> program with -pc_type mg
>
> I assume you mean "-pc_type ml". I have actually configured and run
> petsc-dev with gamg, ml and hypre-boomeramg. I have just used the default
> parameters because I am not sure which of the parameters would be most
> useful to try and tune. It has seemed that ML has given the best results
> so far but those results in terms of run time performance are not as good
> as agmg.
>
> I'd like to somehow get educated on how to intelligently tune the
> adjustable parameters on these various MG packages, and ML seems a good
> place to start. But I'm not quite sure how to get started since there are
> enough adjustable parameters that exploring parameter space could be a
> challenge.
>
> Perhaps there is some good documentation on this. I guess I should google
> "Trilinos ML documentation" and see what I find. My solves are on a 2d
> structured mesh and I have 8 of them of various flavors.
>
>> You can run with -ksp_view to see the number of levels etc. it ended up
>> with and
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCML.html
>> for some options.
>
> Thanks again.
>
> Dave
>
>> Barry
>>
>>>
>>>> Barry
>>>>
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Dave
>>
>
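Regarding the ML / GAMG tuning question above, a sketch of the kind of side-by-side runs I would start from, with placeholder values (the executable name and the numbers are made up; -pc_ml_maxNlevels and -pc_gamg_agg_nsmooths are just two of the knobs I believe are exposed, and the PCML and PCGAMG manual pages list the rest):

    ./myapp -pc_type ml -pc_ml_maxNlevels 4 \
        -mg_levels_ksp_type chebyshev -mg_levels_pc_type sor \
        -ksp_view -log_summary

    ./myapp -pc_type gamg -pc_gamg_agg_nsmooths 1 \
        -ksp_view -log_summary

-ksp_view shows the hierarchy each package built and -log_summary gives the timing breakdown, which should make the comparison against agmg more concrete.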