On Fri, Jul 27, 2018 at 11:12 AM Jed Brown <j...@jedbrown.org> wrote:
> Pierre Jolivet <pierre.joli...@enseeiht.fr> writes:
>
> > Everything is fine with GAMG I think, please find the (trimmed)
> > -eps_view attached. The problem is that, correct me if I'm wrong, there is
> > no easy way to redistribute data efficiently from within PETSc when using
> > fieldsplit with an unbalanced number of unknowns per field. For the other
> > three fields, the solvers are still behaving somewhat properly. Now if I'd
> > like to optimize this some more, I'd probably need to switch from a
> > fieldsplit to a MatNest, with submatrices from different communicators, so
> > that I don't have all processes handling the pressure space. But this is
> > apparently not allowed.
>
> What if pressure is still on a global communicator, but all the degrees
> of freedom are in a subset? Then MatMult and the like have nothing to
> send or receive on the processes without any dofs. Since there are no
> reductions in PCApply (there are in setup), it should complete
> immediately for all the processes that don't have any dofs, right?

Yes, idle processes should just fall through.
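
For what it's worth, here is a minimal sketch of that layout (not taken from this thread): the pressure operator is created on PETSC_COMM_WORLD, but only the first few ranks get nonzero local sizes, so the remaining ranks have nothing to send or receive in MatMult. The global size N, the number of owning ranks nsub, the "first nsub ranks own everything" split, and the default MATAIJ type are all assumptions made up for illustration.

#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            P;                /* pressure operator, lives on PETSC_COMM_WORLD */
  Vec            x,y;
  PetscMPIInt    rank;
  PetscInt       N    = 1000;      /* global number of pressure dofs (made up) */
  PetscInt       nsub = 2;         /* ranks that actually own pressure dofs (made up) */
  PetscInt       nloc;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);

  /* ranks >= nsub get zero local rows/columns and therefore stay idle in MatMult */
  nloc = (rank < nsub) ? N/nsub + ((rank < N%nsub) ? 1 : 0) : 0;

  ierr = MatCreate(PETSC_COMM_WORLD,&P);CHKERRQ(ierr);
  ierr = MatSetSizes(P,nloc,nloc,N,N);CHKERRQ(ierr);
  ierr = MatSetFromOptions(P);CHKERRQ(ierr);
  ierr = MatSetUp(P);CHKERRQ(ierr);
  /* ... MatSetValues() on the owning ranks only ... */
  ierr = MatAssemblyBegin(P,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(P,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(P,&x,&y);CHKERRQ(ierr);
  ierr = VecSet(x,1.0);CHKERRQ(ierr);
  ierr = MatMult(P,x,y);CHKERRQ(ierr);   /* nothing to send/receive on the empty ranks */

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = MatDestroy(&P);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

In the fieldsplit setting the same idea would apply to the pressure block: everything stays on the parent communicator, but the layout concentrates the pressure rows on a few ranks, so the other ranks fall through MatMult and PCApply as described above.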