> On 27 Jul 2018, at 5:33 PM, Jed Brown <j...@jedbrown.org> wrote:
> 
> Pierre Jolivet <pierre.joli...@enseeiht.fr> writes:
> 
>>> On 27 Jul 2018, at 5:12 PM, Jed Brown <j...@jedbrown.org> wrote:
>>> 
>>> Pierre Jolivet <pierre.joli...@enseeiht.fr> writes:
>>> 
>>>> Everything is fine with GAMG I think, please find the (trimmed) -eps_view 
>>>> attached. The problem is that, correct me if I’m wrong, there is no easy 
>>>> way to redistribute data efficiently from within PETSc when using 
>>>> fieldsplit with unbalanced number of unknowns per field. For the other 
>>>> three fields, the solvers are still behaving reasonably well. Now if I’d 
>>>> like to optimize this some more, I’d probably need to switch from a 
>>>> fieldsplit to a MatNest, with submatrices from different communicators, so 
>>>> that I don’t have all processes handling the pressure space. But this is 
>>>> apparently not allowed.
>>> 
>>> What if pressure is still on a global communicator, but all the degrees
>>> of freedom are in a subset?  Then MatMult and the like have nothing to
>>> send or receive on the processes without any dofs.  Since there are no
>>> reductions in PCApply (there are in setup), it should complete
>>> immediately for all the processes that don't have any dofs, right?
>> 
>> My PC is PCKSP, with GMRES underneath, so there are reductions on the global 
>> communicator in PCApply.
> 
> Why PCKSP in the pressure solve?

I need to solve (on the pressure space) Ax = b with A^-1 = B^-1 + C^-1.
I use an additive PCCOMPOSITE with two PCKSP: B is a mass matrix (PCJACOBI, no 
problem), C is a shifted Laplacian (PCGAMG, problem…).
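For concreteness, that setup roughly corresponds to the following options (a sketch only, assuming the usual sub_N_/ksp_ prefixes of a standalone PCCOMPOSITE; in the actual fieldsplit run each option would carry the pressure block's prefix instead of being top-level):

  -pc_type composite
  -pc_composite_type additive
  -pc_composite_pcs ksp,ksp
  -sub_0_ksp_ksp_type gmres
  -sub_0_ksp_pc_type jacobi   # B: mass matrix
  -sub_1_ksp_ksp_type gmres
  -sub_1_ksp_pc_type gamg     # C: shifted Laplacian

The reductions I mentioned come from the two inner GMRES solves, which run on the global communicator even though most processes own no pressure dofs.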
