I would first run with -ksp_monitor_true_residual -ksp_converged_reason to 
make sure that those "very fast" cases are actually converging in those runs. 
Also use -ksp_view to see what the GAMG parameters are, and the -info 
option to have it print details on the solution process.
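
For example (the executable name here is just a placeholder for your own binary and arguments):

  # "./your_app" is a placeholder for your own executable
  mpiexec -n 4 ./your_app -ksp_monitor_true_residual -ksp_converged_reason -ksp_view -info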

   Barry



> On Nov 29, 2019, at 4:14 PM, Felipe Giacomelli <fe.wall...@gmail.com> wrote:
> 
> Hello,
> 
> I'm trying to solve Biot's poroelasticity (Cryer's sphere problem) through a 
> fully coupled scheme. Thus, the solution of a single linear system yields 
> both displacement and pressure fields,
> 
>   | K      L  | | u |   | b_u |
>   | Q  A + H  | | p | = | b_p |.
> 
> The linear system is asymmetric, given that the discrete equations were 
> obtained through the Element-based Finite Volume Method (EbFVM). An 
> unstructured tetrahedral grid is utilised; it has about 10000 nodal points 
> (neither too coarse nor too refined). Therefore, GMRES and GAMG are employed 
> to solve it.
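> 
> For reference, a minimal self-contained sketch of this solver setup (NOT the 
> actual application code: the assembly is replaced by a placeholder asymmetric 
> 1D operator standing in for the coupled poroelastic matrix, and all names are 
> illustrative):
> 
>   #include <petscksp.h>
> 
>   int main(int argc, char **argv)
>   {
>     Mat            A;    /* placeholder for the coupled poroelastic matrix */
>     Vec            x, b; /* solution (u,p) and right-hand side (b_u,b_p)   */
>     KSP            ksp;
>     PC             pc;
>     PetscInt       i, rstart, rend, n = 1000;
>     PetscErrorCode ierr;
> 
>     ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
> 
>     /* Parallel matrix; PETSc splits the n rows across the ranks. */
>     ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
>     ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
>     ierr = MatSetFromOptions(A);CHKERRQ(ierr);
>     ierr = MatSetUp(A);CHKERRQ(ierr);
>     ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
>     for (i = rstart; i < rend; i++) {
>       /* Asymmetric stencil (-1.2, 2, -0.8), mimicking the EbFVM asymmetry. */
>       if (i > 0)     { ierr = MatSetValue(A, i, i - 1, -1.2, INSERT_VALUES);CHKERRQ(ierr); }
>       ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
>       if (i < n - 1) { ierr = MatSetValue(A, i, i + 1, -0.8, INSERT_VALUES);CHKERRQ(ierr); }
>     }
>     ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>     ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> 
>     ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
>     ierr = VecSet(b, 1.0);CHKERRQ(ierr);
> 
>     /* GMRES preconditioned with GAMG, as in the runs reported below. */
>     ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
>     ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
>     ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
>     ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
>     ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);
>     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* -ksp_view, -info, etc. */
>     ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
> 
>     ierr = VecDestroy(&x);CHKERRQ(ierr);
>     ierr = VecDestroy(&b);CHKERRQ(ierr);
>     ierr = MatDestroy(&A);CHKERRQ(ierr);
>     ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
>     ierr = PetscFinalize();
>     return ierr;
>   }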
> 
> Furthermore, the program was parallelised through a Domain Decomposition 
> Method, so each processor works only on its own subdomain.
> 
> So far, so good. For a given set of poroelastic properties (which are 
> constant throughout time and space), the speedup increases as more processors 
> are utilised:
> 
>   coupling intensity: 7.51e-01
> 
>     proc    solve time [s]
>        1        314.23
>        2        171.65
>        3        143.21
>        4        149.26 (> 143.21, but ok)
> 
> However, when the problem is made MORE coupled (different poroelastic 
> properties), a strange behavior is observed:
> 
>   coupling intensity: 2.29e+01
> 
>     proc    solve time [s]
>        1      28909.35
>        2        192.39
>        3        181.29
>        4      14463.63
> 
> Recalling that GMRES and GAMG are used, KSP takes about 4300 iterations to 
> converge when 1 processor is employed, whereas for 2 processors it takes only 
> around 30 iterations to reach convergence. This explains the difference 
> between the solution times.
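> 
> (The per-run counts can be read back after the solve with 
> KSPGetIterationNumber; a minimal snippet, reusing the "ksp" from the sketch 
> above:
> 
>   PetscInt its;
>   ierr = KSPGetIterationNumber(ksp, &its);CHKERRQ(ierr); /* "ksp" as above */
>   ierr = PetscPrintf(PETSC_COMM_WORLD, "KSP iterations: %D\n", its);CHKERRQ(ierr);
> 
> or simply add -ksp_converged_reason / -ksp_monitor on the command line.)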
> 
> When the coupling is increased even MORE, everything goes as expected:
> 
>   coupling intensity: 4.63e+01
> 
>     proc    solve time [s]
>        1        229.26
>        2        146.04
>        3        121.49
>        4        107.80
> 
> Because of this, I ask:
> 
> * What may be the source of this behavior? Can it be predicted?
> * How can I remedy this situation?
> 
> Finally, are there better solver-pc choices for coupled poroelasticity?
> 
> Thank you,
> Felipe
