Thanks for the feedback, everyone. My null space is simply the constant vectors, 
and I am calling MatSetNullSpace based on code I had been testing with other 
PC/KSP options, which grew out of the examples. GMRES/BoomerAMG seems to be the 
best fit for my problem, but I stumbled upon that combination through 
experimentation (hurray for KSP/PCSetFromOptions!) rather than through a more 
informed analysis. I am still learning about both the GMRES algorithm and 
multigrid in general, so I asked this question in case it turned out that I 
could improve my working code.

I'd be interested in that interface to HYPRE_BoomerAMGSetInterpVectors, if it 
makes it into PETSc.

--Matt

--------------------------------------------------------------
Matthew Young
Graduate Student
Boston University Dept. of Astronomy
--------------------------------------------------------------


________________________________________
From: [email protected] [[email protected]] on 
behalf of Barry Smith [[email protected]]
Sent: Thursday, July 23, 2015 6:13 AM
To: Lawrence Mitchell
Cc: [email protected]
Subject: Re: [petsc-users] Null space/Near null space

> On Jul 23, 2015, at 3:42 AM, Lawrence Mitchell 
> <[email protected]> wrote:
>
> On 22/07/15 23:57, Barry Smith wrote:
>>
>> If your matrix has a null space you should always be setting
>> MatSetNullSpace.
>>
>> If you know the near null space then you should also always set it
>> (it cannot do any harm) because some preconditioners can take
>> advantage of it.
>>
>> I do not think PETSc currently has code that transfers near null
>> space information to hypre; I do not know if hypre has any
>> algorithms that can take advantage of it. Only some AMG multigrid
>> methods can utilize near null space information.
>
> FWIW, the HYPRE manual suggests yes:
>
> If the block size is set (PETSc already does this) and nodal systems
> coarsening is requested (PETSc has an option for this) then one can
> provide the near null modes with HYPRE_BoomerAMGSetInterpVectors,
> passing an array of Vecs (PETSc does not do this last step).

   Thanks. So someone should make this small code addition, plus a test case 
that demonstrates it works, and wrap that up in a pull request.
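
   Roughly, the pieces are: the user attaches the modes with 
MatSetNearNullSpace() and the hypre interface would hand them to 
HYPRE_BoomerAMGSetInterpVectors(). A sketch of the user-facing half is below 
(nmodes and modes[] are placeholders, and the Vec to HYPRE_ParVector 
conversion inside the PC setup is glossed over):

#include <petscmat.h>

/* Sketch: attach near-null-space modes to the matrix so a preconditioner
   that understands them can use them. nmodes/modes[] are whatever modes
   the user has (placeholders here). */
static PetscErrorCode AttachNearNullSpace(Mat A, PetscInt nmodes, Vec modes[])
{
  MatNullSpace   nearnsp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_FALSE, nmodes, modes, &nearnsp);CHKERRQ(ierr);
  ierr = MatSetNearNullSpace(A, nearnsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nearnsp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Inside the hypre PC setup, those modes would eventually be handed to hypre
   (after converting each Vec to a HYPRE_ParVector) via
     HYPRE_BoomerAMGSetInterpVectors(hsolver, nmodes, interp_vectors);       */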

  Barry

>
> Lawrence
