you back an actual submatrix; otherwise it gives a view.
This would do
everything that we currently do without this horrible MatNest
interface bubbling to the top.
Matt
> On Apr 10, 2019, at 11:49 AM, Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
>
> Thank you for your answer, Matt. In the MWE example attached [...] it may be
> that the moment to use them is from the beginning: once all the code is
> developed, it is very hard to switch matrix types.
Regards,
Manuel
---
On 4/10/19 5:41 PM, Matthew Knepley wrote:
On Wed, Apr 10, 2019 at 11:29 AM Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hello,
I am trying to solve a system whose matrix is of type MatNest. If I
don't use KSPSetUp(), everything is fine. However, if I use that
routine, I get the following error:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
OK, thank you Matt.
Manuel
---
On 3/25/19 6:27 PM, Matthew Knepley wrote:
On Mon, Mar 25, 2019 at 8:07 AM Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hello,
I would like to solve an N*N block system (with N>2) in which some of the
diagonal blocks are null. My system matrix is defined as a MatNest. As
N>2, I can't use "pc_fieldsplit_type schur" nor
"pc_fieldsplit_detect_saddle_point". The other algorithms ("additive",
"multiplicative" and
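As a hedged sketch of the options Manuel refers to, the fieldsplit variants that remain available for N>2 could be selected at run time roughly as follows (this is a config fragment; the executable name and the choice of three splits are hypothetical):

```shell
# Hypothetical PETSc application run with a 3-way fieldsplit.
# "additive" (block Jacobi) and "multiplicative" (block Gauss-Seidel)
# accept any number of splits, unlike "schur", which needs exactly two.
./my_app -pc_type fieldsplit \
         -pc_fieldsplit_type multiplicative \
         -fieldsplit_0_pc_type ilu \
         -fieldsplit_1_pc_type ilu \
         -fieldsplit_2_pc_type ilu
```

As the thread notes, any inner preconditioner that tries to factor a null diagonal block will still fail; the options above only illustrate the flag names, not a fix for the null-block problem.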
Matt, this bug looks unrelated to my VecRestoreArrayRead_Nest fix.
--Junchao Zhang
On Wed, Mar 13, 2019 at 9:05 AM Matthew Knepley <knep...@gmail.com> wrote:
On Wed, Mar 13, 2019 at 9:44 AM Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
On 3/13/19 2:28 PM, Jed Brown wrote:
Is there any output if you run with -malloc_dump?
Manuel Colera Rico via petsc-users writes:
Hi, Junchao,
I have installed the newest version of PETSc and it works fine. I just
get the following memory leak warning:
Direct leak of 28608 byte(s) in 12 object(s) allocated from [...]
[...] individually.
Manuel
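Jed's suggestion is a runtime option rather than a code change; as a config-fragment sketch (the executable name is hypothetical), the run would look like:

```shell
# -malloc_dump makes PETSc list any memory it allocated but never freed
# when PetscFinalize() runs; the report goes to stdout by default.
./my_app -malloc_dump
```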
---
VecRestoreArrayRead_Nest. Could you try the master branch of
PETSc to see if it fixes your problem?
Thanks.
--Junchao Zhang
On Mon, Mar 11, 2019 at 6:56 AM Manuel Colera Rico via petsc-users <petsc-users@mcs.anl.gov> wrote:
Hello,
I need to solve a 2*2 block linear system. The matrices A_00, A_01,
A_10, A_11 are constructed separately via MatCreateSeqAIJWithArrays and
MatCreateSeqSBAIJWithArrays. Then, I construct the full system matrix
with MatCreateNest, and use MatNestGetISs and PCFieldSplitSetIS to set
up
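The setup Manuel describes might look like the following minimal sketch. It is not his actual code: the block sizes and values are made up, the off-diagonal blocks are left NULL (which MatNest permits) instead of being built with MatCreateSeqAIJWithArrays/MatCreateSeqSBAIJWithArrays, and error checking uses the PetscCall macro from recent PETSc releases.

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Two small diagonal blocks stand in for the real A_00 and A_11. */
  Mat A00, A11, A;
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 1, NULL, &A00));
  PetscCall(MatCreateSeqAIJ(PETSC_COMM_SELF, 2, 2, 1, NULL, &A11));
  for (PetscInt i = 0; i < 2; i++) {
    PetscCall(MatSetValue(A00, i, i, 2.0, INSERT_VALUES));
    PetscCall(MatSetValue(A11, i, i, 3.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A00, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A00, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyBegin(A11, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A11, MAT_FINAL_ASSEMBLY));

  /* Assemble the 2x2 nest; blocks are given in row-major order. */
  Mat blocks[4] = {A00, NULL, NULL, A11};
  PetscCall(MatCreateNest(PETSC_COMM_SELF, 2, NULL, 2, NULL, blocks, &A));

  /* Recover the index sets MatNest built for each block row/column
     and hand them to the fieldsplit preconditioner, as in the thread. */
  IS rows[2], cols[2];
  PetscCall(MatNestGetISs(A, rows, cols));

  KSP ksp;
  PC  pc;
  PetscCall(KSPCreate(PETSC_COMM_SELF, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  PetscCall(PCFieldSplitSetIS(pc, "0", rows[0]));
  PetscCall(PCFieldSplitSetIS(pc, "1", rows[1]));
  PetscCall(KSPSetFromOptions(ksp));

  Vec b, x;
  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(MatDestroy(&A));
  PetscCall(MatDestroy(&A00));
  PetscCall(MatDestroy(&A11));
  PetscCall(PetscFinalize());
  return 0;
}
```

Naming the splits "0" and "1" via PCFieldSplitSetIS matches the `-fieldsplit_0_*`/`-fieldsplit_1_*` option prefixes, so the inner solvers can still be configured from the command line.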