Barry,

Thank you very much.

DMDASetBlockFills() needs two input arrays specifying the sparsity pattern. In my 
case, I have 26 components at each grid point. Is it correct that I need two 
arrays of 26x26 elements, with value 1 for coupling and 0 for no coupling? Also, 
how do I determine whether a value should be 1 or 0? Each component involves a 
7-point stencil, (i-1, j, k), (i, j, k), (i+1, j, k), (i, j-1, k), (i, j+1, k), 
(i, j, k-1), and (i, j, k+1), and couples with several other components.
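
A minimal sketch of how the two fill arrays might be set up, assuming a purely 
hypothetical coupling pattern for component 0 (the real pattern has to come from 
the stiff functions of the model); the helper name is illustrative only:

    #include <petscdmda.h>

    #define NC 26   /* components per grid point */

    /* Hypothetical helper: builds the two 26x26 fill arrays and attaches
       them to the DMDA. dfill[r*NC+c] = 1 if the stiff residual of
       component r at (i,j,k) depends on component c at the SAME grid
       point; ofill[r*NC+c] = 1 if it depends on component c at one of
       the neighboring stencil points (i-1,j,k), (i+1,j,k), (i,j-1,k),
       (i,j+1,k), (i,j,k-1), (i,j,k+1).                                  */
    static PetscErrorCode SetReducedCoupling(DM da)
    {
      PetscInt *dfill, *ofill;

      PetscFunctionBeginUser;
      PetscCall(PetscCalloc2(NC*NC, &dfill, NC*NC, &ofill));

      dfill[0*NC + 0] = 1;    /* component 0 depends on itself            */
      dfill[0*NC + 1] = 1;    /* hypothetical coupling 0 <- 1             */
      dfill[0*NC + 5] = 1;    /* hypothetical coupling 0 <- 5             */
      /* ... one row per component, taken from the equations ...          */

      ofill[0*NC + 0] = 1;    /* e.g. transport of component 0 itself     */
      /* ...                                                               */

      PetscCall(DMDASetBlockFills(da, dfill, ofill)); /* before DMCreateMatrix() */
      PetscCall(PetscFree2(dfill, ofill));            /* the fills are copied    */
      PetscFunctionReturn(0);
    }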

Jiannan




From: Barry Smith <bsm...@petsc.dev>
Sent: Wednesday, September 7, 2022 11:53 AM
To: Tu, Jiannan <jiannan...@uml.edu>
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Using matrix-free with KSP



DMDASetBlockFills() or DMDASetBlockFillsSparse() (they are just different ways 
of providing the same information) will help you here enormously; your type of 
case is exactly what they were designed for.

 Barry



On Sep 7, 2022, at 10:47 AM, Tu, Jiannan <jiannan...@uml.edu> wrote:

Barry and Hong,

Thank you.

There are 26 components at each grid point, and they are not fully coupled in 
terms of the stiff functions; the mutual coupling involves only about 6 
components.

I would prefer not to use matrix-free, since the Jacobian is not difficult to 
calculate and has at most 10 non-zeros in each row. I'll try DMDASetBlockFills() 
or DMDASetBlockFillsSparse() and see how much they can reduce the memory usage.

Jiannan

From: Barry Smith <bsm...@petsc.dev>
Sent: Tuesday, September 6, 2022 11:33 PM
To: Tu, Jiannan <jiannan...@uml.edu>
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Using matrix-free with KSP





On Sep 6, 2022, at 11:00 PM, Tu, Jiannan <jiannan...@uml.edu> wrote:

I am using TS IMEX to solve a large DAE system. The DAE is obtained by applying 
a finite-volume (FV) method to the 3-D multi-species ion/neutral fluid equations 
together with the magnetic induction equation. The Jacobian for the stiff part 
is formed using MatSetValuesStencil(). The Jacobian matrix is very sparse, with 
no more than 10 non-zeros in each row. MatSetValuesStencil() requires a 
local-to-global mapping (an ISLocalToGlobalMapping). Could you please instruct 
me on how to set up and use the local-to-global mapping?

   The DMDA automatically sets up the ISLocalToGlobalMapping, so you should not 
need to set it up yourself.
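
A minimal sketch of what this looks like in practice, with placeholder component 
indices and values: the matrix returned by DMCreateMatrix() already carries the 
mapping, so MatSetValuesStencil() can be called directly with global grid 
indices inside the stiff-Jacobian callback.

    /* Inside the stiff-Jacobian callback; da is the DMDA and J comes from
       DMCreateMatrix(da,&J). Only interior points of a single component are
       shown; boundary rows and the remaining components/stencil points
       follow the same pattern.                                            */
    MatStencil  row, col[7];
    PetscScalar v[7];
    PetscInt    xs, ys, zs, xm, ym, zm, i, j, k, n;

    PetscCall(DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm));
    for (k = zs; k < zs + zm; k++) {
      for (j = ys; j < ys + ym; j++) {
        for (i = xs; i < xs + xm; i++) {
          row.i = i; row.j = j; row.k = k; row.c = 0;  /* equation of component 0 */
          n = 0;
          col[n].i = i;   col[n].j = j; col[n].k = k; col[n].c = 0; v[n++] = 1.0;  /* placeholder dF0/du0 */
          col[n].i = i-1; col[n].j = j; col[n].k = k; col[n].c = 0; v[n++] = -0.5; /* placeholder west    */
          col[n].i = i+1; col[n].j = j; col[n].k = k; col[n].c = 0; v[n++] = -0.5; /* placeholder east    */
          /* ... remaining stencil points and coupled components ... */
          PetscCall(MatSetValuesStencil(J, 1, &row, n, col, v, INSERT_VALUES));
        }
      }
    }
    PetscCall(MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY));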

I also tried using DMCreateMatrix() to create the Jacobian. While the 
local-to-global mapping is then not necessary, the matrix takes too much memory 
and requires 64-bit indices. I would prefer to take advantage of the sparsity of 
the Jacobian and preallocate the matrix to use as little memory as possible, so 
that the code can run on a multi-core desktop.

  If using matrix-free does not work for you because the linear solves do not 
converge, or converge too slowly, then you might be able to decrease the memory 
used by the matrix. DMCreateMatrix() does take advantage of sparsity and tries 
to preallocate only what it needs. Often what it preallocates is the best 
possible, but for multicomponent problems it has to assume there is full 
coupling among all the degrees of freedom that live at each grid point. How many 
components live at each grid point in your model, and are they all coupled to 
each other in the equations? If they are not fully coupled, you can use either 
DMDASetBlockFills() or DMDASetBlockFillsSparse() to indicate the reduced 
coupling that actually exists for your model.
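
A rough estimate, not from the thread: with 26 components and a 7-point star 
stencil, full coupling preallocates 7 x 26 = 182 nonzeros per row, while only 
about 10 are actually used, so the reduced pattern could shrink the matrix 
memory by roughly a factor of 18. A minimal sketch of the call order, with the 
grid sizes, boundary types, and the dfill/ofill arrays as placeholders:

    DM  da;
    Mat J;

    /* Grid sizes and boundary types below are placeholders */
    PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                           DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           DMDA_STENCIL_STAR, 64, 64, 64,
                           PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                           26, 1, NULL, NULL, NULL, &da));  /* 26 dof, stencil width 1 */
    PetscCall(DMSetFromOptions(da));
    PetscCall(DMSetUp(da));

    /* dfill/ofill are the 26x26 0/1 arrays describing the actual coupling;
       they must be set before the matrix is created so the preallocation
       uses the reduced pattern.                                           */
    PetscCall(DMDASetBlockFills(da, dfill, ofill));
    PetscCall(DMCreateMatrix(da, &J));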

  Barry



Thank you very much for your advice.

Jiannan

