Barry,

I think I understand how to set values for the coupling within a grid point, but I 
am not clear about the way to set values for the inter-point coupling. When a 
component couples to itself at the above, below, right, left, front, and back grid 
points, the diagonal of the second array of DMDASetBlockFills() is set to 1. What 
if more grid points are involved? How does DMDA know where to allocate memory? In 
my case, a component couples to itself at the other end of the grid because of the 
periodic boundary condition. This far-away coupling produces the error message 
"Argument out of range. Inserting a new nonzero at global row/column (26, 520526) 
into matrix". The column index 520526 corresponds to the other end of the grid in 
the azimuthal direction.

Thanks,
Jiannan



From: Barry Smith <bsm...@petsc.dev>
Sent: Saturday, September 10, 2022 1:10 PM
To: Tu, Jiannan <jiannan...@uml.edu>
Cc: petsc-users <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP

CAUTION: This email was sent from outside the UMass Lowell network.


  If you are using DMCreateMatrix(), then MatMPIAIJSetPreallocation() is not 
needed; it is handled automatically by the DM.

  It sounds like the values you set in DMDASetBlockFills() did not include all 
the coupling, hence the preallocation was wrong, hence the time to insert values 
was too long.

  Barry



On Sep 10, 2022, at 8:16 AM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

Just found that MatMPIAIJSetPreallocation() is needed. Now Jacobian insertion and 
assembly are fast.

From: Barry Smith <bsm...@petsc.dev<mailto:bsm...@petsc.dev>>
Sent: Thursday, September 8, 2022 9:53 PM
To: Tu, Jiannan <jiannan...@uml.edu<mailto:jiannan...@uml.edu>>
Subject: Re: [petsc-users] Using matrix-free with KSP






On Sep 8, 2022, at 8:34 PM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

Barry,

Thank you.

I set up these two arrays according to the non-zero pattern of the Jacobian. 
DMDASetBlockFills() is then called after the call to DMCreateMatrix(da, &A).

  Absolutely not! It must be called before DMCreateMatrix() so that the DM 
knows how to construct exactly the matrix you need.
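A sketch of the required order, assuming a working PETSc installation, an existing 3-D DMDA named da, and 26x26 dfill/ofill arrays already filled in (the function name BuildJacobian is hypothetical):

```c
#include <petscdmda.h>

/* Sketch only: the fills must be registered with the DMDA BEFORE the
   matrix is created, so preallocation can use them. */
PetscErrorCode BuildJacobian(DM da, PetscInt *dfill, PetscInt *ofill, Mat *J)
{
  PetscFunctionBeginUser;
  /* Tell the DMDA the reduced coupling first ... */
  PetscCall(DMDASetBlockFills(da, dfill, ofill));
  /* ... then create the matrix, which is preallocated from the fills. */
  PetscCall(DMCreateMatrix(da, J));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```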




Nevertheless, the program execution is still killed with exit code 9, which I 
assume is due to running out of memory. My desktop machine has 32 GB RAM.

I ran the code with a smaller mesh, 100x45x90, and the result is the same. With 
that many grid points, 26 unknowns, and up to 10 nonzeros in each matrix row, 
the estimated memory to store 100x45x90x26x10 elements should be less than 1 GB 
(double precision). I am wondering why the matrix still takes so much memory. 
Maybe I have to use matrix-free?

Thank you,
Jiannan

From: Barry Smith <bsm...@petsc.dev<mailto:bsm...@petsc.dev>>
Sent: Thursday, September 8, 2022 4:19 PM
To: Tu, Jiannan <jiannan...@uml.edu<mailto:jiannan...@uml.edu>>
Subject: Re: [petsc-users] Using matrix-free with KSP







On Sep 8, 2022, at 3:39 PM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

Barry,

Thank you very much for the detailed description.

So the first array, dfill, is for the coupling within a grid point and the second 
one, ofill, is for the coupling between grid points?

In the case of inter-point coupling, if a component is only coupled to itself, say 
x in your example is coupled to itself above, below, right, left, front, and 
back, how do I set the values for the second array?

Then the off-diagonal one would just have values on its diagonal.




Jiannan

From: Barry Smith<mailto:bsm...@petsc.dev>
Sent: Thursday, September 8, 2022 2:12 PM
To: Tu, Jiannan<mailto:jiannan...@uml.edu>
Cc: petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP








On Sep 8, 2022, at 1:47 PM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

Barry,

Thank you very much.

DMDASetBlockFills() needs two input arrays specifying the sparsity pattern. In my 
case, I have 26 components at each grid point. Is it correct that I need two 
arrays of 26x26 elements, with value 1 for coupling and 0 for no coupling?

   Exactly





Also, how do I determine whether a value should be 1 or 0?

  That is problem dependent. With paper and pencil you can determine, for your 
stencil, which components are coupled with other components at the same grid 
point and which components are coupled with components at neighboring grid 
points.

  Note the second array contains information about all the neighboring grid 
points: above, below, right, left, front, and back. Normally the coupling is 
symmetric, so the nonzero structures of those 6 blocks are identical. If, for 
your particular problem, the nonzero structures of the six are not identical, 
use the union of the nonzero structures of all six.


  2d example: say you have x, y, z at each grid point and, within a grid point, 
x is coupled to y and y is coupled to x; then the first array is

[1 1 0; 1 1 0; 0 0 1]; the zeros are because x is not coupled to z and y is 
not coupled to z.

For the coupling to neighboring grid points, say x is coupled to z and itself, 
z is coupled to x and itself, and y is not coupled to anything; then the second 
array is

[1 0 1; 0 0 0; 1 0 1]; the middle row is zero because y is not coupled to 
anything, and the zeros in the first and last rows are because x and z are not 
coupled to y.







Each component involves a 7-point stencil, (i-1, j, k), (i, j, k), (i+1, j, k), 
(i, j-1, k), (i, j+1, k), (i, j, k-1), and (i, j, k+1), and couples with several 
other components.

Jiannan




From: Barry Smith<mailto:bsm...@petsc.dev>
Sent: Wednesday, September 7, 2022 11:53 AM
To: Tu, Jiannan<mailto:jiannan...@uml.edu>
Cc: petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] Using matrix-free with KSP



DMDASetBlockFills() or DMDASetBlockFillsSparse() (they are just different ways 
of providing the same information) will help you here enormously; your type of 
case is exactly what they were designed for.

 Barry







On Sep 7, 2022, at 10:47 AM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

Barry and Hong,

Thank you.

There are 26 components at each grid point and they are not fully coupled in 
terms of the stiff functions. Mutual coupling is among only about 6 components.

I would prefer not to use matrix-free since the Jacobian is not difficult to 
calculate and has only up to 10 nonzeros in each row. I'll try DMDASetBlockFills() 
or DMDASetBlockFillsSparse() and see how much they can reduce the memory usage.

Jiannan
From: Barry Smith <bsm...@petsc.dev<mailto:bsm...@petsc.dev>>
Sent: Tuesday, September 6, 2022 11:33 PM
To: Tu, Jiannan <jiannan...@uml.edu<mailto:jiannan...@uml.edu>>
Cc: petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov> 
<petsc-users@mcs.anl.gov<mailto:petsc-users@mcs.anl.gov>>
Subject: Re: [petsc-users] Using matrix-free with KSP









On Sep 6, 2022, at 11:00 PM, Tu, Jiannan 
<jiannan...@uml.edu<mailto:jiannan...@uml.edu>> wrote:

I am using TS IMEX to solve a large DAE system. The DAE is obtained by applying 
a finite-volume method to the 3-D multi-species ion/neutral fluid equations with 
the magnetic induction equation. The Jacobian for the stiff part is formed using 
MatSetValuesStencil(). The Jacobian matrix is very sparse, with no more than 10 
nonzeros in each row. MatSetValuesStencil() requires a local-to-global mapping, 
set up via ISLocalToGlobalMapping(). Could you please instruct me on how to use 
the local-to-global mapping?

   DMDA automatically sets up the ISLocalToGlobalMapping(), so you should not 
need to.

I also tried using DMCreateMatrix() to create the Jacobian. While the 
local-to-global mapping is then not necessary, the matrix takes too much memory 
and requires 64-bit indices. I would prefer to take advantage of the sparsity of 
the Jacobian and preallocate the matrix to use as little memory as possible, so 
that the code can run on a multi-core desktop.

  If using matrix-free does not work for you because the linear solves do not 
converge, or converge too slowly, then you might be able to decrease the memory 
used by the matrix. DMCreateMatrix() does take advantage of sparsity and tries 
to preallocate only what it needs. Often what it preallocates is the best 
possible, but for multicomponent problems it has to assume there is full 
coupling among all the degrees of freedom that live at each grid point. How many 
components live at each grid point in your model, and are they all coupled to 
each other in the equations? If they are not fully coupled, you can use either 
DMDASetBlockFills() or DMDASetBlockFillsSparse() to indicate the reduced 
coupling that actually exists for your model.

  Barry







Thank you very much for your advice.

Jiannan
