Thanks for the comment! I actually solved the problem this morning. I had simply made a mistake, partially (as you suggested) related to the construction of edofF. So, combining DMDA and DMStag in the way outlined in the sketch below seems to work perfectly fine.


From: Patrick Sanan <patrick.sa...@gmail.com>
Sent: 25 April 2022 08:48
To: Carl-Johan Thore <carl-johan.th...@liu.se>
Cc: Barry Smith <bsm...@petsc.dev>; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Combining DMDA and DMStag

In that case, from your original message, it seems like the issue might be in this step:

    // populate edofF, edofT

Or, perhaps, it's simply due to the fact that the numbering/ordering depends on 
the number of ranks. This is illustrated in this image in the manual: 
https://petsc.org/release/docs/manual/vec/#fig-daao

This is for DMDA but DMStag does something similar. There is a "natural" 
ordering, which is the ordering you'd get with a single rank. There's also 
"PETSc ordering" which depends on the number of ranks - in this case each 
rank's portion of the vector is a contiguous range.

If this might be the explanation for your problem, note that for DMDA there are already helper functions which can map between the "PETSc" and "natural" orderings. These obviously require a lot of communication between ranks, so you'd want to avoid them in production runs, but they are useful when you want to directly compare vectors on different numbers of ranks for diagnostic or I/O purposes.

E.g. see this man page:
https://petsc.org/release/docs/manualpages/DMDA/DMDANaturalToGlobalBegin.html
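
For instance, a minimal sketch of viewing a DMDA global vector in natural ordering, so that the output is identical regardless of the number of ranks (assuming an existing DMDA dmthermal and global vector T; the names are illustrative):

    /* Sketch (illustrative names): view a DMDA global vector in
       natural ordering, i.e. rank-count-independent output */
    Vec Tnat;
    DMDACreateNaturalVector(dmthermal, &Tnat);
    DMDAGlobalToNaturalBegin(dmthermal, T, INSERT_VALUES, Tnat);
    DMDAGlobalToNaturalEnd(dmthermal, T, INSERT_VALUES, Tnat);
    VecView(Tnat, PETSC_VIEWER_STDOUT_WORLD);
    VecDestroy(&Tnat);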

I haven't implemented the equivalent for DMStag, but if it's needed I can look 
at adding it.

On Fri., 22 Apr 2022 at 17:28, Carl-Johan Thore <carl-johan.th...@liu.se> wrote:
Thanks for the quick replies!

Using DMStag for the temperature, or even for both fields, is a possibility of course. Preferably, however, I'd like to keep the temperature code as it is and be able to quickly switch between different methods for the flow, based on DMDA, DMStag and so on, so I'll try a bit more with the DMDA-DMStag approach first.

________________________________
From: Patrick Sanan <patrick.sa...@gmail.com>
Sent: 22 April 2022 17:04
To: Barry Smith <bsm...@petsc.dev>
Cc: Carl-Johan Thore <carl-johan.th...@liu.se>; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Combining DMDA and DMStag



On Fri., 22 Apr 2022 at 16:04, Barry Smith <bsm...@petsc.dev> wrote:

   We would need more details as to exactly what goes wrong to determine any kind of fix; my guess would be that the layout of the velocity and temperature vectors is slightly different, since the DMDA uses nelx+1 while the DMStag uses nelx, and so they may split things up slightly differently in parallel. You could try very small problems with, say, 2 or 4 ranks, put known values into the vectors, and look at the ghost-point update locations and the exact locations in the local and global vectors to make sure everything is where you expect it to be.
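
For example, a minimal sketch of such a check (illustrative, for any existing DM dm):

    /* Sketch (illustrative): fill a global vector with its own global
       indices, scatter to a local vector, and inspect where the ghost
       values land on each rank */
    Vec      g, l;
    PetscInt lo, hi;
    DMCreateGlobalVector(dm, &g);
    DMCreateLocalVector(dm, &l);
    VecGetOwnershipRange(g, &lo, &hi);
    for (PetscInt i = lo; i < hi; i++) VecSetValue(g, i, (PetscScalar)i, INSERT_VALUES);
    VecAssemblyBegin(g); VecAssemblyEnd(g);
    DMGlobalToLocalBegin(dm, g, INSERT_VALUES, l);
    DMGlobalToLocalEnd(dm, g, INSERT_VALUES, l);
    VecView(l, PETSC_VIEWER_STDOUT_SELF);  /* each rank prints its local vector */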

  There is also the possibility of using a DMStag for the temperature instead of a DMDA, since DMStag provides essentially a superset of the functionality of DMDA, to get a more consistent layout of the unknowns in the two vectors.


This was going to be my first suggestion as well - one way to ensure compatibility would be to use DMStag for everything. E.g. you could create your temperature DMStag with only vertex/corner degrees of freedom (dof0 = 1), and then create one or more "compatible" DMStags (same elements on each MPI rank) for your other fields, using DMStagCreateCompatibleDMStag().
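
For example, a minimal sketch (names, boundary types and stencil width are illustrative, taken from your code):

    /* Sketch (illustrative): temperature on the vertices of a DMStag,
       plus a compatible DMStag for the velocities, guaranteed to have
       the same elements on each rank */
    DM dmtemp, dmflow;
    DMStagCreate3d(PETSC_COMM_WORLD, bx, by, bz, nelx, nely, nelz,
                   PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                   1, 0, 0, 0,              /* 1 temperature DOF per vertex */
                   DMSTAG_STENCIL_BOX, stencilWidth, NULL, NULL, NULL, &dmtemp);
    DMSetUp(dmtemp);
    DMStagCreateCompatibleDMStag(dmtemp, 3, 0, 0, 0, &dmflow);  /* 3 velocity DOFs per vertex */
    DMSetUp(dmflow);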

  Finally, one could use a single DMStag with all the unknowns, but treat subvectors of the unknowns differently in the discretization and solve process. That is, assign velocity values (I guess) to the cell faces and temperature to the cell vertices; this will give a consistent parallel decomposition of the values, but you will have to manage the fact that the unknowns are interlaced into a single vector, so the solver portions of your code may need to pull out the appropriate subvectors.
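
For instance, a minimal sketch of pulling the temperature part (component 0 on the vertices) of a combined DMStag vector out onto its own DMDA, using DMStagVecSplitToDMDA() (dmboth and x are illustrative names):

    /* Sketch (illustrative names): extract vertex component 0 of a
       combined DMStag vector x into a DMDA vector, e.g. for the
       thermal solve or for I/O */
    DM  daT;
    Vec xT;
    DMStagVecSplitToDMDA(dmboth, x, DMSTAG_BACK_DOWN_LEFT, 0, &daT, &xT);
    /* ... use daT / xT ... */
    VecDestroy(&xT);
    DMDestroy(&daT);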

  Barry



On Apr 22, 2022, at 9:45 AM, Carl-Johan Thore <carl-johan.th...@liu.se> wrote:

Hi!

I'm working on a convection-diffusion heat transfer problem. The temperature is discretized using standard Q1 elements and a DMDA. The flow is modelled using a stabilized Q1-Q0 method, for which DMStag seemed like a good choice. The codes for the temperature and the flow work fine separately (both in serial and in parallel), but when combined and run in parallel, a problem sometimes arises in the assembly of the thermal system matrix. Here's a rough sketch of the combined code:

// Create a DMDA for the thermal problem and a DMStag for the flow problem
DMDACreate3d(PETSC_COMM_WORLD, bx, by, bz, DMDA_STENCIL_BOX,
             nelx+1, nely+1, nelz+1, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
             1, stencilWidth, 0, 0, 0, &dmthermal);
...
// A bit of code to adjust Lx, Ly, Lz so that dmthermal and dmflow are
// compatible in the sense of having the same local elements
...
DMStagCreate3d(PETSC_COMM_WORLD, bx, by, bz, nelx, nely, nelz, md, nd, pd,
               3, 0, 0, 0, DMSTAG_STENCIL_BOX, stencilWidth, Lx, Ly, Lz, &dmflow);

PetscInt      edofT[8];   // 8-noded element with 1 temperature DOF per node
DMStagStencil edofF[24];  // 8 nodes with 3 velocity DOFs each

// Assemble the thermal system matrix K
for (PetscInt e = 0; ...)  // Loop over local elements
{
    // Populate edofF, edofT
    // Get element velocities in ue from the local velocity vector uloc
    DMStagVecGetValuesStencil(dmflow, uloc, 24, edofF, ue);
    ...
    Ke = Ke_diffusion + Ke_convection(ue)
    ...
    MatSetValuesLocal(K, 8, edofT, 8, edofT, Ke, ADD_VALUES);
}
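
For reference, a hypothetical sketch (not the actual code) of how the DMStagStencil entries in edofF might be populated for the element whose lower corner vertex has index (ei, ej, ek), given the 3 velocity DOFs per vertex in the DMStagCreate3d call above:

    // Hypothetical illustration only: fill edofF with the 8 vertices of
    // element (ei, ej, ek), 3 velocity components per vertex (dof0 = 3)
    PetscInt n = 0;
    for (PetscInt k = 0; k < 2; ++k)
      for (PetscInt j = 0; j < 2; ++j)
        for (PetscInt i = 0; i < 2; ++i)
          for (PetscInt c = 0; c < 3; ++c) {
            edofF[n].loc = DMSTAG_BACK_DOWN_LEFT;  // vertex stratum in 3D
            edofF[n].i   = ei + i;
            edofF[n].j   = ej + j;
            edofF[n].k   = ek + k;
            edofF[n].c   = c;
            n++;
          }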

This always works fine in serial, but depending on the mesh and the number of ranks, we don't always get the correct values in the element velocity vector ue. I suspect this has something to do with the ordering of the elements and/or the DOFs, because the entries of the global velocity vector are always the same but their order may change (judging from the output of VecView, at least).

Is it possible to ensure compatibility between the DMs, or to find some kind of mapping between them, so that something along the lines of the code above always works?

Kind regards,
Carl-Johan
