Re: [petsc-users] Distribution of DMPlex for FEM

2016-08-30 Thread Morten Nobel-Jørgensen
We have now hit a related problem. If we change the dof from 1 to 3, we get the 
following error message. I'm using the 'next' branch (pulled from git today).

mpiexec -np 2 ./ex18k
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Row too large: row 36 max 35
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for 
trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.7.3-1857-g5b40c63  GIT Date: 
2016-08-29 22:13:25 -0500
[1]PETSC ERROR: ./ex18k on a arch-linux2-c-debug named morten-VirtualBox by 
morten Tue Aug 30 21:31:36 2016
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
--with-fc=gfortran --download-fblaslapack --download-mpich --download-netcdf 
--download-hdf5 --download-exodusii
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 556 in 
/home/morten/petsc/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #2 MatSetValues() line 1239 in 
/home/morten/petsc/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 MatSetValuesLocal() line 2102 in 
/home/morten/petsc/src/mat/interface/matrix.c
[1]PETSC ERROR: #4 CreateGlobalStiffnessMatrix() line 82 in 
/home/morten/topop_in_petsc_unstruct/ex18k.cc
[1]PETSC ERROR: #5 main() line 113 in 
/home/morten/topop_in_petsc_unstruct/ex18k.cc
[1]PETSC ERROR: No PETSc Option Table entries

We are not fully sure if this is related to the way we use DMPlex or a bug in 
our code. Any help or tips will be appreciated :)
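
Concretely, the change relative to the attached ex18k.cc is just the dof constant 
(a minimal sketch; SetupDOFs() and CreateGlobalStiffnessMatrix() are left untouched). 
With 12 vertices and 3 dofs per vertex the assembled global matrix is 36x36, so valid 
row indices are 0..35, which is what makes "row 36" out of range:

    /* was: const int dof = 1; */
    const int dof = 3;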

Kind regards,
Morten



From: Morten Nobel-Jørgensen
Sent: Thursday, July 14, 2016 9:45 AM
To: Matthew Knepley
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Distribution of DMPlex for FEM

Hi Matthew

Thanks for your answer and your fix. It works :)))

Kind regards,
Morten


From: Matthew Knepley <knep...@gmail.com>
Date: Thursday 14 July 2016 at 00:03
To: Morten Nobel-Joergensen <m...@dtu.dk>
Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] Distribution of DMPlex for FEM

On Wed, Jul 13, 2016 at 3:57 AM, Morten Nobel-Jørgensen <m...@dtu.dk> wrote:
I'm having problems distributing a simple FEM model using DMPlex. For the test case 
I use 1x1x2 hex box elements (cells) with 12 vertices. Each vertex has one DOF.
When I distribute the system to two processors, each process gets a single element and 
the local vector has size 8 (one DOF for each vertex of a hex box), as 
expected.
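
For context, the mesh is created and distributed roughly like this (a sketch assuming 
the DMPlexCreateHexBoxMesh() helper from that PETSc version; the corresponding part of 
main() is not shown in the attached listing):

    DM       dm, dmDist = NULL;
    PetscInt cells[3] = {1, 1, 2};   /* the 1x1x2 hex box */

    ierr = DMPlexCreateHexBoxMesh(PETSC_COMM_WORLD, 3, cells,
                                  DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                                  &dm); CHKERRQ(ierr);
    ierr = DMPlexDistribute(dm, 0, NULL, &dmDist); CHKERRQ(ierr);   /* overlap = 0 */
    if (dmDist) { ierr = DMDestroy(&dm); CHKERRQ(ierr); dm = dmDist; }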

My problem is that when I manually assemble the global stiffness matrix (a 
12x12 matrix) it seems like my ghost values are ignored. I’m sure that I’m 
missing something obvious but cannot see what it is.

In the attached example, I'm assembling the global stiffness matrix using a 
simple local stiffness matrix of ones. This makes it very easy to see whether the 
matrix is assembled correctly. If I run it on one process, the global 
stiffness matrix consists of 0's, 1's and 2's and its trace is 16.0. But if I 
run it distributed on two processes, it consists only of 0's and 1's and its 
trace is 12.0.

I hope that somebody can spot my mistake and help me in the right direction :)

This is my fault, and Stefano Zampini had already tried to tell me this was 
broken. I normally use DMPlexMatSetClosure(), which handles global indices 
correctly.
I have fixed this in the branch

  knepley/fix-plex-l2g

which is also merged to 'next'. I am attaching a version of your sample where 
all objects are freed correctly. Let me know if that works for you.
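
Roughly, the closure-based assembly looks like this (a sketch using the names from 
your example, with K the assembled Mat; passing NULL for the two sections makes 
DMPlexMatSetClosure() use the DM's default local and global sections):

  for (e = cStart; e < cEnd; ++e) {
    /* scatters the 8*dof x 8*dof element matrix ke into K using global indices */
    ierr = DMPlexMatSetClosure(dm, NULL, NULL, K, e, ke, ADD_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);

The values are ordered by the closure of each cell, which does not matter here since 
ke is all ones.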

  Thanks,

 Matt

Kind regards,
Morten



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener


ex18k.cc
Description: ex18k.cc


Re: [petsc-users] Distribution of DMPlex for FEM

2016-07-13 Thread Matthew Knepley
On Wed, Jul 13, 2016 at 3:57 AM, Morten Nobel-Jørgensen  wrote:

> I'm having problems distributing a simple FEM model using DMPlex. For the
> test case I use 1x1x2 hex box elements (cells) with 12 vertices. Each vertex
> has one DOF.
> When I distribute the system to two processors, each process gets a single
> element and the local vector has size 8 (one DOF for each vertex of a hex
> box), as expected.
>
> My problem is that when I manually assemble the global stiffness matrix (a
> 12x12 matrix) it seems like my ghost values are ignored. I’m sure that I’m
> missing something obvious but cannot see what it is.
>
> In the attached example, I'm assembling the global stiffness matrix using
> a simple local stiffness matrix of ones. This makes it very easy to see
> whether the matrix is assembled correctly. If I run it on one process, the
> global stiffness matrix consists of 0's, 1's and 2's and its trace is 16.0.
> But if I run it distributed on two processes, it consists only of 0's and
> 1's and its trace is 12.0.
>
> I hope that somebody can spot my mistake and help me in the right
> direction :)
>

This is my fault, and Stefano Zampini had already tried to tell me this was
broken. I normally use DMPlexMatSetClosure(), which handles global indices
correctly.
I have fixed this in the branch

  knepley/fix-plex-l2g

which is also merged to 'next'. I am attaching a version of your sample
where all objects are freed correctly. Let me know if that works for you.

  Thanks,

 Matt


> Kind regards,
> Morten
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
static char help[] = "";

#include <petsc.h>   /* umbrella PETSc header; the original include name was stripped by the mail archive */

const int dof = 1;

#undef __FUNCT__
#define __FUNCT__ "SetupDOFs"

/* Assign dofs ('dof' unknowns on every vertex, one field) */
PetscErrorCode SetupDOFs(DM dm) {
    PetscSection   s;
    PetscInt       pStart, pEnd, vStart, vEnd, v;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dm), &s); CHKERRQ(ierr);
    ierr = PetscSectionSetNumFields(s, 1); CHKERRQ(ierr);

    ierr = DMPlexGetChart(dm, &pStart, &pEnd); CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT); CHKERRQ(ierr);
    ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd); CHKERRQ(ierr); /* depth 0 = vertices */
    ierr = PetscSectionSetChart(s, pStart, pEnd); CHKERRQ(ierr);

    for (v = vStart; v < vEnd; ++v) {
        ierr = PetscSectionSetDof(s, v, dof); CHKERRQ(ierr);
        ierr = PetscSectionSetFieldDof(s, v, 0, dof); CHKERRQ(ierr);
    }
    ierr = PetscSectionSetUp(s); CHKERRQ(ierr);
    ierr = DMSetDefaultSection(dm, s); CHKERRQ(ierr);
    ierr = PetscSectionDestroy(&s); CHKERRQ(ierr);
    PetscFunctionReturn(0);
}

#undef __FUNCT__
#define __FUNCT__ "CreateGlobalStiffnessMatrix"

PetscErrorCode CreateGlobalStiffnessMatrix(DM dm, Mat *K) {
    PetscSection   localSection;
    PetscInt       cStart, cEnd, nStart, nEnd, e;
    PetscInt       edof[dof * 8];               /* element dof indices (local numbering) */
    PetscScalar    ke[(dof * 8) * (dof * 8)];   /* element stiffness matrix */
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd); CHKERRQ(ierr); /* height 0 = cells */
    ierr = DMPlexGetDepthStratum(dm, 0, &nStart, &nEnd); CHKERRQ(ierr);  /* depth 0 = vertices */

    ierr = DMCreateMatrix(dm, K); CHKERRQ(ierr);
    ierr = MatViewFromOptions(*K, NULL, "-mat_view"); CHKERRQ(ierr);

    ierr = DMGetDefaultSection(dm, &localSection); CHKERRQ(ierr);

    for (e = 0; e < (8 * dof) * (8 * dof); e++) {
        ke[e] = 1.0; /* placeholder value - replace with local stiffness matrix */
    }

    for (e = cStart; e < cEnd; e++) {
        PetscBool useCone = PETSC_TRUE;
        PetscInt  clojureCount;
        PetscInt *clojureIndex = NULL;
        PetscInt  edofIndex = 0, v;

        /* Gather every point in the transitive closure of cell e (vertices, edges, faces, the cell itself) */
        ierr = DMPlexGetTransitiveClosure(dm, e, useCone, &clojureCount, &clojureIndex); CHKERRQ(ierr);
        for (v = 0; v < clojureCount; ++v) {
            PetscInt  cPoint       = clojureIndex[v * 2];     /* closure is stored as (point, orientation) pairs */
            PetscInt  cOrientation = clojureIndex[v * 2 + 1];
            PetscBool isVertex     = (cPoint >= nStart && cPoint < nEnd) ? PETSC_TRUE : PETSC_FALSE; /* a vertex if it lies in the vertex range */

            if (isVertex) {
                PetscInt offset, dofSize, i;

                ierr = PetscSectionGetOffset(localSection, cPoint, &offset); CHKERRQ(ierr);
                ierr = PetscSectionGetDof(localSection, cPoint, &dofSize); CHKERRQ(ierr);
                for (i = 0; i < dofSize; ++i) {
                    edof[edofIndex] = offset + i;
                    edofIndex++;
                }
            }
        }

        ierr = DMPlexRestoreTransitiveClosure(dm, e, useCone, &clojureCount, &clojureIndex); CHKERRQ(ierr);
        ierr = MatSetValuesLocal(*K, 8 * dof, edof, 8 * dof, edof, ke, ADD_VALUES); CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(*K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(*K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
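
    /* Sanity check suggested by the thread discussion (not part of the original ex18k.cc):
       with ke filled with ones and dof = 1, the trace of the assembled 12x12 matrix should
       be 16.0 regardless of the number of processes; a trace of 12.0 means the off-process
       (ghost) contributions were dropped. */
    {
        PetscScalar trace;
        ierr = MatGetTrace(*K, &trace); CHKERRQ(ierr);
        ierr = PetscPrintf(PETSC_COMM_WORLD, "trace(K) = %g\n", (double) PetscRealPart(trace)); CHKERRQ(ierr);
    }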
    PetscFunctionReturn(0);
}

#undef __FUNCT__
#define __FUNCT__ "main"

int main(int argc, char **argv) {
DM