Dear Matt,
Did you (or anyone else) find time to look at our issue?
We are really looking forward to your answer :)
Kind regards,
Morten
From: Matthew Knepley [knep...@gmail.com]
Sent: Wednesday, October 12, 2016 3:41 PM
To: Morten Nobel-Jørgensen
Cc: petsc-users
Dear PETSc developers / Matt
Thanks for your suggestions regarding our use of DMPlex in a FEM context.
However, Matt's advice on using PetscFE is not sufficient for our needs
(our end goal is a topology optimization framework, not just FEM) and we must
honestly admit that we do not see ho
e?
Kind regards,
Morten
From: Matthew Knepley [knep...@gmail.com]
Sent: Monday, September 26, 2016 2:19 PM
To: Morten Nobel-Jørgensen
Cc: PETSc [petsc-users@mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem
On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-J
From: petsc-users-boun...@mcs.anl.gov [petsc-users-boun...@mcs.anl.gov] on
behalf of Morten Nobel-Jørgensen [m...@dtu.dk]
Sent: Sunday, September 25, 2016 11:15 AM
To: Matthew Knepley
Cc: PETSc [petsc-users@mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem
Hi
Nobel-Jørgensen
Cc: PETSc [petsc-users@mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem
On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley
<knep...@gmail.com> wrote:
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen
<m...@dtu.dk> wrote:
Dear PETSc developers
From: Matthew Knepley [knep...@gmail.com]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc [petsc-users@mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem
On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen
<m...@dtu.dk> wrote:
Dear PETSc deve
Dear PETSc developers and users,
Last week we posted a question regarding an error with DMPlex and multiple dofs
and have not gotten any feedback yet. These are uncharted waters for us, since we
have gotten used to extremely fast feedback from the PETSc crew. So, with
the chance of sounding impatient, any help or tips will be appreciated :)
Kind regards,
Morten
From: Morten Nobel-Jørgensen
Sent: Thursday, July 14, 2016 9:45 AM
To: Matthew Knepley
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Distribution of DMPlex for FEM
Hi Matthew
Thanks for your answer
Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] Distribution of DMPlex for FEM
On Wed, Jul 13, 2016 at 3:57 AM, Morten Nobel-Jørgensen
<m...@dtu.dk> wrote:
I’m having problems distributing a simple FEM model using DMPlex. As a test case
I use 1x1x2 hex box elements (cells) with 12 vertices. Each vertex has one DOF.
When I distribute the system to two processors, each gets a single element and
the local vector has size 8 (one DOF for each vertex of the element).
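The setup described above can be sketched as follows. This is a minimal sketch, not the poster's actual code: it assumes a recent PETSc, where `DMPlexCreateBoxMesh` takes the faces-per-dimension array directly (the signature has changed across releases), and simply builds the 1x1x2 hex mesh and distributes it.

```c
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist;
  PetscInt       faces[3] = {1, 1, 2}; /* 1x1x2 hex cells -> 12 vertices */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 3, PETSC_FALSE /* hexes */, faces,
                             NULL, NULL, NULL, PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* Distribute with zero overlap; on 2 ranks each rank receives one cell */
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) { ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist; }
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```

Run with `mpiexec -n 2 ./ex -dm_view` to inspect how cells and vertices are split across the two ranks.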
Hi all,
I hope someone can help me with the following:
I’m having some problems when exporting a distributed DMPlex: the cells (and cell
types) seem to be duplicated.
When I run the code on a non-distributed system it works as expected, but
when I run it on multiple processors (2 in my case)
outputs the matrix to a file called 'Kmat.m'.
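Writing a PETSc matrix to a `.m` file is usually done with an ASCII viewer in MATLAB format. A minimal sketch (the helper name `WriteKmat` is hypothetical, not from the original code):

```c
#include <petscmat.h>

/* Write matrix K to 'Kmat.m' in a form MATLAB/Octave can source. */
static PetscErrorCode WriteKmat(Mat K)
{
  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscViewerASCIIOpen(PetscObjectComm((PetscObject)K), "Kmat.m", &viewer);CHKERRQ(ierr);
  ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);CHKERRQ(ierr);
  ierr = MatView(K, viewer);CHKERRQ(ierr);           /* gathers the parallel Mat */
  ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  return 0;
}
```

`MatView` on a parallel matrix assembles the full operator on the viewer's communicator, so the file contains each global entry once even when run on several ranks.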
Kind regards,
Morten
From: Matthew Knepley <knep...@gmail.com>
Date: Tuesday 8 March 2016 at 19:47
To: Morten Nobel-Jørgensen <m...@mek.dtu.dk>
Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
I have some problems using DMPlex on unstructured grids in 3D.
After I have created the DMPlex and assigned dofs (3 dofs on each node), I run
into some problems when assembling the global stiffness matrix. I have created
a small example in the attached cc file. My problems are:
* It seems l
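Assigning 3 dofs to each node, as described above, is typically done through a PetscSection attached to the DMPlex. A minimal sketch under that assumption (the helper name is hypothetical; older PETSc releases call the last step `DMSetDefaultSection`):

```c
#include <petscdmplex.h>

/* Attach 3 dofs (e.g. a 3D displacement) to every vertex of dm. */
static PetscErrorCode SetupVertexSection(DM dm)
{
  PetscSection   s;
  PetscInt       pStart, pEnd, vStart, vEnd, v;
  PetscErrorCode ierr;

  ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s);CHKERRQ(ierr);
  ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
  ierr = PetscSectionSetChart(s, pStart, pEnd);CHKERRQ(ierr);
  ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* vertices */
  for (v = vStart; v < vEnd; ++v) {
    ierr = PetscSectionSetDof(s, v, 3);CHKERRQ(ierr);
  }
  ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
  ierr = DMSetLocalSection(dm, s);CHKERRQ(ierr);
  ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
  return 0;
}
```

Once the section is set, `DMCreateMatrix` produces a stiffness matrix with the right parallel layout, and `DMPlexMatSetClosure` can assemble element contributions into it.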
I found a solution to my problem by using the global section instead. I still
don’t quite understand what my problem with ISLocalToGlobalMapping was.
> Yes, that is the solution I use.
Thanks – good to hear that I’m on the right track :)
> I think I ignore nonlocal indices in the l2g mapping rather
/* likely the elided call: ISLocalToGlobalMappingGetIndices; a negative
   index marks a point not owned by this process */
const PetscInt *array;
ISLocalToGlobalMappingGetIndices(ltogm, &array);
return array[point] < 0;
}
From: Morten Nobel-Jørgensen <m...@mek.dtu.dk>
Date: Monday 30 November 2015 at 17:24
To: Matthew Knepley <knep...@gmail.com>
Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
Date: Monday 30 November 2015 at 14:08
To: Morten Nobel-Jørgensen <m...@mek.dtu.dk>
Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex: Ghost points after DMRefine
On Mon,
I have a very simple unstructured mesh composed of two triangles (four
vertices) with one shared edge using a DMPlex:
 /|\
/ | \
\ | /
 \|/
After distributing this mesh to two processes, each process owns a triangle.
However one process owns three vertices, while the last vertex is owned by the other.
After distributing a DMPlex it seems like my cells are appearing twice (or
rather, multiple cells map onto the same vertices).
I’m assuming the way I’m iterating the DMPlex is wrong. Essentially I iterate
the DMPlex the following way after distribution (see code snippet below or
attached file).
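One common cause of "duplicated" cells when iterating after distribution is visiting ghost points: with overlap, a rank's cell stratum can contain cells whose owner is another rank, recorded as leaves of the point SF. A minimal sketch of skipping them (the helper name is hypothetical):

```c
#include <petscdmplex.h>

/* Visit only locally owned cells; points appearing as leaves of the
   point SF are owned elsewhere and are skipped. */
static PetscErrorCode PrintOwnedCells(DM dm)
{
  PetscSF        sf;
  const PetscInt *leaves;
  PetscInt       cStart, cEnd, c, nleaves;
  PetscErrorCode ierr;

  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
  ierr = DMGetPointSF(dm, &sf);CHKERRQ(ierr);
  ierr = PetscSFGetGraph(sf, NULL, &nleaves, &leaves, NULL);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    PetscInt loc = -1;
    if (nleaves > 0) { ierr = PetscFindInt(c, nleaves, leaves, &loc);CHKERRQ(ierr); }
    if (loc >= 0) continue;             /* ghost cell, owned by another rank */
    ierr = PetscPrintf(PETSC_COMM_SELF, "owned cell %D\n", c);CHKERRQ(ierr);
  }
  return 0;
}
```

With zero-overlap distribution the point SF contains no cell leaves, so every local cell is owned and the loop degenerates to a plain stratum traversal.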