Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Mike Michell
I will. Thank you very much! > On Sun, Feb 26, 2023 at 6:56 PM Mike Michell > wrote: > >> Okay, that was the part of the code incompatible with the latest petsc. >> Thank you for the support. >> I also need to call DMPlexLabelComplete() for the vertices on the >> parallel and physical boundaries, but

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Matthew Knepley
On Sun, Feb 26, 2023 at 6:56 PM Mike Michell wrote: > Okay, that was the part of the code incompatible with the latest petsc. > Thank you for the support. > I also need to call DMPlexLabelComplete() for the vertices on the parallel > and physical boundaries, but I get an error if DMPlexLabelComplete()

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Mike Michell
Okay, that was the part of the code incompatible with the latest petsc. Thank you for the support. I also need to call DMPlexLabelComplete() for the vertices on the parallel and physical boundaries, but I get an error if DMPlexLabelComplete() is called before DMSetPointSF(dm, sf). So I do: { PetscSF
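
For reference, a minimal C sketch of the ordering being described: attach the point SF to the DM first, then complete the boundary label so the completion can propagate over shared (halo) points. The label name "boundary" and the helper itself are illustrative assumptions, not taken from the attached code.

#include <petscdmplex.h>

/* Hypothetical helper: 'sf' is assumed to be the point SF produced by the
   caller's own partitioning step; "boundary" is an assumed label name. */
static PetscErrorCode CompleteBoundaryLabel(DM dm, PetscSF sf)
{
  DMLabel label;

  PetscFunctionBeginUser;
  /* Attach the point SF before completing the label, so shared points are
     known to the DM when the label is closed over the boundary. */
  PetscCall(DMSetPointSF(dm, sf));
  PetscCall(DMGetLabel(dm, "boundary", &label));
  PetscCall(DMPlexLabelComplete(dm, label));
  PetscFunctionReturn(0);
}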

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Matthew Knepley
On Sun, Feb 26, 2023 at 2:07 PM Mike Michell wrote: > I cannot agree with this argument unless you also tested with the petsc > 3.18.4 tarball from https://petsc.org/release/install/download/. > If the library had an issue, I would trivially see an error from my code. > > I ran my code with valgrin

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Pierre Jolivet
> On 26 Feb 2023, at 8:07 PM, Mike Michell wrote: > > I cannot agree with this argument unless you also tested with the petsc 3.18.4 > tarball from https://petsc.org/release/install/download/. > If the library had an issue, I would trivially see an error from my code. > > I ran my code with v

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Mike Michell
I cannot agree with this argument unless you also tested with the petsc 3.18.4 tarball from https://petsc.org/release/install/download/. If the library had an issue, I would trivially see an error from my code. I ran my code with valgrind and see no error when it is built with petsc 3.18.4. You can test wit

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Matthew Knepley
On Sun, Feb 26, 2023 at 11:32 AM Mike Michell wrote: > This is what I get from petsc main, which is not correct: > Overall volume computed from median-dual ... >6.37050098781844 > Overall volume computed from PETSc ... >3.1547005380 > > > This is what I get from petsc 3.18.4, which is c

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Mike Michell
This is what I get from petsc main, which is not correct: Overall volume computed from median-dual ... 6.37050098781844; Overall volume computed from PETSc ... 3.1547005380. This is what I get from petsc 3.18.4, which is correct: Overall volume computed from median-dual ... 3.15470053800

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Matthew Knepley
On Sun, Feb 26, 2023 at 11:19 AM Mike Michell wrote: > Which version of petsc did you test? With petsc 3.18.4, the median-dual volume > gives the same value as the one PETSc computes from DMPlexComputeCellGeometryFVM(). > This is only an accident of the data layout. The code you sent writes over memory in the local
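
The reply is cut off, but the point appears to be an out-of-bounds write into local storage. As a hedged illustration only (the actual defect in the attached code is not visible in this preview), a C sketch that sizes vertex-wise work storage from the DM itself, so halo vertices supplied through the point SF are included:

#include <petscdmplex.h>

/* Illustrative only: allocate per-vertex storage from the DM's own layout so
   ghost (halo) vertices are counted.  Assumes 'dm' is an interpolated,
   distributed DMPlex. */
static PetscErrorCode AllocateVertexWork(DM dm, PetscScalar **work, PetscInt *nv)
{
  PetscInt vStart, vEnd;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd)); /* depth 0 = vertices */
  *nv = vEnd - vStart;
  PetscCall(PetscCalloc1(*nv, work));
  PetscFunctionReturn(0);
}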

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Mike Michell
Which version of petsc did you test? With petsc 3.18.4, the median-dual volume gives the same value as the one PETSc computes from DMPlexComputeCellGeometryFVM(). > On Sat, Feb 25, 2023 at 3:11 PM Mike Michell > wrote: > >> My apologies for the late follow-up. There was a time conflict. >> >> A simple example code r
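
For context, a minimal C sketch of the cross-check being discussed: sum the cell volumes PETSc reports via DMPlexComputeCellGeometryFVM() and reduce across ranks. The function name and the plain reduction are illustrative, not taken from the attached example; with cell overlap, ghost cells would have to be skipped to avoid double counting.

#include <petscdmplex.h>

/* Sum the cell volumes reported by PETSc as a reference total.
   Assumes 'dm' is a distributed, interpolated DMPlex without cell overlap. */
static PetscErrorCode TotalCellVolume(DM dm, PetscReal *total)
{
  PetscInt  cStart, cEnd, c;
  PetscReal vol, centroid[3], local = 0.0;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd)); /* height 0 = cells */
  for (c = cStart; c < cEnd; ++c) {
    PetscCall(DMPlexComputeCellGeometryFVM(dm, c, &vol, centroid, NULL));
    local += vol;
  }
  PetscCall(MPIU_Allreduce(&local, total, 1, MPIU_REAL, MPIU_SUM, PetscObjectComm((PetscObject)dm)));
  PetscFunctionReturn(0);
}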

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Matthew Knepley
On Sat, Feb 25, 2023 at 3:11 PM Mike Michell wrote: > My apologies for the late follow-up. There was a time conflict. > > A simple example code related to the issue I mentioned is attached here. > The sample code does: (1) load the grid onto a dm, (2) compute a vertex-wise control > volume for each node in
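
The attached example itself is not reproduced in this archive. As a hedged sketch of the general pattern step (2) implies: per-vertex (median-dual) contributions are accumulated in a local vector, and the DM's communication is used to sum shared-vertex contributions across ranks. The function and variable names are assumptions, and a PetscSection with one dof per vertex is assumed to be set on the DM.

#include <petscdmplex.h>

/* Hedged sketch of the halo-accumulation pattern under discussion.
   Assumes 'locVol' already holds this rank's median-dual contributions,
   ghost vertices included. */
static PetscErrorCode AccumulateDualVolumes(DM dm, Vec locVol)
{
  Vec gblVol;

  PetscFunctionBeginUser;
  PetscCall(DMCreateGlobalVector(dm, &gblVol));
  /* Sum contributions made at shared/ghost vertices onto their owners ... */
  PetscCall(DMLocalToGlobalBegin(dm, locVol, ADD_VALUES, gblVol));
  PetscCall(DMLocalToGlobalEnd(dm, locVol, ADD_VALUES, gblVol));
  /* ... then push the completed owner values back out to the ghosts. */
  PetscCall(DMGlobalToLocalBegin(dm, gblVol, INSERT_VALUES, locVol));
  PetscCall(DMGlobalToLocalEnd(dm, gblVol, INSERT_VALUES, locVol));
  PetscCall(VecDestroy(&gblVol));
  PetscFunctionReturn(0);
}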

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-21 Thread Matthew Knepley
On Mon, Feb 20, 2023 at 12:05 PM Matthew Knepley wrote: > On Sat, Feb 18, 2023 at 12:00 PM Mike Michell > wrote: > >> As a follow-up, I tested: >> >> (1) The tarball for v3.18.4 downloaded from petsc gitlab ( >> https://gitlab.com/petsc/petsc/-/tree/v3.18.4) has no issue with DMPlex >> halo exchange. This v

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-20 Thread Matthew Knepley
On Sat, Feb 18, 2023 at 12:00 PM Mike Michell wrote: > As a follow-up, I tested: > > (1) The tarball for v3.18.4 downloaded from petsc gitlab ( > https://gitlab.com/petsc/petsc/-/tree/v3.18.4) has no issue with DMPlex > halo exchange. This version works as I expect. > (2) A clone of the main branch (git clone https:/

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-18 Thread Mike Michell
As a follow-up, I tested: (1) The tarball for v3.18.4 downloaded from petsc gitlab ( https://gitlab.com/petsc/petsc/-/tree/v3.18.4) has no issue with DMPlex halo exchange. This version works as I expect. (2) A clone of the main branch (git clone https://gitlab.com/petsc/petsc.git) has issues with DMPlex halo exchange.
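
Because the comparison hinges on which PETSc build the application is actually linked against, a small generic check (not part of the original report) is to print the library's own version string at startup:

#include <petscsys.h>

int main(int argc, char **argv)
{
  char version[PETSC_MAX_PATH_LEN];

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Reports the release or development (git) version string of the linked library. */
  PetscCall(PetscGetVersion(version, sizeof(version)));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Linked against: %s\n", version));
  PetscCall(PetscFinalize());
  return 0;
}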

[petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-17 Thread Mike Michell
Dear PETSc team, I am using PETSc for Fortran with DMPlex. I have been using this version of PETSc: >>git rev-parse origin >>995ec06f924a86c4d28df68d1fdd6572768b0de1 >>git rev-parse FETCH_HEAD >>9a04a86bf40bf893fb82f466a1bc8943d9bc2a6b There had been no issue before the one with the VTK viewer, whic