Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Brandon Denton via petsc-users
How exactly does the aux data work? What is typically available there? Is it something the user can populate? From: Matthew Knepley Sent: Wednesday, October 11, 2023 8:07 PM To: Brandon Denton Cc: Jed Brown ; petsc-users Subject: Re: [petsc-users] FEM
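
A minimal sketch of how auxiliary data is typically attached and then read back inside the pointwise functions, assuming a recent PETSc (the five-argument DMSetAuxiliaryVec()) and hypothetical names (AttachAux, f0_user, auxLocal):

#include <petsc.h>

/* Pointwise residual: auxiliary field values arrive in a[] (gradients in
   a_x[]), indexed with aOff[]/aOff_x[], exactly like u[]/u_x[] for the
   solution fields. */
static void f0_user(PetscInt dim, PetscInt Nf, PetscInt NfAux,
                    const PetscInt uOff[], const PetscInt uOff_x[],
                    const PetscScalar u[], const PetscScalar u_t[], const PetscScalar u_x[],
                    const PetscInt aOff[], const PetscInt aOff_x[],
                    const PetscScalar a[], const PetscScalar a_t[], const PetscScalar a_x[],
                    PetscReal t, const PetscReal x[], PetscInt numConstants,
                    const PetscScalar constants[], PetscScalar f0[])
{
  const PetscScalar nu = a[aOff[0]]; /* first auxiliary field at this quadrature point */
  f0[0] = nu * u[uOff[0]];
}

/* The user populates auxLocal (a local Vec of a second DM that carries the
   auxiliary field's discretization) and attaches it to the primary DM. */
static PetscErrorCode AttachAux(DM dm, Vec auxLocal)
{
  PetscFunctionBeginUser;
  PetscCall(DMSetAuxiliaryVec(dm, NULL, 0, 0, auxLocal));
  PetscFunctionReturn(PETSC_SUCCESS);
}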

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Matthew Knepley
On Wed, Oct 11, 2023 at 4:15 PM Brandon Denton wrote: > By natural coordinates, I am referring to the reference element > coordinates. Usually these are represented as (xi, eta, zeta) in the > literature. > > Yes. I would like to have the Jacobian and the derivatives of the map > available

Re: [petsc-users] Parallel DMPlex

2023-10-11 Thread erdemguer via petsc-users
Thank you! That's exactly what I need. Sent with [Proton Mail](https://proton.me/) secure email. --- Original Message --- On Wednesday, October 11th, 2023 at 4:17 PM, Matthew Knepley wrote: > On Wed, Oct 11, 2023 at 4:42 AM erdemguer wrote: > >> Hi again, > > I see the problem. FV

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Brandon Denton via petsc-users
By natural coordinates, I am referring to the reference element coordinates. Usually these are represented as (xi, eta, zeta) in the literature. Yes. I would like to have the Jacobian and the derivatives of the map available within PetscDSSetResidual() f0 and f1 functions. I believe
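
For reference, the per-cell map data (Jacobian, its inverse, and determinant at the quadrature points) can already be obtained outside the pointwise callbacks; whether and how it could be routed into the f0/f1 functions (e.g. as an auxiliary field or constants) is what this thread is about. A sketch with illustrative names:

#include <petsc.h>

static PetscErrorCode CellMapJacobians(DM dm, PetscFE fe, PetscInt cell)
{
  PetscQuadrature q;
  PetscInt        dim, Nq;
  PetscReal      *v, *J, *invJ, *detJ;

  PetscFunctionBeginUser;
  PetscCall(DMGetDimension(dm, &dim));
  PetscCall(PetscFEGetQuadrature(fe, &q));
  PetscCall(PetscQuadratureGetData(q, NULL, NULL, &Nq, NULL, NULL));
  /* v: physical coordinates of the quadrature points; J, invJ: dx/dxi and its
     inverse at each point; detJ: |J|. Sizes assume the coordinate dimension
     equals the topological dimension. */
  PetscCall(PetscMalloc4(Nq * dim, &v, Nq * dim * dim, &J, Nq * dim * dim, &invJ, Nq, &detJ));
  PetscCall(DMPlexComputeCellGeometryFEM(dm, cell, q, v, J, invJ, detJ));
  PetscCall(PetscFree4(v, J, invJ, detJ));
  PetscFunctionReturn(PETSC_SUCCESS);
}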

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Matthew Knepley
On Wed, Oct 11, 2023 at 2:09 PM Brandon Denton wrote: > Thank you for the discussion. > > Are we agreed then that the derivatives of the natural coordinates are > required for the described approach? If so, is this something PETSc can > currently do within the point-wise residual functions? > I

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Brandon Denton via petsc-users
Thank you for the discussion. Are we agreed then that the derivatives of the natural coordinates are required for the described approach? If so, is this something PETSc can currently do within the point-wise residual functions? Matt - Thank you for the command line option for the 2nd

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Jed Brown
Matthew Knepley writes: > On Wed, Oct 11, 2023 at 1:03 PM Jed Brown wrote: > >> I don't see an attachment, but his thesis used conservative variables and >> defined an effective length scale in a way that seemed to assume constant >> shape function gradients. I'm not aware of systematic

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Matthew Knepley
On Wed, Oct 11, 2023 at 1:03 PM Jed Brown wrote: > I don't see an attachment, but his thesis used conservative variables and > defined an effective length scale in a way that seemed to assume constant > shape function gradients. I'm not aware of systematic literature comparing > the covariant

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Jed Brown
I don't see an attachment, but his thesis used conservative variables and defined an effective length scale in a way that seemed to assume constant shape function gradients. I'm not aware of systematic literature comparing the covariant and contravariant length measures on anisotropic meshes,

Re: [petsc-users] [EXTERNAL] Re: Unexpected performance losses switching to COO interface

2023-10-11 Thread Fackler, Philip via petsc-users
I'm on it. Philip Fackler Research Software Engineer, Application Engineering Group Advanced Computing Systems Research Section Computer Science and Mathematics Division Oak Ridge National Laboratory From: Junchao Zhang Sent: Wednesday, October 11, 2023 10:14 To:

Re: [petsc-users] SLEPc/NEP for shell matrice T(lambda) and T'(lambda)

2023-10-11 Thread Kenneth C Hall
Jose, Thanks very much for your help with this. Greatly appreciated. I will look at the MR. Please let me know if you do get the Fortran example working. Thanks, and best regards, Kenneth From: Jose E. Roman Date: Wednesday, October 11, 2023 at 2:41 AM To: Kenneth C Hall Cc:

Re: [petsc-users] Configuration of PETSc with Intel OneAPI and Intel MPI fails

2023-10-11 Thread Satish Balay via petsc-users
The same docs should be available in https://web.cels.anl.gov/projects/petsc/download/release-snapshots/petsc-with-docs-3.20.0.tar.gz Satish On Wed, 11 Oct 2023, Richter, Roland wrote: > Hei, > Thank you very much for the answer! I looked it up, but petsc.org seems to > be a bit unstable here,

Re: [petsc-users] Parallel DMPlex

2023-10-11 Thread Matthew Knepley
On Wed, Oct 11, 2023 at 4:42 AM erdemguer wrote: > Hi again, > I see the problem. FV ghosts mean extra boundary cells added in FV methods using DMPlexCreateGhostCells() in order to impose boundary conditions. They are not the "ghost" cells for overlapping parallel decompositions. I have changed
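
A sketch of the two distinct notions of "ghost" in play, assuming the boundary-ghost routine intended is DMPlexConstructGhostCells() and that one overlap layer is wanted for the parallel decomposition:

#include <petscdmplex.h>

static PetscErrorCode DistributeAndAddFVGhosts(DM *dm)
{
  DM       dmDist = NULL, dmGhost = NULL;
  PetscInt nGhost = 0;

  PetscFunctionBeginUser;
  /* Parallel "ghosts": overlap cells copied from neighboring ranks. */
  PetscCall(DMPlexDistribute(*dm, 1, NULL, &dmDist));
  if (dmDist) { PetscCall(DMDestroy(dm)); *dm = dmDist; }
  /* FV ghosts: extra cells added outside boundary faces (label "Face Sets"
     when NULL is passed) to impose boundary conditions; unrelated to the
     parallel overlap above. */
  PetscCall(DMPlexConstructGhostCells(*dm, NULL, &nGhost, &dmGhost));
  if (dmGhost) { PetscCall(DMDestroy(dm)); *dm = dmGhost; }
  PetscFunctionReturn(PETSC_SUCCESS);
}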

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Brandon Denton via petsc-users
I was thinking about trying to implement Ben Kirk's approach to Navier-Stokes (see attached paper; Section 5). His approach uses these quantities to align the orientation of the unstructured element/cell with the fluid velocity to apply the stabilization/upwinding and to detect shocks. If you

Re: [petsc-users] Compilation failure of PETSc with "The procedure name of the INTERFACE block conflicts with a name in the encompassing scoping unit"

2023-10-11 Thread Matthew Knepley
On Wed, Oct 11, 2023 at 4:22 AM Richter, Roland wrote: > Hei, > > following my last question I managed to configure PETSc with Intel MPI and > Intel OneAPI using the following configure-line: > > > > ./configure --prefix=/media/storage/local_opt/petsc > --with-scalar-type=complex

Re: [petsc-users] FEM Implementation of NS with SUPG Stabilization

2023-10-11 Thread Matthew Knepley
On Tue, Oct 10, 2023 at 9:34 PM Brandon Denton via petsc-users < petsc-users@mcs.anl.gov> wrote: > Good Evening, > > I am looking to implement a form of Navier-Stokes with SUPG Stabilization > and shock capturing using PETSc's FEM infrastructure. In this > implementation, I need access to the

Re: [petsc-users] Parallel DMPlex

2023-10-11 Thread erdemguer via petsc-users
Hi again, Here is my code: #include <petscdmplex.h> static char help[] = "dmplex"; int main(int argc, char **argv) { PetscCall(PetscInitialize(&argc, &argv, NULL, help)); DM dm, dm_dist; PetscSection section; PetscInt cStart, cEndInterior, cEnd, rank; PetscInt nc[3] = {3, 3, 3}; PetscReal upper[3] = {1, 1, 1};

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet
> On 11 Oct 2023, at 9:13 AM, Thanasis Boutsikakis > wrote: > > Very good catch Pierre, thanks a lot! > > This made everything work: the two-step process and the ptap(). I mistakenly > thought that I should not let the local number of columns be None, since > the matrix is only

[petsc-users] Compilation failure of PETSc with "The procedure name of the INTERFACE block conflicts with a name in the encompassing scoping unit"

2023-10-11 Thread Richter, Roland
Hei, following my last question I managed to configure PETSc with Intel MPI and Intel OneAPI using the following configure-line: ./configure --prefix=/media/storage/local_opt/petsc --with-scalar-type=complex --with-cc=mpiicc --with-cxx=mpiicpc --CPPFLAGS="-fPIC -march=native -mavx2"

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Very good catch Pierre, thanks a lot! This made everything work: the two-step process and the ptap(). I mistakenly thought that I should not let the local number of columns be None, since the matrix is only partitioned row-wise. Could you please explain what happened because of my setting

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet
That’s because: size = ((None, global_rows), (global_cols, global_cols)) should be: size = ((None, global_rows), (None, global_cols)) Then, it will work. $ ~/repo/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 python3.12 test.py && echo $? 0 Thanks, Pierre > On 11 Oct 2023, at 8:58 
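
For comparison, the C-level analogue of these petsc4py size tuples: each dimension is (local, global), and passing None corresponds to PETSC_DECIDE, letting PETSc choose a compatible local split. A sketch with illustrative names:

#include <petscmat.h>

static PetscErrorCode CreateTallSkinnyDense(MPI_Comm comm, PetscInt global_rows, PetscInt global_cols, Mat *Phi)
{
  PetscFunctionBeginUser;
  PetscCall(MatCreate(comm, Phi));
  /* Equivalent of size = ((None, global_rows), (None, global_cols)) */
  PetscCall(MatSetSizes(*Phi, PETSC_DECIDE, PETSC_DECIDE, global_rows, global_cols));
  PetscCall(MatSetType(*Phi, MATDENSE));
  PetscCall(MatSetUp(*Phi));
  PetscFunctionReturn(PETSC_SUCCESS);
}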

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Furthermore, I tried to perform the Galerkin projection in two steps by substituting > A_prime = A.ptap(Phi) with AL = Phi.transposeMatMult(A) A_prime = AL.matMult(Phi) And running this with 3 procs results in the erroneous creation of a matrix AL that has dimensions 3 times bigger than it
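
At the C level the two variants correspond to MatPtAP() versus MatTransposeMatMult() followed by MatMatMult(), which is what petsc4py's ptap(), transposeMatMult(), and matMult() wrap; both should yield A' = Phi^T A Phi with identical global dimensions. A sketch for comparison:

#include <petscmat.h>

static PetscErrorCode ProjectBothWays(Mat A, Mat Phi, Mat *Aprime1, Mat *Aprime2)
{
  Mat AL;

  PetscFunctionBeginUser;
  /* One call: A' = Phi^T A Phi */
  PetscCall(MatPtAP(A, Phi, MAT_INITIAL_MATRIX, PETSC_DEFAULT, Aprime1));
  /* Two steps: AL = Phi^T A, then A' = AL Phi */
  PetscCall(MatTransposeMatMult(Phi, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &AL));
  PetscCall(MatMatMult(AL, Phi, MAT_INITIAL_MATRIX, PETSC_DEFAULT, Aprime2));
  PetscCall(MatDestroy(&AL));
  PetscFunctionReturn(PETSC_SUCCESS);
}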

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Pierre, I see your point, but my experiment shows that it does not even run due to size mismatch, so I don’t see how being sparse would change things here. There must be some kind of problem with the parallel ptap(), because it does run sequentially. In order to test that, I changed the flags

Re: [petsc-users] Configuration of PETSc with Intel OneAPI and Intel MPI fails

2023-10-11 Thread Richter, Roland
Hei, Thank you very much for the answer! I looked it up, but petsc.org seems to be a bit unstable here; quite often I can't reach petsc.org. Regards, Roland Richter -Original Message- From: Satish Balay Sent: Monday, 9 October 2023 17:29 To: Barry Smith Cc: Richter, Roland ;

Re: [petsc-users] SLEPc/NEP for shell matrice T(lambda) and T'(lambda)

2023-10-11 Thread Jose E. Roman
Kenneth, The MatDuplicate issue should be fixed in the following MR https://gitlab.com/petsc/petsc/-/merge_requests/6912 Note that the NLEIGS solver internally uses MatDuplicate for creating multiple copies of the shell matrix, each one with its own value of lambda. Hence your implementation
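
A sketch of what supporting MatDuplicate() on a shell matrix can look like, assuming the context only holds lambda and that the remaining operations (MATOP_MULT, MATOP_DESTROY, ...) are re-registered on the copy; the struct and function names are hypothetical:

#include <petscmat.h>

typedef struct {
  PetscScalar lambda; /* parameter owned by this copy of T(lambda) */
} ShellCtx;

static PetscErrorCode MyShellDuplicate(Mat A, MatDuplicateOption op, Mat *B)
{
  ShellCtx *ctx, *newctx;
  PetscInt  m, n, M, N;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  PetscCall(PetscNew(&newctx));
  newctx->lambda = ctx->lambda; /* each copy carries its own lambda, per the NLEIGS usage above */
  PetscCall(MatGetLocalSize(A, &m, &n));
  PetscCall(MatGetSize(A, &M, &N));
  PetscCall(MatCreateShell(PetscObjectComm((PetscObject)A), m, n, M, N, newctx, B));
  PetscCall(MatShellSetOperation(*B, MATOP_DUPLICATE, (void (*)(void))MyShellDuplicate));
  /* ... re-register MATOP_MULT etc., and free newctx in a MATOP_DESTROY callback ... */
  PetscFunctionReturn(PETSC_SUCCESS);
}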