[petsc-users] PETSc with Julia Binary Builder

2021-07-01 Thread Kozdon, Jeremy (CIV)
I have been talking with Boris Kaus and Patrick Sanan about trying to revive the Julia PETSc interface wrappers. One of the first things to get going is to use Julia's BinaryBuilder [1] to wrap more scalar, real, and int type builds of the PETSc library; the current distribution is just Real,

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Barry Smith
Sounds good. Yes, the outermost Jacobian can be shell (or even nest, but then I think you need to "build" it yourself; I don't think the DMDA will give back an appropriate nest matrix.) > On Jul 1, 2021, at 4:10 PM, Matteo Semplice wrote: > Thank you, Matthew and Barry! > I can now

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
Thank you, Matthew and Barry! I can now see a way forward. On 01/07/21 21:42, Barry Smith wrote: I do not understand how creating a DMDA with n0+n1 dofs will let me easily reuse my shell preconditioner code on the top-left block. PCFIELDSPLIT (and friends) do not order the dof by

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Barry Smith
> On Jul 1, 2021, at 11:44 AM, Matteo Semplice wrote: > On 01/07/21 17:52, Jed Brown wrote: >> I think ex28 is better organization of code. You can DMCreateMatrix() and then set types/preallocation for off-diagonal blocks of the MatNest. I think the comment is unclear and

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matthew Knepley
On Thu, Jul 1, 2021 at 11:44 AM Matteo Semplice <matteo.sempl...@uninsubria.it> wrote: > On 01/07/21 17:52, Jed Brown wrote: > > I think ex28 is better organization of code. You can DMCreateMatrix() and then set types/preallocation for off-diagonal blocks of the MatNest. I think the

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
On 01/07/21 17:52, Jed Brown wrote: I think ex28 is better organization of code. You can DMCreateMatrix() and then set types/preallocation for off-diagonal blocks of the MatNest. I think the comment is unclear and not quite what was intended and originally worked (which was to assemble

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Jed Brown
Matteo Semplice writes: > Hi. > We are designing a PETSc application that will employ a SNES solver on a multiphysics problem whose Jacobian will have a 2x2 block form, say A=[A00,A01;A10,A11]. We already have code for the top left block A_00 (a MatShell and a related Shell

Re: [petsc-users] Scatter parallel Vec to sequential Vec on non-zeroth process

2021-07-01 Thread Junchao Zhang
Peder, PETSCSF_PATTERN_ALLTOALL only supports MPI_Alltoall (not MPI_Alltoallv), and is only used by PETSc internally in a few places. I suggest you go with Matt's approach. After it solves your problem, you can distill an example to demo the communication pattern. Then we can see how to
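A minimal sketch of the general approach suggested in this thread, i.e. gathering an MPI Vec into a sequential Vec that lives on a non-zeroth rank via a plain VecScatter (rather than PETSCSF_PATTERN_ALLTOALL). The function name GatherToRank and the `target` parameter are hypothetical; the VecScatter calls themselves are the standard PETSc API.

```c
/* Sketch (hypothetical helper): copy the whole MPI Vec x (global size N) into
   a sequential Vec that lives only on rank "target", which need not be 0.
   Passing NULL as the "to" index set means all entries of the target Vec. */
#include <petscvec.h>

PetscErrorCode GatherToRank(Vec x, PetscMPIInt target, Vec *xseq)
{
  PetscMPIInt    rank;
  PetscInt       N, nlocal;
  IS             ix;
  VecScatter     sc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)x), &rank);CHKERRQ(ierr);
  ierr = VecGetSize(x, &N);CHKERRQ(ierr);
  nlocal = (rank == target) ? N : 0;              /* full copy only on target rank */
  ierr = VecCreateSeq(PETSC_COMM_SELF, nlocal, xseq);CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_SELF, nlocal, 0, 1, &ix);CHKERRQ(ierr);
  ierr = VecScatterCreate(x, ix, *xseq, NULL, &sc);CHKERRQ(ierr);
  ierr = VecScatterBegin(sc, x, *xseq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(sc, x, *xseq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = ISDestroy(&ix);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&sc);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Calling this in a loop with a different `target` per task index would reproduce the per-task scatter pattern described below, at the cost of one scatter per target.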

Re: [petsc-users] Scatter parallel Vec to sequential Vec on non-zeroth process

2021-07-01 Thread Jed Brown
Peder Jørgensgaard Olesen writes: > Each process is assigned an indexed subset of the tasks (the tasks are of constant size), and, for each task index, the relevant data is scattered as a SEQVEC to the process (this is done for all processes in each step, using an adaptation of the code

Re: [petsc-users] Scatter parallel Vec to sequential Vec on non-zeroth process

2021-07-01 Thread Peder Jørgensgaard Olesen via petsc-users
Dear Jed I'm not really sure what it is you're asking (that's on me, still a rookie in the field), but I'll try to describe what I've done: Each process is assigned an indexed subset of the tasks (the tasks are of constant size), and, for each task index, the relevant data is scattered as a

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Varun Hiremath
Thank you very much for these suggestions! We are currently using version 3.12, so I'll try to update to the latest version and try your suggestions. Let me get back to you, thanks! On Thu, Jul 1, 2021, 4:45 AM Jose E. Roman wrote: > Then I would try Davidson methods

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Jose E. Roman
Then I would try Davidson methods: https://doi.org/10.1145/2543696 You can also try Krylov-Schur with "inexact" shift-and-invert, for instance with preconditioned BiCGStab or GMRES; see section 3.4.1 of the users manual. In both cases, you have to pass matrix A in the call to EPSSetOperators()
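As a concrete illustration of the second suggestion, the command line for Krylov-Schur with inexact shift-and-invert might look like the following; the application name, target value, and the BiCGStab/block-Jacobi choice are illustrative placeholders, not a recommendation from the thread.

```shell
# Illustrative options: Krylov-Schur with "inexact" shift-and-invert,
# applying the inverse iteratively with preconditioned BiCGStab.
./myapp -eps_type krylovschur -eps_target 0.0 \
        -st_type sinvert \
        -st_ksp_type bcgs -st_pc_type bjacobi -st_ksp_rtol 1e-9
```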

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Varun Hiremath
Thanks. I actually do have a 1st-order approximation of matrix A that I can explicitly compute and also invert. Can I use that matrix as a preconditioner to speed things up? Is there some example that explains how to set up and call SLEPc for this scenario? On Thu, Jul 1, 2021, 4:29 AM Jose E.

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Jose E. Roman
For smallest real parts one could adapt ex36.c, but it is going to be costly: https://slepc.upv.es/documentation/current/src/eps/tutorials/ex36.c.html Also, if eigenvalues are clustered around the origin, convergence may still be very slow. It is a tough problem, unless you are able to compute a

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Varun Hiremath
I'm solving for the smallest eigenvalues in magnitude. Though, would it be cheaper to solve for the smallest real part, as that might also work in my case? Thanks for your help. On Thu, Jul 1, 2021, 4:08 AM Jose E. Roman wrote: > Smallest eigenvalue in magnitude or real part? > On 1 Jul 2021, at

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Jose E. Roman
Smallest eigenvalue in magnitude or real part? > On 1 Jul 2021, at 11:58, Varun Hiremath wrote: > Sorry, no, both A and B are general sparse matrices (non-Hermitian). So is there anything else I could try? > On Thu, Jul 1, 2021 at 2:43 AM Jose E. Roman wrote: > Is the problem

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Varun Hiremath
Sorry, no, both A and B are general sparse matrices (non-Hermitian). So is there anything else I could try? On Thu, Jul 1, 2021 at 2:43 AM Jose E. Roman wrote: > Is the problem symmetric (GHEP)? In that case, you can try LOBPCG on the pair (A,B). But this will likely be slow as well, unless

Re: [petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Jose E. Roman
Is the problem symmetric (GHEP)? In that case, you can try LOBPCG on the pair (A,B). But this will likely be slow as well, unless you can provide a good preconditioner. Jose > On 1 Jul 2021, at 11:37, Varun Hiremath wrote: > Hi All, > I am trying to compute the smallest
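For reference, an invocation of this LOBPCG suggestion might look like the following; it assumes a symmetric-definite pair, and the GAMG preconditioner choice is only a placeholder.

```shell
# Illustrative: LOBPCG on the symmetric pair (A,B), with a preconditioner
# supplied through SLEPc's spectral-transformation (ST) object.
./myapp -eps_type lobpcg -eps_gen_hermitian -eps_smallest_real \
        -st_type precond -st_pc_type gamg
```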

[petsc-users] SLEPc: smallest eigenvalues

2021-07-01 Thread Varun Hiremath
Hi All, I am trying to compute the smallest eigenvalues of a generalized system A*x = lambda*B*x. I don't explicitly know the matrix A (so I am using a shell matrix with a custom matmult function); however, the matrix B is explicitly known, so I compute inv(B)*A within the shell matrix and solve
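A minimal sketch of the setup described in this message, assuming SLEPc in C: the operator S = inv(B)*A is applied matrix-free and passed to EPS as a standard (non-generalized) problem. The names ShellCtx, ShellMult, and UserApplyA are hypothetical placeholders; the MatShell and EPS calls are the standard PETSc/SLEPc API.

```c
/* Sketch (hypothetical helper names): solve S*x = lambda*x with S = inv(B)*A,
   asking SLEPc for the smallest-magnitude eigenvalues. UserApplyA applies the
   matrix-free operator A; ctx->kspB is a KSP set up to solve with B. */
#include <slepceps.h>

typedef struct { KSP kspB; Vec work; } ShellCtx;

extern PetscErrorCode UserApplyA(Vec x, Vec y); /* user-provided: y = A*x */

static PetscErrorCode ShellMult(Mat S, Vec x, Vec y)
{
  ShellCtx      *ctx;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatShellGetContext(S, &ctx);CHKERRQ(ierr);
  ierr = UserApplyA(x, ctx->work);CHKERRQ(ierr);           /* work = A*x       */
  ierr = KSPSolve(ctx->kspB, ctx->work, y);CHKERRQ(ierr);  /* y = inv(B)*work  */
  PetscFunctionReturn(0);
}

/* After creating the shell matrix S of global size N:
     MatCreateShell(comm, PETSC_DECIDE, PETSC_DECIDE, N, N, &ctx, &S);
     MatShellSetOperation(S, MATOP_MULT, (void (*)(void))ShellMult);
     EPSCreate(comm, &eps);
     EPSSetOperators(eps, S, NULL);          -- standard, not generalized
     EPSSetProblemType(eps, EPS_NHEP);
     EPSSetWhichEigenpairs(eps, EPS_SMALLEST_MAGNITUDE);
     EPSSetFromOptions(eps); EPSSolve(eps);                                   */
```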

[petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
Hi. We are designing a PETSc application that will employ a SNES solver on a multiphysics problem whose Jacobian will have a 2x2 block form, say A=[A00,A01;A10,A11]. We already have code for the top left block A_00 (a MatShell and a related Shell preconditioner) that we wish to reuse. We
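A minimal sketch of the 2x2 structure asked about here, assuming PETSc in C: a MatNest whose (0,0) block is a MatShell and whose other blocks are assembled AIJ matrices. The function name, block sizes n0/n1, the context pointer, MyA00Mult, and the preallocation numbers are all hypothetical placeholders; the Mat calls are the standard API.

```c
/* Sketch: build a 2x2 MatNest [A00 A01; A10 A11] whose (0,0) block is the
   existing matrix-free MatShell, for later use as a SNES Jacobian. */
#include <petscmat.h>

extern PetscErrorCode MyA00Mult(Mat A, Vec x, Vec y); /* user matrix-free mult */

PetscErrorCode BuildJacobian(MPI_Comm comm, PetscInt n0, PetscInt n1, void *ctx, Mat *J)
{
  Mat            A00, A01, A10, A11, blocks[4];
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* (0,0): the existing matrix-free block */
  ierr = MatCreateShell(comm, PETSC_DECIDE, PETSC_DECIDE, n0, n0, ctx, &A00);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A00, MATOP_MULT, (void (*)(void))MyA00Mult);CHKERRQ(ierr);
  /* remaining blocks as assembled AIJ matrices (preallocation is a placeholder) */
  ierr = MatCreateAIJ(comm, PETSC_DECIDE, PETSC_DECIDE, n0, n1, 5, NULL, 5, NULL, &A01);CHKERRQ(ierr);
  ierr = MatCreateAIJ(comm, PETSC_DECIDE, PETSC_DECIDE, n1, n0, 5, NULL, 5, NULL, &A10);CHKERRQ(ierr);
  ierr = MatCreateAIJ(comm, PETSC_DECIDE, PETSC_DECIDE, n1, n1, 5, NULL, 5, NULL, &A11);CHKERRQ(ierr);
  /* blocks are given row-major; NULL index sets mean contiguous row/col blocks */
  blocks[0] = A00; blocks[1] = A01; blocks[2] = A10; blocks[3] = A11;
  ierr = MatCreateNest(comm, 2, NULL, 2, NULL, blocks, J);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

Note that, as discussed later in the thread, the blocks can also be extracted from a DMCreateMatrix() result rather than created by hand.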