[petsc-users] Help building Zoltan with gcc11

2022-09-21 Thread Lucas Banting
Hello all, I am having a problem building Zoltan with "--download-Zoltan" using gcc11 (the default version on Ubuntu 22.04). I have opened an issue on the PETSc GitLab: https://gitlab.com/petsc/petsc/-/issues/1248 which also has my configure.log. The error is: Error: Type mismatch between

[petsc-users] PCApplySymmetricRight for PCBJACOBI

2022-09-21 Thread Abylay Zhumekenov
Hello, I have been playing around with a block Jacobi preconditioner (PCBJACOBI) with an incomplete Cholesky (PCICC) sub-preconditioner. Although you cannot set KSPSetPCSide to PC_SYMMETRIC for PCBJACOBI, you can still do it for PCICC. I was surprised to find that PCApplySymmetricLeft is properly
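
A minimal sketch of this kind of experiment, assuming a small SPD test matrix, PCBJACOBI as the outer preconditioner, and ICC selected on the blocks via -sub_pc_type icc (whether PCApplySymmetricRight behaves correctly for PCBJACOBI is exactly the question raised here):

/* Sketch (not from the original message): build a small SPD matrix, set up
   PCBJACOBI, and apply the two symmetric halves of the preconditioner by hand. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      x, y, z;
  KSP      ksp;
  PC       pc;
  PetscInt i, rstart, rend, n = 32;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* 1D Laplacian tridiag(-1, 2, -1): symmetric positive definite */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 3, NULL, 1, NULL, &A));
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    if (i > 0) PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &y));
  PetscCall(VecDuplicate(x, &z));
  PetscCall(VecSet(x, 1.0));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCBJACOBI)); /* outer block Jacobi */
  PetscCall(KSPSetFromOptions(ksp));   /* run with -sub_pc_type icc */
  PetscCall(KSPSetUp(ksp));            /* builds the sub-KSPs and their ICC PCs */

  /* Apply the two halves of M^{-1} = L^{-T} L^{-1} separately */
  PetscCall(PCApplySymmetricLeft(pc, x, y));
  PetscCall(PCApplySymmetricRight(pc, y, z));

  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&y));
  PetscCall(VecDestroy(&z));
  PetscCall(MatDestroy(&A));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}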

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread Matthew Knepley
On Wed, Sep 21, 2022 at 10:35 AM feng wang wrote: > Hi Jose, > > For your 2nd suggestion on halo exchange, I get the idea and roughly know > how to do it, but there are some implementation details which I am not > quite sure about. > > If I understand it correctly, in MatMult(Mat m, Vec x, Vec y), Vec

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread feng wang
Hi Jose, For your 2nd suggestion on halo exchange, I get the idea and roughly know how to do it, but there are some implementation details which I am not quite sure about. If I understand it correctly, in MatMult(Mat m, Vec x, Vec y), Vec x is a normal parallel vector and it does not contain halo
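
One common pattern for this (a sketch under assumptions, not code from the thread; ShellCtx, xghost and MyMatMult are illustrative names) is to keep a ghosted work vector in the shell-matrix context, copy the incoming parallel x into it, and update its halo entries before the local matrix-free product:

#include <petscmat.h>

typedef struct {
  Vec xghost; /* created once with VecCreateGhost(); same parallel layout as x */
  /* ... application data (mesh, connectivity, ...) ... */
} ShellCtx;

static PetscErrorCode MyMatMult(Mat M, Vec x, Vec y)
{
  ShellCtx *ctx;
  Vec       xloc;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(M, &ctx));

  /* x carries no halo entries: copy its owned part into the ghost vector ... */
  PetscCall(VecCopy(x, ctx->xghost));
  /* ... and fill the halo entries from the owning ranks */
  PetscCall(VecGhostUpdateBegin(ctx->xghost, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(ctx->xghost, INSERT_VALUES, SCATTER_FORWARD));

  /* The local form holds the owned entries followed by the halo entries */
  PetscCall(VecGhostGetLocalForm(ctx->xghost, &xloc));
  /* ... compute the owned part of y from xloc here ... */
  PetscCall(VecGhostRestoreLocalForm(ctx->xghost, &xloc));
  PetscFunctionReturn(0);
}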

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread Jose E. Roman
> On 21 Sept 2022, at 14:47, feng wang wrote: > > Thanks Jose, I will try this and will come back to this thread if I have any > issue. > > Besides, for EPSGetEigenpair, I guess each rank gets its portion of the > eigenvector, and I need to put them together afterwards? Eigenvectors
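
A hedged sketch for the EPSGetEigenpair part: each rank receives its own rows of the eigenvector in a parallel Vec, and the scatter-to-all below is an assumption on my part, only needed if a full copy on every rank is really required (GetFullEigenvector is an illustrative name):

#include <slepceps.h>

static PetscErrorCode GetFullEigenvector(EPS eps, Mat A, PetscInt i, Vec *xall)
{
  Vec         xr, xi;
  PetscScalar kr, ki;
  VecScatter  scat;

  PetscFunctionBeginUser;
  PetscCall(MatCreateVecs(A, &xr, NULL));
  PetscCall(MatCreateVecs(A, &xi, NULL));
  PetscCall(EPSGetEigenpair(eps, i, &kr, &ki, xr, xi)); /* xr holds this rank's rows */

  /* Gather the distributed eigenvector into a sequential Vec on every rank */
  PetscCall(VecScatterCreateToAll(xr, &scat, xall));
  PetscCall(VecScatterBegin(scat, xr, *xall, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(scat, xr, *xall, INSERT_VALUES, SCATTER_FORWARD));

  PetscCall(VecScatterDestroy(&scat));
  PetscCall(VecDestroy(&xr));
  PetscCall(VecDestroy(&xi));
  PetscFunctionReturn(0);
}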

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread feng wang
Thanks Jose, I will try this and will come back to this thread if I have any issue. Besides, for EPSGetEigenpair, I guess each rank gets its portion of the eigenvector, and I need to put them together afterwards? Thanks, Feng From: Jose E. Roman Sent: 21

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread Jose E. Roman
If you define the MATOP_CREATE_VECS operation in your shell matrix so that it creates a ghost vector, then all vectors within EPS will be ghost vectors, including those that are received as arguments of MatMult(). Not sure if this will work. A simpler solution is that you store a ghost vector
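
A sketch of the first suggestion (hedged, since the message itself is not sure it will work; ShellCtx, MyCreateVecs and the partitioning fields are illustrative names): override MATOP_CREATE_VECS so the vectors EPS creates are ghost vectors.

#include <petscmat.h>

typedef struct {
  PetscInt        nlocal;   /* owned unknowns on this rank */
  PetscInt        nghost;   /* halo unknowns */
  const PetscInt *ghostidx; /* global indices of the halo unknowns */
} ShellCtx;

static PetscErrorCode MyCreateVecs(Mat M, Vec *right, Vec *left)
{
  ShellCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(M, &ctx));
  if (right) PetscCall(VecCreateGhost(PetscObjectComm((PetscObject)M), ctx->nlocal, PETSC_DETERMINE, ctx->nghost, ctx->ghostidx, right));
  if (left) PetscCall(VecCreateGhost(PetscObjectComm((PetscObject)M), ctx->nlocal, PETSC_DETERMINE, ctx->nghost, ctx->ghostidx, left));
  PetscFunctionReturn(0);
}

/* Registered on the shell matrix with:
   MatShellSetOperation(A, MATOP_CREATE_VECS, (void (*)(void))MyCreateVecs); */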

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread feng wang
Thanks for your reply. For GMRES, I create a ghost vector and give it to KSPSolve. For Slepc, it only takes the shell matrix through EPSSetOperators. Suppose the shell matrix of the eigensolver defines MatMult(Mat m, Vec x, Vec y), how does it know Vec x is a ghost vector and how many ghost cells

Re: [petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread Matthew Knepley
On Wed, Sep 21, 2022 at 7:41 AM feng wang wrote: > Hello, > > I am using Slepc with a shell matrix. The sequential version seems to be working > and now I am trying to make it run in parallel. > > The partition of the domain is done, but I am not sure how to do the halo > exchange in the shell matrix in

[petsc-users] Slepc, shell matrix, parallel, halo exchange

2022-09-21 Thread feng wang
Hello, I am using Slepc with a shell matrix. The sequential version seems to be working and now I am trying to make it run in parallel. The partition of the domain is done, but I am not sure how to do the halo exchange in the shell matrix in Slepc. I have a parallel version of matrix-free GMRES in my

Re: [petsc-users] Problem solving Ax=b with rectangular matrix A

2022-09-21 Thread Pierre Jolivet
Yes, but you need to use a KSP that handles a rectangular Mat, such as KSPLSQR (-ksp_type lsqr). PCLU does not handle a rectangular Pmat. The only PCs that handle a rectangular Pmat are PCQR and PCNONE. If you supply the normal equations as the Pmat for LSQR, then you can use a “standard” PC. You can have
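
A hedged sketch of this advice (SolveRectangular is an illustrative name; forming A^T A explicitly with MatTransposeMatMult is only one way to provide a square Pmat, and may be wasteful for larger problems):

#include <petscksp.h>

static PetscErrorCode SolveRectangular(Mat A, Vec b, Vec x)
{
  KSP ksp;
  Mat AtA; /* normal-equations matrix used as the Pmat */

  PetscFunctionBeginUser;
  PetscCall(MatTransposeMatMult(A, A, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &AtA));

  PetscCall(KSPCreate(PetscObjectComm((PetscObject)A), &ksp));
  PetscCall(KSPSetType(ksp, KSPLSQR));     /* handles rectangular A */
  PetscCall(KSPSetOperators(ksp, A, AtA)); /* or pass A twice and use -pc_type none */
  PetscCall(KSPSetFromOptions(ksp));       /* e.g. -pc_type cholesky acts on the Pmat */
  PetscCall(KSPSolve(ksp, b, x));          /* b has 33 entries, x has 17 in the reported test */

  PetscCall(MatDestroy(&AtA));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(0);
}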

[petsc-users] Problem solving Ax=b with rectangular matrix A

2022-09-21 Thread fujisan
I'm trying to solve Ax=b with a sparse rectangular matrix A (of size 33x17 in my test) using the options '-ksp_type stcg -pc_type lu' on 1 or 2 CPUs. I always get an error saying "Incompatible vector local lengths" (see below). Here are the relevant lines of my code: program test ... !