Thank you for this suggestion. I tried to implement it, but it has
proven pretty hard to provide MATOP_GET_DIAGONAL without completely
tanking performance. After all, B is a shell matrix for a reason: it
looks like M + R^H P M P R, with R itself a shell matrix.
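
In petsc4py terms, the structure is roughly the following (simplified,
untested sketch; M, P and R are the matrices from the formula above, and
the getDiagonal stub is deliberately left incomplete to show where the
difficulty lies):

from petsc4py import PETSc

class BContext:
    """Shell context for B = M + R^H P M P R."""
    def __init__(self, M, P, R):
        self.M, self.P, self.R = M, P, R
        self.w1 = M.createVecLeft()   # scratch vectors
        self.w2 = M.createVecLeft()

    def mult(self, mat, x, y):
        # y = (M + R^H P M P R) x, applied one factor at a time
        self.R.mult(x, self.w1)            # w1 = R x
        self.P.mult(self.w1, self.w2)      # w2 = P R x
        self.M.mult(self.w2, self.w1)      # w1 = M P R x
        self.P.mult(self.w1, self.w2)      # w2 = P M P R x
        self.R.multHermitian(self.w2, y)   # y  = R^H P M P R x
        self.M.multAdd(x, y, y)            # y += M x

    def getDiagonal(self, mat, d):
        # diag(M) is cheap, but diag(R^H P M P R) has no cheap expression
        # when R is itself matrix-free -- this is where the cost explodes.
        self.M.getDiagonal(d)

B = PETSc.Mat().createPython(M.getSizes(), BContext(M, P, R), comm=M.getComm())
B.setUp()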

Allow me to point out that I have no shift; my eigenvalue problem is
purely about finding the largest eigenvalues. Sections 8.2 and 3.4.3
led me to think that there was a way to avoid computing B^-1 (or
writing a shell matrix for it)... But you seem to stress that there's
no way around it.
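
If so, I suppose the best I can do is keep that solve iterative, along
these lines (untested slepc4py sketch; A and B stand for my shell
matrices):

from petsc4py import PETSc
from slepc4py import SLEPc

eps = SLEPc.EPS().create(comm=A.getComm())
eps.setOperators(A, B)                                   # both are shell Mats
eps.setProblemType(SLEPc.EPS.ProblemType.GHEP)
eps.setWhichEigenpairs(SLEPc.EPS.Which.LARGEST_MAGNITUDE)

# No shift: the default STSHIFT with sigma = 0 still applies B^{-1}, but it
# can do so iteratively instead of through an explicit LU factorization.
ksp = eps.getST().getKSP()
ksp.setType(PETSc.KSP.Type.CG)             # B is Hermitian positive-definite
ksp.getPC().setType(PETSc.PC.Type.NONE)    # or JACOBI once getDiagonal exists
ksp.setTolerances(rtol=1e-8, max_it=1000)

eps.solve()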

Quentin



On Mon, 17 Jul 2023 at 11:56, Jose E. Roman <jro...@dsic.upv.es> wrote:
>
> The B-inner product is independent of the ST operator. See Table 3.2. In 
> generalized eigenproblems you always have an inverse.
>
> If your matrix is diagonally dominant, try implementing the 
> MATOP_GET_DIAGONAL operation and using PCJACOBI. Apart from this, you have to 
> build your own preconditioner.
>
> Jose
>
>
> > On 17 Jul 2023, at 11:48, Quentin Chevalier 
> > > <quentin.cheval...@polytechnique.edu> wrote:
> >
> > Hello Jose,
> >
> > I guess I expected B not to be inverted but instead used as a mass
> > matrix for a problem-specific inner product, since I specified GHEP as
> > the problem type. p. 50 of the same user manual seems to imply that
> > would indeed be the case. I don't see what problem there would be with
> > using a shell B matrix as a weighting matrix, as long as the
> > matrix-vector product is provided, of course.
> >
> > I tried the first approach: I set up my KSP as CG, since B is Hermitian
> > positive-definite (I made a mistake in my first email), but I'm getting a
> > "KSPSolve has not converged, reason DIVERGED_ITS" error. I'm already
> > letting it run for 1000 iterations, so it seems suspiciously slow for a
> > CG solver.
> >
> > I'm grappling with a shell preconditioner now to try to speed it up,
> > but I'm unsure which PC types allow for shell matrices.
> >
> > Thank you for your time,
> >
> > Quentin
> >
> >
> > On Wed, 12 Jul 2023 at 19:24, Jose E. Roman <jro...@dsic.upv.es> wrote:
> > >
> > > By default, it is solving the problem as B^{-1}*A*x=lambda*x (see chapter 
> > > on Spectral Transformation). That is why A can be a shell matrix without 
> > > problem. But B needs to be an explicit matrix in order to compute an LU 
> > > factorization. If B is also a shell matrix then you should set an 
> > > iterative solver for the associated KSP (see examples in the chapter).
> > >
> > > An alternative is to create a shell matrix M that computes the action of 
> > > B^{-1}*A, then pass M to the EPS solver as a standard eigenproblem.
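> > >
> > > In petsc4py/slepc4py terms, something along these lines (untested
> > > sketch; A and B are your existing matrices):
> > >
> > > from petsc4py import PETSc
> > > from slepc4py import SLEPc
> > >
> > > class BinvA:
> > >     """Shell context applying y = B^{-1} A x through an inner KSP."""
> > >     def __init__(self, A, B):
> > >         self.A = A
> > >         self.w = A.createVecLeft()
> > >         self.ksp = PETSc.KSP().create(A.getComm())
> > >         self.ksp.setOperators(B)
> > >         self.ksp.setType(PETSc.KSP.Type.CG)   # assuming B Hermitian positive
> > >
> > >     def mult(self, mat, x, y):
> > >         self.A.mult(x, self.w)                # w = A x
> > >         self.ksp.solve(self.w, y)             # y = B^{-1} w
> > >
> > > M = PETSc.Mat().createPython(A.getSizes(), BinvA(A, B), comm=A.getComm())
> > > M.setUp()
> > >
> > > eps = SLEPc.EPS().create(comm=A.getComm())
> > > eps.setOperators(M)                           # standard eigenproblem now
> > > eps.setProblemType(SLEPc.EPS.ProblemType.NHEP)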
> > >
> > > Jose
> > >
> > >
> > > > On 12 Jul 2023, at 19:04, Quentin Chevalier 
> > > > <quentin.cheval...@polytechnique.edu> wrote:
> > > >
> > > > Hello PETSc Users,
> > > >
> > > > I have a generalised eigenvalue problem: A x = lambda B x.
> > > > Until now only A was matrix-free; I used MUMPS and an LU
> > > > preconditioner, and everything worked fine.
> > > >
> > > > Now B is matrix-free as well, and my solver is returning an error:
> > > > "MatSolverType mumps does not support matrix type python", which is
> > > > ironic given it seems to handle A quite fine.
> > > >
> > > > I have read in the user manual that some solvers may require
> > > > additional operations to be supplied for B, like MATOP_GET_DIAGONAL,
> > > > but it's unclear to me exactly what I should be implementing and
> > > > what the best solver is for my case.
> > > >
> > > > A is Hermitian; B is Hermitian and positive, but not positive-definite
> > > > or real. Therefore I have specified the GHEP problem type on the EPS
> > > > object.
> > > >
> > > > I use PETSc in complex mode through the petsc4py bridge.
> > > >
> > > > Any help on how to get EPS to work for a generalised matrix-free
> > > > case would be welcome. Performance is not a key issue here - I have
> > > > a tractable, high-value case on hand.
> > > >
> > > > Thank you for your time,
> > > >
> > > > Quentin
> > >
>
