Marcel,

   Would you like to do the following: compute Q A P, where Q is a row permutation and P a column permutation, and then apply LSQR to Q A P?

    From the manual page: 

In exact arithmetic the LSQR method (with no preconditioning) is identical to 
the KSPCG algorithm applied to the normal equations.

   [Q A P]' [Q A P] = P' A' Q' Q A P = P' (A'A) P; the Q drops out because a 
permutation matrix's transpose is its inverse, so Q' Q = I.
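
   As a quick sanity check, here is a tiny dense numpy sketch (just random matrices, not PETSc) verifying that the Q drops out:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 4))        # rectangular, like your 5N x N
    Q = np.eye(10)[rng.permutation(10)]     # row permutation matrix
    P = np.eye(4)[rng.permutation(4)]       # column permutation matrix

    lhs = (Q @ A @ P).T @ (Q @ A @ P)
    rhs = P.T @ (A.T @ A) @ P
    print(np.allclose(lhs, rhs))            # True, since Q' Q = I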

 Note that P is square; its dimension is the (smaller) number of columns of A. 

  So my conclusion is that any column permutation of A is also a symmetric 
permutation of A'A, so you can just try the regular reorderings of A'A if 
you want to "concentrate" the "important" parts of A'A into your block-diagonal 
preconditioner (and throw away the other parts).

  I don't know what it will do to the convergence. I've never had much luck 
generically trying to symmetrically reorder matrices to improve preconditioners, 
but in certain situations it might help. For example, if the matrix is [0 1; 
1 0] and you permute it, you get [1 0; 0 1], which looks better.

  There is this https://epubs.siam.org/doi/10.1137/S1064827599361308, but it is 
for non-symmetric permutations, and in your case, if you use a non-symmetric 
permutation, you can no longer use LSQR. 

  Barry




> On Oct 22, 2020, at 4:55 AM, Matthew Knepley <[email protected]> wrote:
> 
> On Thu, Oct 22, 2020 at 4:24 AM Marcel Huysegoms <[email protected]> wrote:
> Hi all,
> 
> I'm currently implementing a Gauss-Newton approach for minimizing a
> non-linear cost function using PETSc4py.
> The (rectangular) linear systems I am trying to solve have dimensions of
> about (5N, N), where N is in the range of several hundred millions.
> 
> Due to its size and because it's an over-determined system, I use LSQR
> in conjunction with a preconditioner (which operates on A^T x A, e.g.
> BJacobi).
> Depending on the ordering of the unknowns the algorithm only converges
> for special cases. When I use a direct LU solver (as preconditioner) it
> consistently converges, but consumes too much memory. I have read in the
> manual that the LU solver internally also applies a matrix reordering
> beforehand.
> 
> My question would be:
> How can I improve the ordering of the unknowns for a rectangular matrix
> (in order to converge also with iterative preconditioners)? If I use
> MatGetOrdering(), it only works for square matrices. Is there a way to
> achieve this from within PETSc4py?
> ParMETIS seems to be a promising framework for that task. Is it possible
> to apply its reordering algorithm to a rectangular PETSc-matrix?
> 
> I would be thankful for every bit of advice that might help.
> 
> We do not have any rectangular reordering algorithms. I think your first step 
> is to
> find something in the literature that you think will work.
> 
>   Thanks,
> 
>      Matt
>  
> Best regards,
> Marcel
> 
> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/
