Dear PETSc/Tao team,

It seems that there is a bug in the TaoALMM class:

In the methods TaoALMMSubsolverObjective_Private and TaoALMMSubsolverObjectiveAndGradient_Private, the vector at which the augmented Lagrangian is evaluated is copied into the current solution, see, e.g., https://petsc.org/release/src/tao/constrained/impls/almm/almm.c.html line 672 or 682.  This causes the subsolver to fail to converge whenever its line search rejects the step length 1. for some update.  In detail:

Suppose the current iterate is xk and the current update is dxk.  The line search first evaluates the augmented Lagrangian at (xk + dxk).  Because of the copy described above, the value (xk + dxk) overwrites the current solution.  If the point (xk + dxk) is rejected, the line search should next try the point (xk + alpha * dxk) with alpha < 1.  But due to the copying, the point that is actually evaluated is ((xk + dxk) + alpha * dxk), see, e.g., https://petsc.org/release/src/tao/linesearch/impls/armijo/armijo.c.html line 191.
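
To make the effect concrete, here is a small standalone C program (illustrative only, not PETSc code; the names x_stored, objective_buggy, and backtrack are invented for this sketch).  It mimics the interaction: the objective callback overwrites the stored current solution with the evaluation point, and the backtracking loop builds its trial points from that stored solution, so after the first rejected step it no longer evaluates xk + alpha * dxk.

#include <stdio.h>

/* Illustrative standalone example (not PETSc code).  x_stored plays the role
   of the subsolver's current solution; the "buggy" objective also overwrites
   it with the evaluation point, mimicking the copy in the ALMM callbacks. */

static double x_stored;

static double objective_buggy(double p)
{
  x_stored = p;      /* the problematic copy: trial point overwrites the solution */
  return p * p;
}

static double objective_correct(double p)
{
  return p * p;      /* evaluate only; the stored solution is left untouched */
}

/* Backtracking loop: trial points are formed from the stored current solution,
   as the Armijo line search does with the subsolver's solution vector. */
static void backtrack(double (*obj)(double), double x0, double d, const char *label)
{
  double alpha;
  x_stored = x0;
  for (alpha = 1.0; alpha >= 0.25; alpha *= 0.5) {
    double trial = x_stored + alpha * d;   /* intended point: x0 + alpha * d */
    printf("%s: alpha = %.2f evaluates at %+.3f (intended %+.3f)\n",
           label, alpha, trial, x0 + alpha * d);
    (void)obj(trial);   /* pretend the step is rejected and continue backtracking */
  }
}

int main(void)
{
  backtrack(objective_buggy,   1.0, -2.0, "buggy  ");
  backtrack(objective_correct, 1.0, -2.0, "correct");
  return 0;
}

With x0 = 1 and d = -2, the buggy variant evaluates at -2 and -2.5 for alpha = 0.5 and 0.25, where it should evaluate at 0 and 0.5, which is exactly the ((xk + dxk) + alpha * dxk) pattern described above.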

Best regards
Stephan Köhler

--
Stephan Köhler
TU Bergakademie Freiberg
Institut für numerische Mathematik und Optimierung

Akademiestraße 6
09599 Freiberg
Building Mittelbau, Room 2.07

Phone: +49 (0)3731 39-3173 (office)
