[ 
https://issues.apache.org/jira/browse/MATH-1656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17741627#comment-17741627
 ] 

Gilles Sadowski commented on MATH-1656:
---------------------------------------

bq.  *{{DebugMode}}* [...] intermediate iteration results

I understand the purpose but I don't think that hijacking 
{{OptimizationData}} is the right way to go.
Also, the {{GradientLikeOptimizer}} classes are not the only ones that could use 
some help when debugging.

I believe that the usual way is logging.  Originally, CM avoided it (as it did 
any and all external dependencies), but we are not there anymore.  It would make 
sense (?) to depend on the 
[Log4J2|https://logging.apache.org/log4j/2.x/manual/api-separation.html] API.
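For illustration, a minimal sketch of what that could look like inside an 
optimizer's iteration loop, using only the Log4j2 API (the optimizer class and 
method names below are hypothetical, not existing CM code):

{code:java}
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

import java.util.Arrays;

// Hypothetical optimizer fragment, for illustration only.
public class GradientDescentSketch {
    private static final Logger LOG = LogManager.getLogger(GradientDescentSketch.class);

    public double[] optimize(double[] start) {
        double[] point = start.clone();
        for (int iter = 0; iter < 100; iter++) {
            final double value = objective(point);
            // Intermediate iteration results go to whatever logging backend the
            // user has configured, instead of being funnelled through OptimizationData.
            LOG.debug("iteration {}: point={}, objective={}",
                      iter, Arrays.toString(point), value);
            // ... update "point" from the gradient here ...
        }
        return point;
    }

    // Placeholder objective function (sum of squares).
    private double objective(double[] p) {
        double s = 0;
        for (double x : p) {
            s += x * x;
        }
        return s;
    }
}
{code}

Only the {{log4j-api}} artifact would be a dependency; users pick (or omit) a 
backend, so there is no cost for those who don't want the debug output.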



> Classical multivariate optimizers (gradient descent, Raphson-Newton, BFGS) 
> are missing
> --------------------------------------------------------------------------------------
>
>                 Key: MATH-1656
>                 URL: https://issues.apache.org/jira/browse/MATH-1656
>             Project: Commons Math
>          Issue Type: Wish
>          Components: legacy
>    Affects Versions: 4.0-beta1
>            Reporter: François Laferrière
>            Priority: Major
>              Labels: features
>         Attachments: MATH-1656-GradientDescent-Newton-BFGS-v2.0.zip, 
> MATH-1658-GradientDescent-Newton-BFGS.patch, Screenshot from 2023-07-10 
> 12-13-38.png
>
>
> Some classical multivariate optimizers such as
>  * gradient descent,
>  * Raphson-Newton,
>  * BFGS
> are missing.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
