[ https://issues.apache.org/jira/browse/MATH-1656?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17741534#comment-17741534 ]

François Laferrière commented on MATH-1656:
-------------------------------------------

Please find attached a new patch fixing those issues:

[^0001-MATH-1658-GradientDescent-Newton-BFGS-v2.0.patch]
{quote}1. The default build fails. Log excerpt: 
{quote}
The problematic code (dealing with the old {{SecurityManager}} API) has been removed.
{quote}Files location ...
{quote}
Source code, test code, and test resources have been moved to the proper packages.
{quote}Strange (spurious ?) files in resources directories: 
 * stdout.txt
 * stderr.txt
 * status.txt{quote}
Those are test reference files, generated when using DiffTest (see *Tests* below); a minimal capture sketch follows this list:
 * *stdout.txt*: whatever is written to System.out during test execution is redirected to this file.
 * *stderr.txt*: whatever is written to System.err during test execution is redirected to this file.
 * *status.txt*: has been removed.
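
For context, redirecting System.out to a file during a test needs only the standard library; a minimal sketch, independent of the patch's actual implementation:

{code:java}
import java.io.FileNotFoundException;
import java.io.PrintStream;

public class CaptureStdoutSketch {
    public static void main(String[] args) throws FileNotFoundException {
        PrintStream original = System.out;
        try (PrintStream capture = new PrintStream("stdout.txt")) {
            // Everything printed inside this block lands in stdout.txt.
            System.setOut(capture);
            System.out.println("this line goes to stdout.txt");
        } finally {
            System.setOut(original); // restore the real stdout
        }
    }
}
{code}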

 
{quote}Additions to the _public_ API need prior discussion (to avoid code 
bloat), e.g.
 * {{DebugMode}}
 * {{DoubleData}}
 * {{DoubleStrictlyPositiveData}}
 * {{ObjectiveFunctionDimension}}
 * {{OptimizationStatus}}
 * {{SearchRange}}
 * {{MinDirectionNorm}}
 * {{LinearUtil}}{quote}
* *DebugMode:* boolean OptimizationData. When true, intermediate iteration results (the trajectory) are stored so they can be inspected.
 * *DoubleData, DoubleStrictlyPositiveData:* abstract OptimizationData factorizing code for unconstrained and constrained double OptimizationData.
 * *SearchRange:* double OptimizationData used for the line search performed at each iteration of a gradient-like optimizer. For the moment the Brent optimizer is hardcoded, but in the (relatively near) future this may be made more flexible to allow other algorithms. That should be discussed elsewhere, though.
 * *MinDirectionNorm:* double OptimizationData used to set a lower bound on the gradient norm, below which the optimizer stops without performing a line search. This is a very fine tuning that defaults to 10⁻¹⁵. Stopping there is useful in some cases to avoid divisions by zero or near zero, or other problems such as computations involving singular or almost singular matrices.
 * *ObjectiveFunctionDimension:* integer OptimizationData giving the number of parameters of the MultivariateFunction objective function. This is necessary because the MultivariateFunction interface does not provide (and perhaps should not provide) any way to query the number of parameters of the function.
 * *ObjectiveFunctionHessian:* MultivariateMatrixFunction OptimizationData required by the Newton-Raphson optimizer (a usage sketch of these options follows this list).
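
For illustration, here is a minimal sketch of how these options might be passed to one of the new optimizers, on the Rosenbrock function. The optimizer class name and the constructors of the patch classes ({{DebugMode}}, {{SearchRange}}, {{MinDirectionNorm}}, {{ObjectiveFunctionDimension}}) are assumptions made for readability, not verbatim from the patch:

{code:java}
// Usage sketch only: GradientDescentOptimizer and the constructors of the
// patch classes are assumed for illustration, not taken verbatim from the patch.
import org.apache.commons.math4.legacy.analysis.MultivariateFunction;
import org.apache.commons.math4.legacy.optim.InitialGuess;
import org.apache.commons.math4.legacy.optim.MaxEval;
import org.apache.commons.math4.legacy.optim.PointValuePair;
import org.apache.commons.math4.legacy.optim.nonlinear.scalar.GoalType;
import org.apache.commons.math4.legacy.optim.nonlinear.scalar.ObjectiveFunction;

public class GradientLikeUsageSketch {
    public static void main(String[] args) {
        // Rosenbrock function, with its minimum at (1, 1).
        final MultivariateFunction rosenbrock = p -> {
            final double x = p[0];
            final double y = p[1];
            return (1 - x) * (1 - x) + 100 * (y - x * x) * (y - x * x);
        };

        final GradientDescentOptimizer optimizer = new GradientDescentOptimizer();
        final PointValuePair result = optimizer.optimize(
            new MaxEval(10000),
            new ObjectiveFunction(rosenbrock),
            GoalType.MINIMIZE,
            new InitialGuess(new double[] { -1, 1 }),
            new ObjectiveFunctionDimension(2), // number of parameters of the function
            new SearchRange(0, 1),             // interval for the Brent line search
            new MinDirectionNorm(1e-15),       // stop when the gradient norm is below this
            new DebugMode(true));              // record the trajectory
        System.out.println(java.util.Arrays.toString(result.getPoint()));
    }
}
{code}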

{quote} * {{LinearUtil}} defines many utilities already present in (or that 
should belong to) class 
[{{MathArrays}}|https://github.com/apache/commons-math/blob/master/commons-math-legacy-core/src/main/java/org/apache/commons/math4/legacy/core/MathArrays.java].{quote}
Sorry, I missed that. {{LinearUtil}} has been removed.
{quote}The test infrastructure may be a neat idea (I didn't look at the 
details) but it completely departs from the existing approach (Junit 4) and the 
expected upgrade path using Junit 5 features (see e.g. in 
["SimplexOptimizerTest.java"|https://github.com/apache/commons-math/blob/master/commons-math-legacy/src/test/java/org/apache/commons/math4/legacy/optim/nonlinear/scalar/noderiv/SimplexOptimizerTest.java])
{quote}
*Tests*

The tests are pure JUnit tests. They do not use any exotic JUnit feature, only the @Test annotation and the most basic JUnit assertions; thus they run on JUnit 4 and shall upgrade gracefully to JUnit 5. They are based on DiffTest.java, a very lightweight JUnit-based framework that checks test conformity by comparing output files, line by line, with test references. This allows a lot of (mostly numerical) data to be compared in a single blow. It is very practical for checking the conformity of gradient-like optimization, because I can check the overall test results in stdout.txt (for nominal tests) or in stderr.txt (for error tests like GradientLikeOptimizerTest.testBadOptimizationData()). Furthermore, intermediate data (the optimization trajectory) are stored in gnuplot-compatible files that can be visualized (if gnuplot is installed):
 * {{gnuplot --persist 
src/test/resources/org/apache/commons/math4/legacy/optim/nonlinear/scalar/gradient/NewtonRaphsonOptimizerTest/testRosenbrock/res/gnuplot.plt}}

!Screenshot from 2023-07-10 12-13-38.png!

I admit that the test framework documentation leaves room for improvement; the sketch below shows the core comparison idea.
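
A minimal sketch of the line-by-line comparison at the heart of DiffTest, in plain JUnit 4 (the actual DiffTest.java in the patch is more elaborate; class names and file paths here are illustrative only):

{code:java}
// Sketch of the DiffTest idea: compare a test's output file, line by line,
// with a checked-in reference file. Names and paths are illustrative.
import static org.junit.Assert.assertEquals;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import org.junit.Test;

public class DiffTestSketch {
    /** Fails on the first differing line, then on any line-count mismatch. */
    private static void assertMatchesReference(Path reference, Path actual)
            throws IOException {
        final List<String> expected = Files.readAllLines(reference);
        final List<String> got = Files.readAllLines(actual);
        final int common = Math.min(expected.size(), got.size());
        for (int i = 0; i < common; i++) {
            assertEquals("line " + (i + 1) + " of " + actual,
                         expected.get(i), got.get(i));
        }
        assertEquals("line count of " + actual, expected.size(), got.size());
    }

    @Test
    public void testStdoutMatchesReference() throws IOException {
        // The reference stdout.txt was captured from System.out on a known-good run.
        assertMatchesReference(
            Paths.get("src/test/resources/some/test/dir/stdout.txt"),
            Paths.get("target/test-output/some/test/dir/stdout.txt"));
    }
}
{code}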
{quote}Other, comparatively smaller, issues:
 * We don't use {{@author}} tags.
 * All fields and methods (including _private_ ones) must be documented.
 * Abbreviated names should be avoided ({{initOptimization}} for example should be {{initializeOptimization}}, or perhaps just {{initialize}}) unless they have become "well-known" (like "BFGS" maybe).
 * The "default constructor" should not be explicitly declared.{quote}

 

All of those have been fixed.
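
For the record, a tiny sketch of the conventions applied after the fix (class and member names here are illustrative, not from the patch):

{code:java}
/**
 * Sketch of the applied conventions: no {@code @author} tag, every field and
 * method documented (including private ones), no abbreviated names, and no
 * explicitly declared default constructor.
 */
public class ConventionsSketch {
    /** Lower bound on the gradient norm; documented although private. */
    private double minimumDirectionNorm = 1e-15;

    /** Full name rather than the abbreviated {@code initOptimization}. */
    private void initializeOptimization() {
        // Reset to the documented default before a new run.
        minimumDirectionNorm = 1e-15;
    }
}
{code}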

> Classical multivariate optimizers (gradient descent, Raphson-Newton, BFGS) 
> are missing
> --------------------------------------------------------------------------------------
>
>                 Key: MATH-1656
>                 URL: https://issues.apache.org/jira/browse/MATH-1656
>             Project: Commons Math
>          Issue Type: Wish
>          Components: legacy
>    Affects Versions: 4.0-beta1
>            Reporter: François Laferrière
>            Priority: Major
>              Labels: features
>         Attachments: MATH-1658-GradientDescent-Newton-BFGS.patch, Screenshot 
> from 2023-07-10 12-13-38.png, Slim Screenshot from 2023-02-07 11-42-40.tiff
>
>
> Some classical multivariate optimizers, such as
>  * gradient descent,
>  * Raphson-Newton,
>  * BFGS
> are missing.


