I have a working prototype for real univariate functions that uses
finite differences. The code is small and simple, but quite messy;
I'll clean it up soon. It's trivial to extend this to univariate
vectorial and matrix functions. I also think it's worth adding
Savitzky-Golay smoothing filters for univariate functions in the
first pass. Computing SG coefficients is really easy:
http://en.wikipedia.org/wiki/Savitzky%E2%80%93Golay_smoothing_filter
Or see Numerical Recipes. (The implementation in NR is crazy, but
obviously correct. It could be written with CM objects in about 10
simple, clear lines; NR's version uses about 50 lines of relative madness.)
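
To give a rough idea of what those ~10 lines might look like, here is a
sketch of the coefficient computation (this assumes the
org.apache.commons.math.linear classes such as Array2DRowRealMatrix and
QRDecompositionImpl; the method name sgCoefficients is just for
illustration):

import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.QRDecompositionImpl;
import org.apache.commons.math.linear.RealMatrix;

/** Smoothing weights for the window [-m, m] and a fitting polynomial of degree d. */
public static double[] sgCoefficients(int m, int d) {
    // Design matrix of the local polynomial fit: a[i][j] = i^j for i = -m..m.
    RealMatrix a = new Array2DRowRealMatrix(2 * m + 1, d + 1);
    for (int i = -m; i <= m; i++) {
        for (int j = 0; j <= d; j++) {
            a.setEntry(i + m, j, Math.pow(i, j));
        }
    }
    // Least-squares pseudo-inverse (A^T A)^-1 A^T; its first row holds the
    // smoothing convolution weights.
    RealMatrix pseudoInverse = new QRDecompositionImpl(a.transpose().multiply(a))
            .getSolver().getInverse().multiply(a.transpose());
    return pseudoInverse.getRow(0);
}

Row k of the same pseudo-inverse, scaled by k!, gives the weights for the
k-th smoothed derivative at the center of the window.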

Adding SG filters in the first pass is, I think, a reasonably good
idea, since both SG and FD are basically convolution-type filters, so
we could develop some shared infrastructure for them. With it, we could
also support other algorithms, such as the low-noise filters from
Holoborodko mentioned in a previous email, without much extra work.
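
Just to give a flavor of the shared infrastructure I have in mind (the
names below are purely hypothetical, not a proposed API), everything in
this family reduces to applying fixed weights to equally spaced samples
around the evaluation point:

/** A filter defined by convolution weights over an equally spaced stencil. */
public interface ConvolutionKernel {

    /** Weights applied to the samples f(x - m*h), ..., f(x + m*h). */
    double[] getWeights();

    /** Power of the step size h the weighted sum must be divided by
     *  (0 for smoothing, 1 for a first derivative, and so on). */
    int getNormalizationOrder();
}

FD stencils, SG windows, and Holoborodko's low-noise differentiators
would all just supply their own weights.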

I think the multivariate world should come later. It can be built atop
the univariate infrastructure for the most part, so
all-things-univariate should get nailed down before tackling the
multivariate side.

Shall I create a JIRA ticket and submit a patch for this?

Cheers,
Fran.



On Fri, Aug 12, 2011 at 7:42 AM, Patrick Meyer <meyer...@gmail.com> wrote:
> Thanks for the information, Luc. I didn't know those existed. I'm happy
> to keep the discussion at the implementation level.
>
> On 8/12/2011 6:23 AM, Luc Maisonobe wrote:
>>
>> On 12/08/2011 00:30, Patrick Meyer wrote:
>>>
>>> I like the idea of adding this feature. What about an abstract class
>>> that implements DifferentiableMultivariateRealFunction and provides the
>>> method partialDerivative()? People could then override the
>>> partialDerivative method if they have an analytic derivative.
>>>
>>> Here's some code that I'm happy to contribute to Commons Math. It
>>> computes the derivative by the central difference method and the
>>> Hessian by finite differences. I can add this to JIRA when it's there.
>>
>> Hi Patrick,
>>
>> I think we need to discuss the API we want first, and then the
>> implementation. There are already other finite differences
>> implementations in some tests for Apache Commons Math, and a complete
>> package in Apache Commons Nabla. Your code adds a Hessian to this,
>> which is really a good thing.
>>
>> Thanks,
>> Luc
>>
>>>
>>> /**
>>>  * Numerically compute the gradient by the central difference method.
>>>  * Override this method when the analytic gradient is available.
>>>  *
>>>  * @param x point at which the gradient is evaluated
>>>  * @return the gradient at x
>>>  */
>>> public double[] derivativeAt(double[] x) {
>>>     int n = x.length;
>>>     double[] grd = new double[n];
>>>     double[] u = Arrays.copyOfRange(x, 0, x.length);
>>>
>>>     for (int i = 0; i < n; i++) {
>>>         // step size scaled to |x[i]|, from the SAS manual on the NLP procedure
>>>         double stepSize = Math.sqrt(EPSILON) * (Math.abs(x[i]) + 1.0);
>>>         u[i] = x[i] + stepSize;
>>>         double f1 = valueAt(u);
>>>         u[i] = x[i] - stepSize;
>>>         double f2 = valueAt(u);
>>>         grd[i] = (f1 - f2) / (2.0 * stepSize);
>>>         u[i] = x[i]; // restore before perturbing the next coordinate
>>>     }
>>>     return grd;
>>> }
>>>
>>> /**
>>>  * Numerically compute the Hessian by a forward difference of the gradient.
>>>  * Override this method when the analytic Hessian is available.
>>>  *
>>>  * @param x point at which the Hessian is evaluated
>>>  * @return the Hessian at x
>>>  */
>>> public double[][] hessianAt(double[] x) {
>>>     int n = x.length;
>>>     double[][] hessian = new double[n][n];
>>>     double[] gradientAtX = derivativeAt(x);
>>>     double[] u = Arrays.copyOfRange(x, 0, x.length); // avoid mutating x
>>>
>>>     for (int j = 0; j < n; j++) {
>>>         // step size scaled to |x[j]|, from the SAS manual on the NLP procedure
>>>         double stepSize = Math.sqrt(EPSILON) * (Math.abs(x[j]) + 1.0);
>>>         u[j] = x[j] + stepSize;
>>>         double[] gradientAtXpls = derivativeAt(u);
>>>         u[j] = x[j]; // restore before perturbing the next coordinate
>>>         for (int i = 0; i < n; i++) {
>>>             hessian[i][j] = (gradientAtXpls[i] - gradientAtX[i]) / stepSize;
>>>         }
>>>     }
>>>     return hessian;
>>> }
>>>
>>>
>>> On 8/11/2011 5:36 PM, Luc Maisonobe wrote:
>>>>
>>>> On 11/08/2011 23:27, Fran Lattanzio wrote:
>>>>>
>>>>> Hello,
>>>>
>>>> Hi Fran,
>>>>
>>>>>
>>>>> I have a proposal for a numerical derivatives framework for Commons
>>>>> Math. I'd like to add the ability to take any UnivariateRealFunction
>>>>> and produce another function that represents its derivative of an
>>>>> arbitrary order. Basically, I'm suggesting we add a factory-like
>>>>> interface that looks something like the following:
>>>>>
>>>>> public interface UnivariateNumericalDeriver {
>>>>>     public UnivariateRealFunction derive(UnivariateRealFunction f,
>>>>>                                          int derivOrder);
>>>>> }
>>>>
>>>> This sounds interesting. Did you have a look at the Commons Nabla
>>>> UnivariateDifferentiator interface and its implementations?
>>>>
>>>> Luc
>>>>
>>>>>
>>>>> For an initial implementation of this interface, I propose using
>>>>> finite differences - either central, forward, or backward. Computing
>>>>> the finite difference coefficients, for any derivative order and any
>>>>> error order, is a relatively trivial linear algebra problem. The user
>>>>> would simply choose an error order and difference type when setting up
>>>>> an FD univariate deriver - everything else would happen automagically.
>>>>> We can compute the FD coefficients when the user invokes the method in
>>>>> the interface above (this might be expensive), and determine an
>>>>> appropriate stencil width when they call value(double) on the function
>>>>> it returns - for example, if the user has asked for the nth
>>>>> derivative, we could simply use the nth root of the machine epsilon
>>>>> (the double ulp) for the stencil width. It would also be pretty easy
>>>>> to let the user control this (which might be desirable in some cases).
>>>>> Wikipedia has a decent article on FDs of all flavors:
>>>>> http://en.wikipedia.org/wiki/Finite_difference
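
As a concrete sketch of that "relatively trivial linear algebra problem"
(again assuming the org.apache.commons.math.linear classes; fdWeights is
just an illustrative name, not a proposed API): for stencil offsets s_j
expressed in units of the step h, matching the Taylor expansion of
f(x + s_j*h) requires sum_j w_j * s_j^k / k! to equal 1 when k is the
derivative order and 0 for every other k < p, so the weights solve a
small linear system:

import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.QRDecompositionImpl;
import org.apache.commons.math.linear.RealMatrix;

/** Weights w such that f^(n)(x) ~= (1 / h^n) * sum_j w[j] * f(x + offsets[j] * h). */
public static double[] fdWeights(int n, double[] offsets) {
    int p = offsets.length;  // stencil size; must be greater than n
    RealMatrix taylor = new Array2DRowRealMatrix(p, p);
    double factorial = 1.0;
    for (int k = 0; k < p; k++) {
        if (k > 0) {
            factorial *= k;
        }
        for (int j = 0; j < p; j++) {
            // coefficient of f^(k)(x) * h^k in the Taylor expansion of f(x + offsets[j] * h)
            taylor.setEntry(k, j, Math.pow(offsets[j], k) / factorial);
        }
    }
    double[] rhs = new double[p];
    rhs[n] = 1.0;  // keep the n-th derivative, cancel all the others
    return new QRDecompositionImpl(taylor).getSolver().solve(rhs);
}

For example, fdWeights(1, new double[] { -1, 0, 1 }) gives
{ -0.5, 0.0, 0.5 }, the usual second-order central difference; a wider
stencil raises the error order.
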
>>>>>
>>>>> There are, of course, many other univariate numerical derivative
>>>>> schemes - Fourier transforms, Barak's adaptive-degree polynomial
>>>>> method, etc. - that could be added later. We could also add the
>>>>> ability to numerically differentiate at a single point using an
>>>>> arbitrary or user-defined grid (rather than an automatically
>>>>> generated one, like above). Barak's method and Fornberg's finite
>>>>> difference coefficients could be used in this case:
>>>>> http://pubs.acs.org/doi/abs/10.1021/ac00113a006
>>>>>
>>>>> http://amath.colorado.edu/faculty/fornberg/Docs/MathComp_88_FD_formulas.pdf
>>>>>
>>>>>
>>>>> It would also make sense to add vectorial and matrix-flavored versions
>>>>> of the interface above. These interfaces would be slightly more complex,
>>>>> but nothing too crazy. Again, the initial implementation would be
>>>>> finite differences. This would also be really easy to implement, since
>>>>> multivariate FD coefficients are nothing more than an outer product of
>>>>> their univariate cousins. The Wikipedia article also has some good
>>>>> introductory material on multivariate FDs.
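
To make the outer-product remark concrete, here is a small illustration
(plain Java, not a proposed API): the 2-D stencil for the mixed partial
d2f/dxdy is just the outer product of the 1-D central-difference weights
for d/dx and d/dy.

/** 2-D stencil for d2f/dxdy as the outer product of two 1-D stencils. */
public static double[][] mixedPartialStencil() {
    double[] cx = { -0.5, 0.0, 0.5 };   // d/dx weights, in units of 1/h
    double[] cy = { -0.5, 0.0, 0.5 };   // d/dy weights, in units of 1/h
    double[][] stencil = new double[3][3];
    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 3; j++) {
            // weight of f(x + (i - 1) * h, y + (j - 1) * h); divide the sum by h^2
            stencil[i][j] = cx[i] * cy[j];
        }
    }
    return stencil;
}

which reproduces the familiar cross-derivative formula
(f(x+h, y+h) - f(x+h, y-h) - f(x-h, y+h) + f(x-h, y-h)) / (4*h*h).
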
>>>>>
>>>>> Cheers,
>>>>> Fran.
>>>>>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@commons.apache.org
For additional commands, e-mail: dev-h...@commons.apache.org
