Hi Edward.

I get:

Verify, without constraints, C code BFGS
Traceback (most recent call last):
  File "profiling_relax_fit.py", line 401, in <module>
    main()
  File "profiling_relax_fit.py", line 177, in main
    cProfile.run('verify(min_algor="BFGS", constraints=%s)'%(constraints), filename)
  File "/Applications/Canopy.app/appdata/canopy-1.4.0.1938.macosx-x86_64/Canopy.app/Contents/lib/python2.7/cProfile.py", line 29, in run
    prof = prof.run(statement)
  File "/Applications/Canopy.app/appdata/canopy-1.4.0.1938.macosx-x86_64/Canopy.app/Contents/lib/python2.7/cProfile.py", line 135, in run
    return self.runctx(cmd, dict, dict)
  File "/Applications/Canopy.app/appdata/canopy-1.4.0.1938.macosx-x86_64/Canopy.app/Contents/lib/python2.7/cProfile.py", line 140, in runctx
    exec cmd in globals, locals
  File "<string>", line 1, in <module>
  File "profiling_relax_fit.py", line 323, in verify
    results = generic_minimise(func=func, dfunc=dfunc, d2func=d2func, args=(), x0=x0, min_algor=E.min_algor, min_options=E.min_options, func_tol=E.func_tol, grad_tol=E.grad_tol, maxiter=E.max_iterations, A=E.A, b=E.b, full_output=True, print_flag=E.verbosity)
  File "/Users/tlinnet/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/minfx/generic.py", line 322, in generic_minimise
    results = bfgs(func=func, dfunc=dfunc, args=args, x0=x0, min_options=min_options, func_tol=func_tol, grad_tol=grad_tol, maxiter=maxiter, full_output=full_output, print_flag=print_flag, print_prefix=print_prefix)
  File "/Users/tlinnet/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/minfx/bfgs.py", line 49, in bfgs
    results = min.minimise()
  File "/Users/tlinnet/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/minfx/base_classes.py", line 299, in minimise
    self.update()
  File "/Users/tlinnet/Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/minfx/bfgs.py", line 151, in update_bfgs
    yk = self.dfk_new - self.dfk
TypeError: unsupported operand type(s) for -: 'list' and 'list'
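For the record, the failure itself is trivial to reproduce: plain Python lists do not support element-wise subtraction, so I guess the gradient coming back from the C dfunc() needs to be a numpy array before minfx's BFGS update sees it. A minimal sketch of the problem and a possible workaround (my guess, not the actual relax code):

```python
import numpy as np

# Reproduce the failure: plain Python lists cannot be subtracted.
dfk = [1.0, 2.0]
dfk_new = [0.5, 1.5]
try:
    yk = dfk_new - dfk
except TypeError as e:
    print(e)  # unsupported operand type(s) for -: 'list' and 'list'

# Possible workaround: convert the gradient to a numpy array before
# handing it to minfx, e.g. in the dfunc() wrapper.
yk = np.asarray(dfk_new) - np.asarray(dfk)
print(yk)  # [-0.5 -0.5]
```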

Best
Troels

2014-08-26 19:52 GMT+02:00 Edward d'Auvergne <[email protected]>:
> The exponential curve-fitting Hessian from the
> target_functions.relax_fit.d2func() function is bug free and fully
> functional.
>
> Regards,
>
> Edward
>
>
> On 26 August 2014 19:24, Edward d'Auvergne <[email protected]> wrote:
>> Oh well, I had some time between frame order calculations and decided
>> to practice my C coding a little.  The full exponential curve Hessian
>> is now implemented.  It may still need debugging.  But you can now use
>> the full Newton algorithm in your timing tests and comparisons to
>> Scipy.
>>
>> Regards,
>>
>> Edward
>>
>> On 26 August 2014 13:32, Edward d'Auvergne <[email protected]> wrote:
>>> Test it and see ;)  And if you are interested in the speed up in this
>>> part of the analysis, you can take it one more step and implement the
>>> Hessian in the C modules and then time the Newton algorithm!
>>>
>>> Regards,
>>>
>>> Edward
>>>
>>> On 26 August 2014 13:26, Troels Emtekær Linnet <[email protected]> 
>>> wrote:
>>>> Should this be used instead then in R2eff minimisation?
>>>>
>>>> Is it faster than simplex?
>>>>
>>>> Best
>>>> Troels
>>>>
>>>> 2014-08-26 13:04 GMT+02:00 Edward d'Auvergne <[email protected]>:
>>>>> Oh, I forgot to mention, but I also converted the
>>>>> Relax_fit.test_curve_fitting_height and
>>>>> Relax_fit.test_curve_fitting_volume system tests to use BFGS
>>>>> optimisation.  This is one of the best optimisation techniques when
>>>>> only the function and gradient are present, as it tries to numerically
>>>>> approximate the Hessian matrix, updating it as the algorithm moves
>>>>> along.  It is fast and performs incredibly well, so it is a widely
>>>>> used algorithm.  The system tests using BFGS demonstrate that the
>>>>> gradient works very well for optimisation.  It isn't as fast as Newton
>>>>> optimisation however.
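For reference, the Hessian approximation described above is, as far as I understand it, the standard BFGS inverse-Hessian update; a small numpy sketch of just that update (my notation, not minfx's actual code):

```python
import numpy as np

def bfgs_update(Hk, sk, yk):
    # One BFGS update of the inverse-Hessian approximation Hk, where
    # sk is the step taken and yk the change in the gradient.
    rho = 1.0 / np.dot(yk, sk)
    I = np.eye(len(sk))
    A = I - rho * np.outer(sk, yk)
    return A @ Hk @ A.T + rho * np.outer(sk, sk)

# The update enforces the secant condition H_new @ yk == sk, which is
# how curvature information accumulates as the algorithm moves along,
# without ever computing the true Hessian.
H = np.eye(2)
sk = np.array([0.1, -0.2])
yk = np.array([0.3, 0.1])
H_new = bfgs_update(H, sk, yk)
print(np.allclose(H_new @ yk, sk))  # True
```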
>>>>>
>>>>> Regards,
>>>>>
>>>>> Edward
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On 26 August 2014 13:00, Edward d'Auvergne <[email protected]> wrote:
>>>>>> Hi Troels,
>>>>>>
>>>>>> I've now implemented the exponential curve-fitting dfunc() function
>>>>>> for calculating the gradient.  This includes:
>>>>>>
>>>>>> - The Python wrapper function
>>>>>> specific_analyses.relax_fit.optimisation.dfunc_wrapper(),
>>>>>> - The target_functions/c_chi2.c function dchi2(),
>>>>>> - The target_functions/exponential.c functions exponential_dI0() and
>>>>>> exponential_dR(),
>>>>>> - The target_functions.relax_fit C module dfunc() Python function.
>>>>>>
>>>>>> I have tested the gradient using the numerical integration in the
>>>>>> test_suite/shared_data/curve_fitting/numeric_gradient/integrate.py
>>>>>> file to determine what the chi-squared gradient should be at different
>>>>>> parameter combinations.  And this has been converted into a few unit
>>>>>> tests.  As this works, that means that the jacobian() function of the
>>>>>> C module should also be correct and bug-free, hence you should be able
>>>>>> to use it to obtain the covariance matrix.
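To convince myself, I sketched the chi-squared gradient for the two-parameter exponential I(t) = I0*exp(-R*t) in numpy and checked it against a central finite difference, in the same spirit as the integrate.py numeric gradient (my own notation and made-up data, not the C module's code):

```python
import numpy as np

def chi2(params, times, I_meas, errors):
    # Chi-squared for the exponential I(t) = I0 * exp(-R*t).
    R, I0 = params
    return np.sum(((I_meas - I0 * np.exp(-R * times)) / errors) ** 2)

def dchi2(params, times, I_meas, errors):
    # Analytic chi-squared gradient: [d(chi2)/dR, d(chi2)/dI0].
    R, I0 = params
    e = np.exp(-R * times)
    resid = (I_meas - I0 * e) / errors ** 2
    dR = 2.0 * np.sum(resid * I0 * times * e)  # since dI/dR = -I0*t*exp(-R*t)
    dI0 = -2.0 * np.sum(resid * e)             # since dI/dI0 = exp(-R*t)
    return np.array([dR, dI0])

# Check against a central finite difference at an arbitrary point.
times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
I_meas = 1000.0 * np.exp(-1.5 * times) + np.array([3.0, -2.0, 1.0, -1.0, 2.0])
errors = np.full(5, 10.0)
p = np.array([1.2, 900.0])
num = np.empty(2)
h = 1e-6
for i in range(2):
    dp = np.zeros(2)
    dp[i] = h
    num[i] = (chi2(p + dp, times, I_meas, errors)
              - chi2(p - dp, times, I_meas, errors)) / (2 * h)
print(np.allclose(dchi2(p, times, I_meas, errors), num, rtol=1e-4))  # True
```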
>>>>>>
>>>>>> This is all I will do for now.  All that is left to do for the
>>>>>> target_functions.relax_fit C module is simply the same thing, but for
>>>>>> the Hessian.  Feel free to give this a go if you are interested.  If I
>>>>>> have time in the future, I might add this too.
>>>>>>
>>>>>> Regards,
>>>>>>
>>>>>> Edward
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On 24 August 2014 17:56, Troels E. Linnet
>>>>>> <[email protected]> wrote:
>>>>>>> URL:
>>>>>>>   <http://gna.org/task/?7822>
>>>>>>>
>>>>>>>                  Summary: Implement user function to estimate R2eff and
>>>>>>> associated errors for exponential curve fitting.
>>>>>>>                  Project: relax
>>>>>>>             Submitted by: tlinnet
>>>>>>>             Submitted on: Sun 24 Aug 2014 03:56:36 PM UTC
>>>>>>>          Should Start On: Sun 24 Aug 2014 12:00:00 AM UTC
>>>>>>>    Should be Finished on: Sun 24 Aug 2014 12:00:00 AM UTC
>>>>>>>                 Category: relax's source code
>>>>>>>                 Priority: 5 - Normal
>>>>>>>                   Status: In Progress
>>>>>>>         Percent Complete: 0%
>>>>>>>              Assigned to: tlinnet
>>>>>>>              Open/Closed: Open
>>>>>>>          Discussion Lock: Any
>>>>>>>                   Effort: 0.00
>>>>>>>
>>>>>>>     _______________________________________________________
>>>>>>>
>>>>>>> Details:
>>>>>>>
>>>>>>> A verification script showed that using scipy.optimize.leastsq reaches
>>>>>>> the exact same parameters as minfx for exponential curve fitting.
>>>>>>>
>>>>>>> The verification script is in:
>>>>>>> test_suite/shared_data/curve_fitting/profiling/profiling_relax_fit.py
>>>>>>> test_suite/shared_data/curve_fitting/profiling/verify_error.py
>>>>>>>
>>>>>>> The profiling script shows that a 10X increase in speed can be reached
>>>>>>> by removing the linear constraints when using minfx.
>>>>>>>
>>>>>>> The profiling also shows that scipy.optimize.leastsq is 10X as fast as
>>>>>>> minfx, even without linear constraints.
>>>>>>>
>>>>>>> scipy.optimize.leastsq is a wrapper around MINPACK's lmdif and
>>>>>>> lmder algorithms.
>>>>>>>
>>>>>>> MINPACK is a FORTRAN90 library which solves systems of nonlinear
>>>>>>> equations, or carries out the least squares minimization of the
>>>>>>> residual of a set of linear or nonlinear equations.
>>>>>>>
>>>>>>> The verification script also shows that a very heavy and time-consuming
>>>>>>> Monte Carlo simulation of 2000 steps reaches the same errors as those
>>>>>>> reported by scipy.optimize.leastsq.
>>>>>>>
>>>>>>> The return from scipy.optimize.leastsq gives the estimated covariance
>>>>>>> matrix.  Taking the square root of the covariance corresponds to the 2X
>>>>>>> error reported by minfx after 2000 Monte Carlo simulations.
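To spell out the idea as I understand it: for weighted least squares the covariance of the fitted parameters can be estimated as (J^T W J)^-1, where J is the model Jacobian at the solution and W = diag(1/sigma^2), and the parameter errors are the square roots of its diagonal. A numpy-only sketch with made-up numbers for the exponential model (not the relax or scipy code):

```python
import numpy as np

# Assumed fitted parameters and experimental setup (made-up numbers).
times = np.array([0.0, 0.1, 0.2, 0.4, 0.8])
sigma = np.full(5, 10.0)
R, I0 = 1.5, 1000.0

# Jacobian of I(t) = I0*exp(-R*t): columns dI/dR and dI/dI0.
e = np.exp(-R * times)
J = np.column_stack([-I0 * times * e, e])

# Covariance estimate (J^T W J)^-1 with W = diag(1/sigma^2).
W = np.diag(1.0 / sigma ** 2)
cov = np.linalg.inv(J.T @ W @ J)

# Parameter errors: square roots of the diagonal of the covariance.
param_errors = np.sqrt(np.diag(cov))
print(param_errors)
```

If the sigma values are absolute, this should be the same quantity that scipy.optimize.leastsq estimates from its internal Jacobian, up to its residual-variance scaling.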
>>>>>>>
>>>>>>> This could be an extremely time-saving step when performing model
>>>>>>> fitting in R1rho, where the errors of the R2eff values are estimated
>>>>>>> by Monte Carlo simulations.
>>>>>>>
>>>>>>> The following setup illustrates the problem.
>>>>>>> This was analysed on a: MacBook Pro, 13-inch, Late 2011.
>>>>>>> With no multi-core setup.
>>>>>>>
>>>>>>> Script running is:
>>>>>>> test_suite/shared_data/dispersion/Kjaergaard_et_al_2013/2_pre_run_r2eff.py
>>>>>>>
>>>>>>> This script analyses just the R2eff values for 15 residues.
>>>>>>> It estimates the errors of R2eff based on 2000 Monte Carlo simulations.
>>>>>>> For each residue, there are 14 exponential graphs.
>>>>>>>
>>>>>>> The script was stopped after 35 simulations, which took 20 minutes.
>>>>>>> So 500 simulations would take about 4.8 hours.
>>>>>>>
>>>>>>> With scipy.optimize.leastsq, the R2eff values and errors can instead be
>>>>>>> calculated in: 15 residues * 0.02 seconds = 0.3 seconds.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>     _______________________________________________________
>>>>>>>
>>>>>>> Reply to this item at:
>>>>>>>
>>>>>>>   <http://gna.org/task/?7822>
>>>>>>>
>>>>>>> _______________________________________________
>>>>>>>   Message sent via/by Gna!
>>>>>>>   http://gna.org/
>>>>>>>

_______________________________________________
relax (http://www.nmr-relax.com)

This is the relax-devel mailing list
[email protected]

To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-devel
