Hi Edward.

If, in the system test test_estimate_r2eff_err, I change:

self.interpreter.minimise.execute(min_algor='Newton', constraints=False, verbosity=1)

to:

self.interpreter.minimise.execute(min_algor='Newton', constraints=True, verbosity=1)

then I get:

##############################################

relax> minimise.grid_search(lower=None, upper=None, inc=11, verbosity=1, constraints=True, skip_preset=True)


Grid search setup:  the spin block [':52@N']
--------------------------------------------

......

relax> minimise.execute(min_algor='Newton', line_search=None, hessian_mod=None, hessian_type=None, func_tol=1e-25, grad_tol=None, max_iter=10000000, constraints=True, scaling=True, verbosity=1)
Resetting the minimisation statistics.


Fitting to spin :52@N, frequency 799777399.1 and dispersion point 431.0
-----------------------------------------------------------------------



Logarithmic barrier function
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
k: 0        xk: [            8.8,     200000.0001]    fk: 37.0428718161
Entering sub-algorithm.

    Newton minimisation
    ~~~~~~~~~~~~~~~~~~~
    Line search:  Backtracking line search.
    Hessian modification:  The Gill, Murray, and Wright modified Cholesky algorithm.
E
======================================================================
ERROR: test_estimate_r2eff_err (test_suite.system_tests.relax_disp.Relax_disp)
Test the user function for estimating R2eff errors from exponential curve fitting.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/sbinlab2/tlinnet/software/NMR-relax/relax_trunk/test_suite/system_tests/relax_disp.py", line 2990, in test_estimate_r2eff_err
    self.interpreter.minimise.execute(min_algor='Newton', constraints=True, verbosity=1)
  File "/sbinlab2/tlinnet/software/NMR-relax/relax_trunk/prompt/uf_objects.py", line 223, in __call__
    self._backend(*new_args, **uf_kargs)
  File "/sbinlab2/tlinnet/software/NMR-relax/relax_trunk/pipe_control/minimise.py", line 527, in minimise
    api.minimise(min_algor=min_algor, min_options=min_options, func_tol=func_tol, grad_tol=grad_tol, max_iterations=max_iter, constraints=constraints, scaling_matrix=scaling_matrix, verbosity=verbosity)
  File "/sbinlab2/tlinnet/software/NMR-relax/relax_trunk/specific_analyses/relax_disp/api.py", line 668, in minimise
    minimise_r2eff(spins=spins, spin_ids=spin_ids, min_algor=min_algor, min_options=min_options, func_tol=func_tol, grad_tol=grad_tol, max_iterations=max_iterations, constraints=constraints, scaling_matrix=scaling_matrix[model_index], verbosity=verbosity, sim_index=sim_index, lower=lower_i, upper=upper_i, inc=inc_i)
  File "/sbinlab2/tlinnet/software/NMR-relax/relax_trunk/specific_analyses/relax_disp/optimisation.py", line 424, in minimise_r2eff
    results = generic_minimise(func=func, dfunc=dfunc, d2func=d2func, args=(), x0=param_vector, min_algor=min_algor, min_options=min_options, func_tol=func_tol, grad_tol=grad_tol, maxiter=max_iterations, A=A, b=b, full_output=True, print_flag=verbosity)
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/generic.py", line 399, in generic_minimise
    results = log_barrier_function(func=func, dfunc=dfunc, d2func=d2func, args=args, x0=x0, min_options=min_options, A=A, b=b, func_tol=func_tol, grad_tol=grad_tol, maxiter=maxiter, full_output=full_output, print_flag=print_flag)
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/log_barrier_function.py", line 96, in log_barrier_function
    results = min.minimise()
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/log_barrier_function.py", line 264, in minimise
    results = self.generic_minimise(func=self.func_log, dfunc=self.func_dlog, d2func=self.func_d2log, args=self.args, x0=self.xk, min_algor=self.min_algor, min_options=self.min_options, func_tol=self.func_tol, grad_tol=self.grad_tol, maxiter=maxiter, full_output=1, print_flag=self.print_flag, print_prefix="\t")
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/generic.py", line 326, in generic_minimise
    results = newton(func=func, dfunc=dfunc, d2func=d2func, args=args, x0=x0, min_options=min_options, func_tol=func_tol, grad_tol=grad_tol, maxiter=maxiter, full_output=full_output, print_flag=print_flag, print_prefix=print_prefix)
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/newton.py", line 47, in newton
    min = Newton(func, dfunc, d2func, args, x0, min_options, func_tol, grad_tol, maxiter, a0, mu, eta, mach_acc, full_output, print_flag, print_prefix)
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/newton.py", line 156, in __init__
    self.setup_newton()
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/newton.py", line 211, in setup_newton
    self.dfk, self.g_count = self.dfunc(*(self.xk,)+self.args), self.g_count + 1
  File "/sbinlab2/software/python-enthought-dis/canopy-1.4.0-full-rh5-64/Canopy_64bit/User/lib/python2.7/site-packages/minfx/log_barrier_function.py", line 211, in func_dlog
    raise NameError("The logarithmic barrier gradient is not implemented yet.")
NameError: The logarithmic barrier gradient is not implemented yet.

----------------------------------------------------------------------
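
By the way, the gradient that minfx is missing here has a standard closed form. For inequality constraints A.x >= b, the logarithmic barrier adds -mu * sum_i log(A_i.x - b_i) to the target function, so the gradient of the penalised function is df(x) - mu * sum_i A_i / (A_i.x - b_i). Here is a minimal NumPy sketch of that textbook form (my own function names and signatures, not minfx's internals):

import numpy as np

def barrier_func(func, x, A, b, mu):
    """Barrier-penalised target function: f(x) - mu * sum(log(A.x - b))."""
    s = np.dot(A, x) - b                     # constraint slacks, one per row of A
    return func(x) - mu * np.sum(np.log(s))

def barrier_grad(dfunc, x, A, b, mu):
    """Its analytic gradient: df(x) - mu * sum_i A_i / (A_i.x - b_i)."""
    s = np.dot(A, x) - b
    return dfunc(x) - mu * np.sum(A / s[:, None], axis=0)

Each outer iteration then just shrinks mu towards zero, which is why the sub-algorithm (Newton above) needs a gradient of the barrier term and not only of f(x).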



2014-09-01 12:42 GMT+02:00 Edward d'Auvergne <[email protected]>:
> On 1 September 2014 12:34, Troels Emtekær Linnet <[email protected]> 
> wrote:
>> Anyway, before minfx can handle constraints in, for example, BFGS,
>> this is just a waste of time.
>
> Minfx can do this :)  The log-barrier constraint algorithm works with
> all optimisation techniques in minfx, well, apart from the grid search
> (https://en.wikipedia.org/wiki/Barrier_function#Logarithmic_barrier_function).
> And if gradients are supplied, the more powerful
> Methods-of-Multipliers algorithm can also be used in combination with
> all optimisation techniques
> (https://en.wikipedia.org/wiki/Augmented_Lagrangian_method).
>
>
>> I think there will be a 10 x speed up, just for the Jacobian.
>
> For the analytic models, you could have a 10x speed up if symbolic
> gradients and Hessians are implemented.  I'm guessing that's what you
> mean.
>
>
>> And when you have the Jacobian, estimating the errors is trivial:
>>
>> std(q) = sqrt( (dq/dx * std(x))^2 + (dq/dz * std(z))^2 )
>
> :S  I'm not sure about this estimate.  It looks rather too linear.  I
> wish errors would be so simple.
>
>
>> where q is the function, and x and z are R1 and R1rho_prime.
>>
>> So, until then, implementing the Jacobian is only useful for testing
>> the error estimation against Monte-Carlo simulations.
>
> If you do add the equations, the lib.dispersion.dpl94 module would be
> the natural place to put them.  And the interface as dfunc_DPL94(),
> d2func_DPL94(), and jacobian_DPL94().
>
> Regards,
>
> Edward
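
P.S.  To make the std(q) formula quoted above concrete, here is a minimal sketch of that first-order error propagation through the Jacobian (hypothetical names; as Edward notes, it is a linear approximation, and it also assumes independent parameter errors):

import numpy as np

def propagate_errors(J, param_sd):
    """First-order error propagation:
    std(q_j) = sqrt( sum_i (dq_j/dp_i * std(p_i))^2 )."""
    J = np.asarray(J)            # Jacobian, shape (n_points, n_params)
    sd = np.asarray(param_sd)    # parameter standard deviations, shape (n_params,)
    return np.sqrt(np.sum((J * sd)**2, axis=1))

For the DPL94 case above, J would have one column per parameter (e.g. R1 and R1rho_prime) and one row per dispersion point, which is exactly what a jacobian_DPL94() function would return.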
