Hmmm, then why are the errors close to double?  This is very strange!

Regards,

Edward

On 28 August 2014 13:58, Troels Emtekær Linnet <[email protected]> wrote:
> Hi Edward.
>
> I see that I did the documentation wrong.
>
> The implementation is exactly as you describe.
>
> Best Troels
>
> On 28 Aug 2014 13:46, "Edward d'Auvergne" <[email protected]> wrote:
>>
>> Hi,
>>
>> What happens if you construct the covariance matrix as:
>>
>>     covar = (J^T.X.J)^-1,
>>
>> where X is a square matrix of dimension n, the number of data points
>> (here time points).  All elements of X are zero except for the
>> diagonal, which is equal to the intensity errors squared.  I.e. this
>> is the peak intensity covariance matrix, where the covariances are
>> zero, as the peak intensity errors are not dependent on each other,
>> and the diagonal is simply the peak errors squared, i.e. the
>> variances.  This is
>> from
>> https://en.wikipedia.org/wiki/Propagation_of_uncertainty#Non-linear_combinations
>> :
>>
>>     cov(f) = J.cov(x).J^T.
>>
>> The J matrices might be transposed in Wikipedia's definition compared
>> to ours.  Maybe this would solve the problem of the errors being too
>> high?
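
A minimal NumPy sketch of this construction (the function name and arguments
are illustrative only, not part of relax; J is assumed to have one row per
time point and one column per parameter, and errors to hold the peak
intensity errors):

    from numpy import asarray, diag, dot
    from numpy.linalg import inv

    def covar_from_jacobian(J, errors):
        # Peak intensity covariance matrix X: zero off-diagonal covariances,
        # squared peak intensity errors (the variances) on the diagonal.
        J = asarray(J)
        X = diag(asarray(errors)**2)

        # The construction proposed above: covar = (J^T.X.J)^-1.
        return inv(dot(dot(J.T, X), J))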
>>
>> Regards,
>>
>> Edward
>>
>>
>>
>> On 27 August 2014 11:41, Troels Emtekær Linnet <[email protected]>
>> wrote:
>> > Hi Edward.
>> >
>> > There is no scaling involved in the calculation.
>> >
>> > What I meant by scaling is a correction to the co-variance matrix,
>> > related to the errors or similar.
>> >
>> > But I haven't figured it out yet.
>> >
>> >
>> > Best
>> > Troels
>> >
>> > 2014-08-27 11:33 GMT+02:00 Edward d'Auvergne <[email protected]>:
>> >> Does it give the same small result?  Try turning off parameter scaling.
>> >>
>> >> Regards,
>> >>
>> >> Edward
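
As background to the parameter scaling suggestion above, a small sketch of
how a diagonal scaling matrix propagates into the covariance: if the
optimisation runs on scaled parameters q with p = S.q, then cov(p) =
S.cov(q).S, so any mismatch in this back-conversion rescales the reported
errors.  The names below are illustrative, not the relax or minfx API:

    from numpy import diag, dot

    def unscale_covariance(pcov_scaled, scaling_factors):
        # Diagonal scaling matrix S holding the per-parameter scaling factors.
        S = diag(scaling_factors)

        # For the linear rescaling p = S.q, cov(p) = S.cov(q).S.
        return dot(dot(S, pcov_scaled), S)

    # One standard deviation errors in unscaled units would then be
    # sqrt(diag(unscale_covariance(pcov_scaled, scaling_factors))).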
>> >>
>> >> On 27 August 2014 11:29,  <[email protected]> wrote:
>> >>> Author: tlinnet
>> >>> Date: Wed Aug 27 11:29:24 2014
>> >>> New Revision: 25330
>> >>>
>> >>> URL: http://svn.gna.org/viewcvs/relax?rev=25330&view=rev
>> >>> Log:
>> >>> Tried to implement the Jacobian from C-code.
>> >>>
>> >>> This, though, also reports errors which are too small.
>> >>>
>> >>> Maybe some scaling is wrong.
>> >>>
>> >>> task #7822 (https://gna.org/task/index.php?7822): Implement user
>> >>> function to estimate R2eff and associated errors for exponential curve
>> >>> fitting.
>> >>>
>> >>> Modified:
>> >>>     trunk/specific_analyses/relax_disp/estimate_r2eff.py
>> >>>
>> >>> Modified: trunk/specific_analyses/relax_disp/estimate_r2eff.py
>> >>> URL:
>> >>> http://svn.gna.org/viewcvs/relax/trunk/specific_analyses/relax_disp/estimate_r2eff.py?rev=25330&r1=25329&r2=25330&view=diff
>> >>>
>> >>> ==============================================================================
>> >>> --- trunk/specific_analyses/relax_disp/estimate_r2eff.py        (original)
>> >>> +++ trunk/specific_analyses/relax_disp/estimate_r2eff.py        Wed Aug 27 11:29:24 2014
>> >>> @@ -42,7 +42,7 @@
>> >>>  from specific_analyses.relax_disp.variables import MODEL_R2EFF
>> >>>  from specific_analyses.relax_fit.optimisation import func_wrapper, dfunc_wrapper, d2func_wrapper
>> >>>  from target_functions.chi2 import chi2_rankN
>> >>> -from target_functions.relax_fit import setup
>> >>> +from target_functions.relax_fit import jacobian, setup
>> >>>
>> >>>
>> >>>  # Scipy installed.
>> >>> @@ -734,7 +734,7 @@
>> >>>      E.set_settings_minfx(min_algor=min_algor)
>> >>>
>> >>>      # Do C code
>> >>> -    do_C = False
>> >>> +    do_C = True
>> >>>
>> >>>      if do_C:
>> >>>          # Initialise the function to minimise.
>> >>> @@ -766,19 +766,27 @@
>> >>>      param_vector, chi2, iter_count, f_count, g_count, h_count, warning = results_minfx
>> >>>
>> >>>      # Get the Jacobian.
>> >>> -    # First make a call to the Jacobian function, which store it in the class.
>> >>> -    E.func_exp_grad(params=param_vector)
>> >>> -    jacobian_matrix = deepcopy(E.jacobian_matrix)
>> >>> -
>> >>> +    if do_C:
>> >>> +        # First make a call to the Jacobian function, which store it in the class.
>> >>> +        jacobian_matrix = transpose(asarray( jacobian(param_vector) ))
>> >>> +
>> >>> +        # Compare with python code.
>> >>> +        #E.func_exp_grad(params=param_vector)
>> >>> +        #jacobian_matrix2 = deepcopy(E.jacobian_matrix)
>> >>> +        #print jacobian_matrix
>> >>> +        #print " "
>> >>> +        #print jacobian_matrix2
>> >>> +    else:
>> >>> +        jacobian_matrix = deepcopy(E.jacobian_matrix)
>> >>> +
>> >>> +    # Get the co-variance
>> >>> +    pcov = E.multifit_covar(J=jacobian_matrix)
>> >>> +
>> >>> +    # To compute one standard deviation errors on the parameters, take the square root of the diagonal covariance.
>> >>> +    param_vector_error = sqrt(diag(pcov))
>> >>>      # Set error to inf.
>> >>>      #param_vector_error = [inf, inf]
>> >>>
>> >>> -    # Get the co-variance
>> >>> -    pcov = E.multifit_covar(J=jacobian_matrix)
>> >>> -
>> >>> -    # To compute one standard deviation errors on the parameters, take the square root of the diagonal covariance.
>> >>> -    param_vector_error = sqrt(diag(pcov))
>> >>> -
>> >>>      # Pack to list.
>> >>>      results = [param_vector, param_vector_error, chi2, iter_count, f_count, g_count, h_count, warning]
>> >>>
>> >>>
>> >>>
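
To make the flow of the diff above concrete, here is a self-contained sketch
of the same steps for the two-parameter exponential I(t) = I0*exp(-R2eff*t):
build the Jacobian, form a covariance matrix from it, and take the square
root of its diagonal as the one standard deviation parameter errors.  The
helper names and the choice of weight matrix are assumptions for
illustration only; this is not relax's func_exp_grad() or multifit_covar():

    from numpy import asarray, diag, dot, exp, sqrt, transpose
    from numpy.linalg import inv

    def exp_jacobian(params, times):
        # Partial derivatives of I(t) = I0*exp(-R2eff*t), assuming the
        # parameter vector is ordered as (R2eff, I0).
        r2eff, i0 = params
        times = asarray(times)
        d_dr2eff = -times * i0 * exp(-r2eff * times)
        d_di0 = exp(-r2eff * times)

        # One row per time point, one column per parameter.
        return transpose(asarray([d_dr2eff, d_di0]))

    def covar_and_errors(J, weights):
        # Weighted covariance matrix pcov = (J^T.W.J)^-1.  How the diagonal
        # weight matrix W should be built from the peak intensity errors is
        # exactly the open question in this thread, so it is left to the caller.
        W = diag(asarray(weights))
        pcov = inv(dot(dot(J.T, W), J))

        # One standard deviation parameter errors: square root of the diagonal.
        return pcov, sqrt(diag(pcov))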

_______________________________________________
relax (http://www.nmr-relax.com)

This is the relax-devel mailing list
[email protected]

To unsubscribe from this list, get a password
reminder, or change your subscription options,
visit the list information page at
https://mail.gna.org/listinfo/relax-devel
