Yes, higher R factors are the usual reason people don't like I-based
refinement!

Anyway, refining against Is doesn't solve the problem; it only postpones
it: you still need the Fs for maps! (though errors in the Fs may be less
critical at that stage).
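
A rough sketch in Python (invented numbers, not any program's actual
algorithm) of why the I -> F step still bites: a naive square root either
fails or silently biases the weak data, which is essentially the problem
Truncate/cTruncate were written to handle.

    import numpy as np

    # Toy background-subtracted intensities; the weak ones can be negative.
    # (Values are invented for illustration only.)
    i_obs = np.array([980.5, 250.0, 12.3, -4.7, -0.8])

    # Naive amplitude for map coefficients: sqrt(I) is undefined for I < 0 ...
    with np.errstate(invalid="ignore"):
        f_naive = np.sqrt(i_obs)                    # NaN for the negative entries

    # ... and simply clamping to zero gives a biased estimate for exactly
    # the weak reflections whose treatment matters most.
    f_clamped = np.sqrt(np.clip(i_obs, 0.0, None))

    print(f_naive)    # [31.31 15.81  3.51   nan   nan]
    print(f_clamped)  # [31.31 15.81  3.51  0.    0.  ]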

-- Ian


On 20 June 2013 17:20, Dale Tronrud <det...@uoxray.uoregon.edu> wrote:

>    If you are refining against F's you have to find some way to avoid
> calculating the square root of a negative number.  That is why people
> have historically rejected negative I's and why Truncate and cTruncate
> were invented.
>
>    When refining against I, the calculation of (Iobs - Icalc)^2 couldn't
> care less if Iobs happens to be negative.
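
A minimal numerical illustration of the two targets (Python, invented
numbers; real targets are weighted/maximum-likelihood, but the square-root
issue is the same):

    import numpy as np

    # Invented observed/calculated values for three reflections; the weak
    # one has a negative Iobs, as happens after background subtraction.
    i_obs  = np.array([120.0, -3.5, 45.2])
    i_calc = np.array([115.0,  1.2, 50.0])

    # Intensity target: (Iobs - Icalc)^2 is perfectly well defined for Iobs < 0.
    resid_I = np.sum((i_obs - i_calc) ** 2)

    # Amplitude target needs Fobs = sqrt(Iobs), which does not exist for the
    # negative entry, hence the historical rejection/truncation of such data.
    with np.errstate(invalid="ignore"):
        f_obs  = np.sqrt(i_obs)     # NaN for Iobs < 0
        f_calc = np.sqrt(i_calc)
    resid_F = np.sum((f_obs - f_calc) ** 2)

    print(resid_I)   # finite
    print(resid_F)   # nan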
>
>    As for why people still refine against F...  When I was distributing
> a refinement package, it could refine against I, but no one wanted to do
> that.  The "R values" ended up higher, but they were looking at R values
> calculated from F's.  Of course the F-based R values are lower when you
> refine against F's, but that means nothing.
>
>    If we could get the PDB to report both the F and I based R values
> for all models maybe we could get a start toward moving to intensity
> refinement.
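
For what it's worth, a toy comparison (Python, invented numbers, no scaling
or cross-validation bookkeeping) of why F-based and I-based R values aren't
on the same scale: for comparable agreement, relative differences in I are
roughly twice those in F, so R(I) naturally comes out roughly twice R(F).

    import numpy as np

    def r_factor(obs, calc):
        # Conventional R = sum|obs - calc| / sum(obs)
        return np.sum(np.abs(obs - calc)) / np.sum(obs)

    # Invented intensities with roughly 10% discrepancies.
    i_obs  = np.array([400.0, 90.0, 12.0, 2500.0])
    i_calc = np.array([440.0, 81.0, 13.2, 2250.0])

    r_on_I = r_factor(i_obs, i_calc)
    r_on_F = r_factor(np.sqrt(i_obs), np.sqrt(i_calc))

    # Same "fit", two very different-looking numbers:
    print(f"R(I) = {r_on_I:.3f}")   # ~0.10
    print(f"R(F) = {r_on_F:.3f}")   # ~0.05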
>
> Dale Tronrud
>
>
> On 06/20/2013 09:06 AM, Douglas Theobald wrote:
>
>> Just trying to understand the basic issues here.  How could refining
>> directly against intensities solve the fundamental problem of negative
>> intensity values?
>>
>>
>> On Jun 20, 2013, at 11:34 AM, Bernhard Rupp <hofkristall...@gmail.com>
>> wrote:
>>
>>>> As a maybe better alternative, we should (once again) consider refining
>>>> against intensities (and I guess George Sheldrick would agree here).
>>>>
>>>
>>> I have a simple question - what exactly, short of some sort of historical
>>> inertia (or memory lapse), is the reason NOT to refine against intensities?
>>>
>>> Best, BR
>>>
>>
