I guess you mean higher redundancy/completeness in the HIGHER resolution
shells.

Rather than decreasing that sigma cutoff, a better solution is to increase
the profile fitting radius. The default value is fine for most lab-based
detectors, but too small for most (larger) synchrotron detectors. Enlarging
the radius lets strong reflections that are simply further away contribute
to the average profiles, instead of admitting weaker reflections as
lowering the cutoff would. In my hands this has often raised the
completeness of the high-resolution shell from <30% to >90%, because weak
reflections are no longer ignored.
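For what it's worth, if you drive Denzo directly instead of through the
GUI, I believe the relevant keywords are "profile fitting radius" (in mm)
and "weak level" (what the HKL2000 GUI exposes as the Refinement Sigma
Cutoff). I am quoting both from memory, and the values below are only
illustrative, so check the HKL/Denzo manual and your detector's defaults:

    profile fitting radius 30.0
    weak level 5.0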
There is a button in one of the graphic windows that shows the profile
fitting radius as a circle following the mouse pointer around. Make the
radius large enough that, anywhere you point within your resolution limit,
5-10 strong reflections fall inside that circle.
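To make that check concrete, here is a toy Python sketch (my own
illustration with simulated spot positions and intensities, not Denzo's
actual algorithm; the 5-neighbour threshold is just the rule of thumb
above) that counts, for each weak spot, how many strong spots fall within
the profile fitting radius:

    # Toy illustration only -- simulated spots, not Denzo's real algorithm.
    import numpy as np

    rng = np.random.default_rng(0)

    # 2000 fake spots on a 300x300 mm detector; I/sigma falls off
    # towards the edge, as it does near the resolution limit.
    n = 2000
    xy = rng.uniform(0.0, 300.0, size=(n, 2))     # spot centres, mm
    edge = np.linalg.norm(xy - 150.0, axis=1)     # distance from centre
    i_sig = rng.exponential(20.0, size=n) * (1.0 - edge / 250.0)

    def orphaned_weak(xy, i_sig, sigma_cutoff=5.0, fit_radius=10.0):
        """Weak spots with fewer than 5 strong spots within fit_radius."""
        strong = i_sig >= sigma_cutoff
        strong_xy = xy[strong]
        bad = np.zeros(len(xy), dtype=bool)
        for i in np.flatnonzero(~strong):
            d = np.linalg.norm(strong_xy - xy[i], axis=1)
            bad[i] = np.count_nonzero(d <= fit_radius) < 5
        return bad

    for radius in (10.0, 30.0):
        lost = orphaned_weak(xy, i_sig, fit_radius=radius)
        print(f"radius {radius:4.1f} mm: {lost.sum()} orphaned weak spots")

With the larger radius, far fewer weak spots are left without enough strong
neighbours, which shows up as the jump in outer-shell completeness.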

Jose.



"Ed Pozharski" <epozh...@umaryland.edu> wrote:
> On Thu, 2009-10-22 at 10:18 -0400, protein.chemist protein.chemist
> wrote:
>> What is the Sigma Cutoff that one should use for data processing using
>> HKL2000?
>> 
> 
> Since you say HKL2000, I assume that you mean the "Refinement Sigma
> Cutoff" in the Index tab.  The parameter, imu, determines which
> reflections will be considered "strong" and used in parameter refinement
> and (?) profile fitting.  The default value is 5.0, which is just fine
> for good data.  I do, however, routinely set it to a lower value of 3.0,
> since in my observation you then get higher redundancy/completeness in
> lower resolution shells.
> There is some evidence that the mechanism here is related to rejections
> due to incomplete profiles.  Obviously, as you reach the outer rim,
> strong reflections become sparse, and if you are also using the
> relatively small default value of the profile fitting radius, a large
> number of reflections may be rejected because Denzo can't calculate
> average profiles in their vicinity (I expected that in the absence of a
> profile the integrated intensity would be used instead, but perhaps
> that's not the case).
> 
>> Is there a minimum or maximum value?
>> 
> 
> I'd say it makes no sense to go below 1.0, but you can certainly try and
> see what happens.  The upper limit is obviously set by the point where
> you no longer have enough strong reflections for robust refinement of
> parameters.  The absolute values will, of course, vary from dataset to
> dataset.
> 
> 
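To put a number on that rejection mechanism, the toy sketch earlier in
this mail can simply be rerun with a lower cutoff (again purely
illustrative, not Denzo itself):

    for cutoff in (5.0, 3.0):
        lost = orphaned_weak(xy, i_sig, sigma_cutoff=cutoff)
        print(f"cutoff {cutoff:3.1f} sigma: {lost.sum()} orphaned weak spots")

With the cutoff at 3.0 more spots count as strong, so fewer weak spots are
left without profiles in their vicinity, consistent with the higher
completeness Ed describes.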


--
***************************
Jose Antonio Cuesta-Seijo
Biophysical Chemistry Group
Department of Chemistry
University of Copenhagen
Universitetsparken 5
DK-2100 Copenhagen, Denmark
Tlf: +45-35320261
***************************
