The flux from the spots falls off as the square of the distance as well. Assuming that flux at the detector is linear with respect to measured intensity, I'm not sure where the benefit would be. I'm also assuming an ideal beam and ignoring other sources of noise.
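Here is a minimal numeric sketch of what I mean (Python; the fluxes, solid angles, and distances are arbitrary made-up values, and it only models the far-field limit where beam divergence dominates the spot size):

# Once any nonzero divergence dominates, a Bragg spot illuminates an
# area that grows as d**2, just like the diffuse background, so the
# per-pixel signal-to-background ratio is independent of distance.
# All numbers below are arbitrary illustrative values.

def per_pixel_intensity(total_flux, solid_angle, distance):
    """Flux per unit detector area for scatter filling a fixed solid angle."""
    illuminated_area = solid_angle * distance ** 2  # area grows as d^2
    return total_flux / illuminated_area

for d in (100.0, 200.0, 400.0):  # detector distances, mm
    spot = per_pixel_intensity(total_flux=1e6, solid_angle=1e-6, distance=d)
    bkg = per_pixel_intensity(total_flux=1e4, solid_angle=1e-2, distance=d)
    print(f"d = {d:5.0f} mm   spot/background per-pixel ratio = {spot / bkg:.0f}")

# Prints the same ratio at every distance: with an ideal beam the
# inverse-square dilution hits spots and background alike.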

James



On Nov 23, 2009, at 2:54 PM, Richard Gillilan wrote:

It seems to be widely known and observed that diffuse background scattering decreases more rapidly with increasing detector-to-sample distance than Bragg reflections do. For example, Jim Pflugrath, in his 1999 paper (Acta Cryst. D55, 1718-1725), says: "Since the X-ray background falls off as the square of the distance, the expectation is that a larger crystal-to-detector distance is better for reduction of the X-ray background. ..."

Does anyone know of a more rigorous discussion of why background scatter fades with distance while Bragg reflections remain collimated?


Richard Gillilan
MacCHESS
