I mean I'm not assuming an ideal beam.

On Nov 23, 2009, at 2:54 PM, Richard Gillilan wrote:

It seems to be widely known and observed that diffuse background scattering decreases more rapidly with increasing detector-to-sample distance than Bragg reflections. For example, Jim Pflugrath, in his 1999 paper (Acta Cryst 1999 D55 1718-1725) says "Since the X-ray background falls off as the square of the distance, the expectation is that a larger crystal-to-detector distance is better for reduction of the x-ray background. ..."

Does anyone know of a more rigorous discussion of why background scatter fades with distance while Bragg reflections remain collimated?
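The usual hand-waving argument can be made concrete with a toy model (my sketch, not from the thread): diffuse scatter diverges from the sample, so the photons landing on a fixed-size pixel fall off as 1/d^2, whereas an ideally collimated Bragg reflection keeps a constant cross-section, so its integrated spot counts stay roughly constant. The reference values `b0`, `d0`, and `i0` below are made-up numbers for illustration only.

```python
def background_per_pixel(d, b0=1000.0, d0=100.0):
    """Diffuse scatter spreads over a growing sphere, so counts on a
    fixed-size pixel drop as the inverse square of the distance d (mm).
    b0 is an assumed per-pixel background at reference distance d0."""
    return b0 * (d0 / d) ** 2

def bragg_per_spot(d, i0=5000.0):
    """A perfectly collimated reflection keeps a constant footprint, so
    its integrated counts per spot are independent of distance (idealized)."""
    return i0

for d in (100.0, 200.0, 400.0):
    b = background_per_pixel(d)
    s = bragg_per_spot(d)
    print(f"d = {d:4.0f} mm: background/pixel = {b:7.1f}, "
          f"signal/background = {s / b:6.1f}")
```

Under this idealization the signal-to-background ratio improves as d^2 (doubling the distance quarters the per-pixel background), which is the expectation Pflugrath describes; a real beam's divergence and the finite mosaicity of the crystal make Bragg spots spread somewhat with distance, so the gain is less than the ideal d^2.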
Richard Gillilan
MacCHESS
