Dear Jatin,

The optimum mue*d of 2.x is not derived from simple photon counting statistics alone. As Matt pointed out, for transmission measurements at a synchrotron beamline in conventional scanning mode this is seldom an issue. Nevertheless, one should avoid measuring subtle changes of absorption at the extreme ends, i.e. at transmissions near 0 % or 100 %. In optical photometry this is described by the more or less famous "Ringbom plots", which show how the accuracy of quantitative analysis by absorption measurements (usually, but not necessarily, in the UV/Vis) depends on the total absorption of the sample.

This time the number is only close to 42: the optimum transmission is 36.8 % (mue = 1). So, to achieve the highest accuracy in the determination of a small Delta c (c = concentration), you should try to measure samples with transmissions near this value (in fact the minimum is broad, and transmissions between 0.2 and 0.7 are ok). In our case we are not interested in the concentration of the absorber as such, but in (very) small changes of the transmission resp. absorption of our samples. Put in terms of the Bouguer-Lambert-Beer law, in our case mue = -ln(I1/I0) is a function of the absorption coefficient (mue0), while the concentration of the absorber and the thickness (d) of the sample are constant:

-ln(I1/I0) = mue0 * c * d
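The 36.8 % optimum quoted above can be checked numerically. A minimal sketch (assuming, as in the Ringbom treatment, a constant absolute uncertainty sigma_T on the measured transmission T, so that the error of mue = -ln(T) scales as 1/T; the grid and variable names are mine, not from any particular analysis code):

```python
import numpy as np

# Relative error of mue = -ln(T), assuming a constant absolute
# uncertainty sigma_T in the measured transmission T (the classic
# Ringbom-plot assumption): sigma(mue) = sigma_T / T, so the
# relative error sigma(mue)/mue is proportional to 1 / (T * (-ln T)).
T = np.linspace(0.01, 0.99, 9801)
rel_err = 1.0 / (T * (-np.log(T)))

T_opt = T[np.argmin(rel_err)]
print(f"optimum transmission: {T_opt:.3f}")       # close to 1/e = 0.368
print(f"corresponding mue:    {-np.log(T_opt):.3f}")  # close to 1
```

The minimum is indeed shallow: the curve stays within roughly 50 % of its minimum value over the whole 0.2 to 0.7 transmission range.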

But then: if the optimum is a mue between 0.35 and 1.6, why do we all measure successfully (ok, more or less ;-) ) with samples having a mue between 2 and 3? And 0.35 seems desperately small to me! Maybe sample homogeneity is an issue?
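One hedged guess, as a sketch only: if the dominant noise is not a constant absolute error on T but Poisson counting noise on the transmitted beam (N1 = N0*T counts, a pure assumption here), the same minimisation shifts the optimum to T = exp(-2), i.e. mue = 2:

```python
import numpy as np

# Sketch under the assumption of pure shot noise on the transmitted
# beam: with N1 = N0*T detected counts, sigma(mue) = 1/sqrt(N0*T),
# so the relative error of mue = -ln(T) is proportional to
# 1 / (sqrt(T) * (-ln T)).
T = np.linspace(0.001, 0.999, 99801)
rel_err = 1.0 / (np.sqrt(T) * (-np.log(T)))

T_opt = T[np.argmin(rel_err)]
print(f"optimum transmission: {T_opt:.3f}")       # near exp(-2) = 0.135
print(f"corresponding mue:    {-np.log(T_opt):.3f}")  # near 2
```

Which noise model applies (constant sigma_T, shot noise, or something in between) depends on the detectors and the beamline, so this is no more than one possible piece of the puzzle.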

Cheers,
Edmund Welter

_______________________________________________
Ifeffit mailing list
Ifeffit@millenia.cars.aps.anl.gov
http://millenia.cars.aps.anl.gov/mailman/listinfo/ifeffit