I guess this discussion has already died down, but I couldn't find a moment to reply sooner :-)

As Prague was already mentioned, let me try to summarize what I think about 
this subject and what I said there (let's hope I actually remember it :-)):
1. A careful line-broadening analysis (at this point in time) is better done 
outside Rietveld refinement.
2. A physical model is preferable to a phenomenological model for analyzing 
line broadening.

However, since we are discussing size-strain analysis in Rietveld here:
3. Rietveld obviously needs some kind of line-broadening modeling in order to 
at least correct for sample broadening effects (especially anisotropic ones) to 
extract correct integrated intensities for crystal-structure refinement. Thus, 
any model that works is good.
4. Rietveld needs to have a line-broadening model that works for an arbitrary 
crystal structure (up to triclinic) and arbitrary sample (i.e. many possible 
sources of broadening could be present in a given sample). Therefore, a 
phenomenological model is the only one available at this point, as physical 
models are still struggling with cubic (or hexagonal) structures and a very 
limited spectrum of physical sources causing broadening.

In conclusion:
5. I think that the work done by Nick Armstrong and others is definitely the way 
to go, but there is also a long way to go before we get to the level mentioned 
under 4 (I certainly won't live to see it :-)).
6. I also believe that (even when 5 is fulfilled) diffraction will often need 
additional information from complementary characterization methods 
(e.g. TEM, SEM, ...) to completely and accurately characterize the defects in a 
sample: we may calculate the most probable solution, but we will often not be 
able to discriminate between other very likely solutions; that is, the most 
probable solution is very often not significantly different from other physically 
plausible solutions (the lognormal and gamma examples already mentioned).
7. The previous point implies that trying to do "too much" with diffraction 
data alone can actually be dangerous. One can find too many dead-wrong numbers in 
the literature obtained with some of the physical models (dislocation 
densities, for instance), because the real physical cause of broadening was 
probably different and/or there was a strong correlation between refinable 
parameters that depend on the diffraction angle in a similar way.
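
As a minimal numerical sketch of that last point (my own illustration, with an 
arbitrary 2-theta range and no real data): the size contribution to the breadth 
varies roughly as 1/cos(theta) and the strain contribution as tan(theta), and 
over a typical scan range the two angular dependences are nearly collinear, so a 
least-squares refinement can trade one off against the other:

import numpy as np

two_theta = np.linspace(20.0, 120.0, 200)   # degrees; arbitrary, typical scan range
theta = np.radians(two_theta / 2.0)

size_term = 1.0 / np.cos(theta)    # angular dependence of size (Scherrer-type) broadening
strain_term = np.tan(theta)        # angular dependence of microstrain broadening

r = np.corrcoef(size_term, strain_term)[0, 1]
print("correlation of the two angular dependences: %.3f" % r)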

Considering the above:
8. The simple modified TCH model ("triple-Voigt"), used in most major Rietveld 
programs these days, is surprisingly flexible. It works well for most samples 
(the "super-Lorentzian" case is one example where it fails, and there are others, 
but these are less frequent than one would expect) and gives some "numbers" for 
coherent domain size and strain. If we are lucky enough to know more about the 
sample (for instance, if it is known that a lognormal size distribution, a 
certain type of dislocations, etc. is most likely prevalent for the majority of 
grains in the sample), those "numbers" will in many cases let us calculate real 
numbers that relate to the real physical parameters (say, the first moment and 
dispersion of the size distribution), as discussed here previously.
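
As a minimal sketch of such a conversion (assuming spherical crystallites with a 
lognormal diameter distribution, and using the usual moment relations 
<L>_V = (3/4) D0 exp(3.5 s^2) and <L>_A = (2/3) D0 exp(2.5 s^2) between the 
volume- and area-weighted apparent sizes and the lognormal median D0 and 
variance s^2; the input values below are invented):

import math

def lognormal_from_apparent_sizes(L_vol, L_area):
    """Median D0, lognormal variance s2, first moment <D> and dispersion of the
    diameter distribution from the volume- and area-weighted apparent domain
    sizes (same length units as the inputs); needs L_vol/L_area > 9/8."""
    s2 = math.log((8.0 / 9.0) * L_vol / L_area)
    if s2 < 0:
        raise ValueError("L_vol/L_area must exceed 9/8 for a lognormal solution")
    D0 = L_area / ((2.0 / 3.0) * math.exp(2.5 * s2))
    mean_D = D0 * math.exp(s2 / 2.0)
    var_D = D0**2 * math.exp(s2) * (math.exp(s2) - 1.0)
    return D0, s2, mean_D, var_D

# hypothetical apparent sizes (nm) from a (double-)Voigt analysis
print(lognormal_from_apparent_sizes(L_vol=25.0, L_area=18.0))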

Davor
P.S.:
9. The fact that a certain physical model does not yield a particular 
analytical function as the physically broadened profile does not mean that the 
function cannot successfully approximate that profile, since any such calculation 
involves many approximations of different kinds. There have been numerous examples 
in the literature showing that a simple Voigt function was able to approximate 
quite different cases; of course, this is not true in general.
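
A small numerical illustration of this (my own construction, with made-up 
parameters: spherical domains, lognormal diameters with median 10 nm and 
lognormal variance 0.2) is to build the size-broadened profile from the 
volume-weighted size Fourier coefficients and then see how closely a single 
Voigt reproduces it:

import numpy as np
from scipy.special import voigt_profile
from scipy.optimize import curve_fit
from scipy.stats import lognorm

D0, s = 10.0, np.sqrt(0.2)                  # median diameter (nm) and lognormal shape
dist = lognorm(s=s, scale=D0)

# volume-weighted size Fourier coefficients A(L) for a distribution of spheres
L = np.linspace(0.0, 80.0, 400)             # column length (nm)
D = np.linspace(0.01, 80.0, 800)            # integration grid over diameters
w = dist.pdf(D) * D**3                      # volume weighting
A = np.array([np.trapz(np.where(D > l, (1 - 1.5 * l / D + 0.5 * (l / D)**3) * w, 0.0), D)
              for l in L]) / np.trapz(w, D)

# physically broadened profile: cosine transform of A(L)
svals = np.linspace(-0.3, 0.3, 601)         # reciprocal-space variable (1/nm)
I = np.array([2.0 * np.trapz(A * np.cos(2.0 * np.pi * si * L), L) for si in svals])

# fit a single Voigt to that profile and look at the residual
def voigt(x, amp, sigma, gamma):
    return amp * voigt_profile(x, sigma, gamma)

popt, _ = curve_fit(voigt, svals, I, p0=[I.max() / 10.0, 0.02, 0.02], bounds=(0.0, np.inf))
resid = np.max(np.abs(I - voigt(svals, *popt))) / I.max()
print("maximum residual relative to the peak: %.1f%%" % (100.0 * resid))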


> -----Original Message-----
> From: Matteo Leoni [mailto:[EMAIL PROTECTED] 
> Sent: Tuesday, March 29, 2005 4:59 AM
> To: rietveld_l@ill.fr
> Subject: RE: Size Strain In GSAS
> 
> Leonid (and others)
> 
> just my 2 cents to the whole story (as this is a long 
> standing point of  
> discussion: Davor correct me if I'm wrong, but this was also 
> one of the 
> key points in the latest size-strain meeting in Prague, right?)
> 
> > Your recipe for estimating size distribution from the 
> parameters of a
> Voigt-fitted profile is clear and straightforward, but I 
> wonder have
> > you, or someone else, tested it on, say, simulated data for 
> the model
> > of spherical crystallites having lognormal size distribution with
> > various dispersions?
> 
> done several times... if you start from a pattern synthesised from a 
> lognormal and you analyse it using a post-mortem LPA method 
> (i.e. extract 
> a width and a shape parameter and play with them to get some 
> microstructural information), you obtain a result which (in 
> most cases) 
> does not allow you to reconstruct the original data (the Fourier 
> transform of a Voigt and that of the function describing a lognormal 
> distribution of spherical domains are different).
> I would invite all people using ANY "traditional" line 
> profile analysis  
> method to always do this check. Davor already pointed out 
> cases where it 
> works and cases where it does not: according to my experience those 
> belonging to the first category are just a few.
> 
> With a whole pattern approach and working directly with the profile 
> arising from a distribution of domains, in most cases you're able to 
> reconstruct the original distribution without making any 
> assumption on its  
> functional shape (after all, most of the information to do so is 
> contained in the whole pattern, even if it is well hidden).
> 
> Concerning the Bayesian/maxent method, well, it is always a 
> great idea, 
> but unfortunately right now it is not mature enough to cope 
> with a simple 
> problem of combined instrumental, size AND strain broadening (unless 
> something has been done in the last year). So ok it gives you 
> the best 
> result compatible with your hypotheses, but beware that 
> "absence of any 
> other source of broadening" should be listed among them... and 
> I'm not sure 
> this is always the case!
> 
> To put some water on the fire (otherwise it will burn all of 
> us), I think 
> the level of detail one needs on the microstructure conditions the 
> methods one is going to use to extract a result. No need to use highly 
> sophisticated methods to roughly estimate a domain size (with 
> an error up 
> to +/- 50%) or to establish a trend within a homogeneous set 
> of data, or 
> also to obtain a better fit in the Rietveld method.
> 
> Conversely, if a very high level of detail is sought, then I'd forget 
> about a "traditional Rietveld refinement" and start approaching the 
> problem from the microstructure point of view (after all, if one is 
> interested in winning a F1 GP, he'd certainly not go for a Ferrari  
> powered by a John Deere tractor engine!).
> 
> cheers
> 
> Mat
> 
> -------------------------
> Matteo Leoni, PhD
> Department of Materials Engineering
> and Industrial Technologies 
> University of Trento
> 38050 Mesiano (TN)
> ITALY
> 
