Hi Johan,

On Wed, Sep 30, 2015 at 10:53 AM, Johan Nilsson <johan.nils...@chalmers.se>
wrote:

> Hello Matt,
>
> I was able to import pre_edge, autobk, and xftf without problems. Is it
> possible to import feff paths and do a fit to exafs data in a python
> script? I was able to do
>
>    larch.enable_plugins()
>    from larch_plugins.xafs import feffdat
>
>    path1 = feffdat.feffpath('feff0001.dat')
>
> but I could not figure out how to get the feffit_transform and
> feffit_dataset functions to work. Before I upgraded I was able to use those
> two in a python script but I was not able to get the feffit function to run
> to do the actual fit.
>
>
Sorry, using this from Python is not as well documented as it should be.  A
larch script for a simple use of feffit would be:

## simple feffit in larch
# read chi(k)
cu_data = read_ascii('../xafsdata/cu.chi', labels='k, chi')

# define fitting parameter group
pars = group(amp    = param(1, vary=True),
             del_e0 = param(0.1, vary=True),
             sig2   = param(0.002, vary=True),
             del_r  = param(0., vary=True) )

# define a Feff Path, give expressions for Path Parameters
path1 = feffpath('feffcu01.dat',
                 s02    = 'amp',
                 e0     = 'del_e0',
                 sigma2 = 'sig2',
                 deltar = 'del_r')

# set transform / fit ranges
trans = feffit_transform(kmin=3, kmax=17, kw=2, dk=4,
                         window='kaiser', rmin=1.4, rmax=3.0)

# define dataset to include data, pathlist, transform
dset  = feffit_dataset(data=cu_data, pathlist=[path1], transform=trans)

# perform fit
out = feffit(pars, dset)

# print result
print feffit_report(out)
###########################################
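
If you want a quick look at the result in larch, something like this should
work (a rough sketch -- I'm assuming the dataset group ends up with 'data'
and 'model' sub-groups holding k and chi arrays after the fit, which is what
feffit normally fills in; check the feffit docs for the exact names):

# plot k^2-weighted data and best-fit model
plot(dset.data.k,  dset.data.chi  * dset.data.k**2,  label='data', new=True)
plot(dset.model.k, dset.model.chi * dset.model.k**2, label='fit')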


Translating that into pure Python would be:

## simple feffit in python
import larch
from larch import Group, Parameter
from larch_plugins.io import read_ascii
from larch_plugins.xafs import FeffPathGroup
from larch_plugins.xafs.feffit import (TransformGroup, FeffitDataSet,
                                       feffit, feffit_report)

# create a larch session
session = larch.Interpreter()

# read chi(k)
cu_data = read_ascii('../xafsdata/cu.chi', labels='k, chi', _larch=session)

# define fitting parameter group
pars = Group(amp    = Parameter(1, vary=True),
             del_e0 = Parameter(0.1, vary=True),
             sig2   = Parameter(0.002, vary=True),
             del_r  = Parameter(0., vary=True) )

# define a Feff Path, give expressions for Path Parameters
path1 = FeffPathGroup('feffcu01.dat',
                      s02    = 'amp',
                      e0     = 'del_e0',
                      sigma2 = 'sig2',
                      deltar = 'del_r', _larch=session)

# set transform / fit ranges
trans = TransformGroup(kmin=3, kmax=17, kw=2, dk=4,
                       window='kaiser', rmin=1.4, rmax=3.0, _larch=session)

# define dataset to include data, pathlist, transform
dset  = FeffitDataSet(data=cu_data, pathlist=[path1],
                      transform=trans, _larch=session)
# perform fit
out = feffit(pars, dset, _larch=session)

# print result
print feffit_report(out, _larch=session)
##################################################
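
After the fit, the Parameters in 'pars' should carry the best-fit values, so
besides the full report you can pull numbers out directly -- a sketch,
assuming feffit updates each Parameter's 'value' and 'stderr' in place,
which is how the fit normally reports back:

# best-fit values and estimated uncertainties
print('amp    = %.3f +/- %.3f' % (pars.amp.value,   pars.amp.stderr))
print('sigma2 = %.5f +/- %.5f' % (pars.sig2.value,  pars.sig2.stderr))
print('del_r  = %.4f +/- %.4f' % (pars.del_r.value, pars.del_r.stderr))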


So it's FeffPathGroup, TransformGroup, and FeffitDataSet that you were
looking for -- Python classes, with the corresponding larch functions just
returning an instance of these classes.  You also have to pass
'_larch=session' to all of these classes and functions, as they all assume
they can read/write to the session-wide larch symbol table (and, while
fitting, they actually do).  Yeah, a couple of these might be possible to
eliminate, and it's not how you'd do it in pure python -- there you'd
probably build a Feffit class and pass around 'self', which is really not
so different from a "reference to global state".

Anyway, it is doable in non-larch Python.  Suggestions for improvement
welcome!

--Matt
_______________________________________________
Ifeffit mailing list
Ifeffit@millenia.cars.aps.anl.gov
http://millenia.cars.aps.anl.gov/mailman/listinfo/ifeffit
