Hi, I still have one more doubt. Why does split[1] show only 5 chunks as opposed to 6? Isn't split[1] the training set?
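For what it's worth, a leave-one-chunk-out (N-fold) scheme always builds each training portion from N-1 chunks, so with 6 chunks a training set carries 5. Below is a minimal, library-free sketch of that partitioning; it does not use PyMVPA's NFoldSplitter itself, and the chunk labels are made up, so take it only as an illustration of the counting:

```python
# Minimal sketch of leave-one-chunk-out (N-fold) splitting, in plain
# Python -- NOT PyMVPA's actual NFoldSplitter implementation.
chunks = [0, 1, 2, 3, 4, 5]  # six hypothetical chunk labels

splits = []
for test_chunk in chunks:
    # hold one chunk out for testing, train on the remaining N-1
    training = [c for c in chunks if c != test_chunk]
    testing = [test_chunk]
    splits.append((training, testing))

for training, testing in splits:
    print(len(training), len(testing))  # -> 5 1 (for every split)
```

So seeing "uniq: 5 chunks" on one half of a split is what leave-one-out over 6 chunks should produce for the training portion; the held-out portion has 1 chunk.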
-> result = transerror(split[1], split[0])
(Pdb) > /usr/lib/python2.5/site-packages/mvpa/algorithms/cvtranserror.py(176)_call()
-> if clf_hastestdataset and expose_testdataset:
(Pdb) print split[1]
Dataset / float32 825 x 2 uniq: 5 chunks 2 labels
(Pdb) print split[0]
Dataset / float32 326 x 2 uniq: 2 chunks 2 labels

Thanks,
Geethmala

On Mon, Jan 25, 2010 at 11:29 AM, Geethmala <[email protected]> wrote:
> Thanks. I had not done zscore. It seems to work now.
>
> Thanks,
> Geethmala
>
> On Mon, Jan 25, 2010 at 11:23 AM, Michael Hanke <[email protected]> wrote:
>> On Mon, Jan 25, 2010 at 11:14:14AM -0500, Geethmala wrote:
>> > Sorry, my bad. It got stuck again after displaying the following
>> >
>> > [SLC] DBG: Doing 47070 spheres: 3 (2 features) [0%]
>>
>> Looks like the SVM has trouble converging on your data. Does your
>> dataset contain invariant features? Did you zscore, or otherwise
>> normalize the data?
>>
>> Michael
>>
>> --
>> GPG key: 1024D/3144BE0F Michael Hanke
>> http://mih.voxindeserto.de
>>
>> _______________________________________________
>> Pkg-ExpPsy-PyMVPA mailing list
>> [email protected]
>> http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa

