Hi Guys,

Great toolbox, keep up the good work. I have a question about speeding up the 
Searchlight algorithm.

When I run the algorithm, it tends to start off quickly and then progressively 
slow down to a snail's pace. Windows reports that pythonw.exe, at its peak, is 
only consuming ~250 MB of memory (despite the availability of up to 14 GB of 
RAM and a massive pagefile).

My question: is there some way to increase the memory being used and thus 
speed up the process? Windows limits 32-bit programs to 2 GB and allocates 
memory on request, so there seems to be no reason it would refuse Python 
more memory.
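For reference, here is a quick sketch of how the interpreter's bitness and current memory footprint could be checked from within Python; the psutil package is an assumption (it is not part of the standard library):

```python
# Sanity check (a sketch): confirm the interpreter's bitness and report
# its current memory use. Assumes the third-party `psutil` package.
import struct

# Pointer size in bits: 32 for a 32-bit Python, 64 for a 64-bit one.
bits = struct.calcsize("P") * 8
print("Python is running in %d-bit mode" % bits)

try:
    import psutil
    # Resident set size of the current process, in bytes.
    rss = psutil.Process().memory_info().rss
    print("Resident memory: %.1f MB" % (rss / 1024.0 / 1024.0))
except ImportError:
    print("psutil not installed; skipping memory report")
```

A 32-bit result would confirm the 2 GB per-process ceiling regardless of the 16 GB of physical RAM.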

Also, it seems counterintuitive to me that Searchlight would start off quickly 
and then slow down; surely each voxel should take roughly the same amount of 
time? Can you think of anything that might be causing this?
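To confirm the slowdown is real rather than an impression, one could time batches of per-sphere analyses and compare early batches against late ones. This is only a sketch; `do_one_sphere` below is a hypothetical stand-in for whatever work the searchlight performs at each center voxel:

```python
# Sketch: log wall-clock time per batch of sphere analyses so that a
# progressive slowdown shows up as growing batch times.
import time

def batch_times(do_one_sphere, n_spheres, batch=100):
    """Return the elapsed seconds for each consecutive batch of spheres."""
    times = []
    start = time.time()
    for i in range(1, n_spheres + 1):
        do_one_sphere(i)          # stand-in for the per-voxel analysis
        if i % batch == 0:
            now = time.time()
            times.append(now - start)
            start = now
    return times

# Dummy workload with constant cost per sphere: batch times should be flat.
timings = batch_times(lambda i: sum(range(1000)), n_spheres=1000, batch=100)
```

With a constant-cost workload the batch times stay flat; if the real run shows them growing, something is accumulating across iterations.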


OS: Windows Vista Business 64-bit (running Python in 32-bit mode)

Xeon E5440 processor

16 GB RAM



# dataset is the accompanying Haxby study;
# the NIfTI images and mask load fine
from mvpa.suite import *  # PyMVPA convenience imports

clf = LinearNuSVMC(nu=0.1)  # linear nu-SVM classifier
cv = CrossValidatedTransferError(TransferError(clf),
                                 NFoldSplitter())
sl = Searchlight(cv, radius=3)
sl_map = sl(dataset)

Regards,

Dan

_______________________________________________
Pkg-ExpPsy-PyMVPA mailing list
[email protected]
http://lists.alioth.debian.org/mailman/listinfo/pkg-exppsy-pymvpa
