Hi there,

How does one perform an across-subjects analysis on the surface using PyMVPA?

I am currently estimating the noise ceiling on my data (as described by Nili et 
al., 2014). This requires anatomically-aligned data from all of the subjects in 
the experiment. My current approach is to build a single large PyMVPA dataset 
containing data from all template-aligned subjects, assign a chunks sample 
attribute corresponding to subject ID, and then run a searchlight with a custom 
noise-ceiling estimation measure that uses the chunks attribute to split the 
samples by subject.
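
In case it helps, here is a stripped-down sketch of that volumetric pipeline. 
The file names are placeholders, and noise_ceiling() below is only a toy 
stand-in for the real Nili et al. (2014) computation; the point is just the 
structure:

    import numpy as np
    from mvpa2.suite import fmri_dataset, vstack, sphere_searchlight, Dataset

    def noise_ceiling(ds):
        # split the searchlight ROI by subject via the chunks attribute
        subj_means = np.array([ds[ds.sa.chunks == s].samples.mean(axis=0)
                               for s in ds.sa['chunks'].unique])
        # toy "ceiling": correlate each subject's mean pattern with the grand
        # mean (the real measure works on RDMs, but the splitting is the same)
        grand_mean = subj_means.mean(axis=0)
        rs = [np.corrcoef(m, grand_mean)[0, 1] for m in subj_means]
        return Dataset(np.atleast_2d(np.mean(rs)))

    # one dataset per template-aligned subject, all masked identically so
    # that the feature (voxel) space lines up across subjects
    subject_dss = []
    for subj_id, bold_fn in enumerate(['sub01_mni.nii.gz', 'sub02_mni.nii.gz']):
        ds = fmri_dataset(bold_fn, mask='group_mask_mni.nii.gz')
        ds.sa['chunks'] = [subj_id] * len(ds)
        subject_dss.append(ds)

    # a=0: take dataset attributes (mapper etc.) from the first subject;
    # this argument may need adjusting depending on the PyMVPA version
    ds_all = vstack(subject_dss, a=0)

    sl = sphere_searchlight(noise_ceiling, radius=3)  # radius in voxels
    res = sl(ds_all)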

This works great, but I would like to move my analysis to the surface. So for 
each searchlight center _on the surface template_ I would need to get 
surrounding voxels _from all individual subjects._
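
My (possibly naive) picture of what that might look like, reusing ds_all and 
noise_ceiling from the sketch above, is below. The disc_surface_queryengine 
call and the template surface/volume file names are guesses pieced together 
from the example, so I may well have the details wrong:

    from mvpa2.suite import Searchlight
    from mvpa2.misc.surfing.queryengine import disc_surface_queryengine

    radius = 10.0  # float: geodesic radius in mm on the template surface
    qe = disc_surface_queryengine(radius,
                                  'group_mask_mni.nii.gz',  # voxel grid, same
                                                            # space as ds_all
                                  'mni_white.asc',          # template white
                                  'mni_pial.asc')           # template pial

    # the hope: each surface-node ROI returns voxel feature ids that apply to
    # every subject's samples at once, since all subjects share the feature
    # space of ds_all
    sl = Searchlight(noise_ceiling, queryengine=qe)
    res = sl(ds_all)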

It is difficult for me to tell whether this is possible based on reading the 
query engine documentation and the surface searchlight example.

Thanks!

Best,
Beau Sievers

—

Refs:

Nili, H., Wingfield, C., Walther, A., Su, L., Marslen-Wilson, W., & 
Kriegeskorte, N. (2014). A toolbox for representational similarity analysis. 
PLoS Comput Biol, 10(4), e1003553.
