Thank you Michael. Both of those seem like plausible techniques and make
sense. Once again I'm surprised by the flexibility of PyMVPA :)

Thanks!

Cheers
Jason

On Wednesday, March 26, 2014, Michael Hanke <[email protected]> wrote:

> Hi,
>
> On Tue, Mar 25, 2014 at 12:58:27PM -0400, Jason Ozubko wrote:
> > The classifiers in PyMVPA all seem to be targeted at classifying patterns
> > into nominal categories.  Is there anything that can be done when you
> > have a linear "category"?
>
> Two ideas:
>
> 1. Use a regression. PyMVPA handles regressions and classifiers in very
>    similar ways, hence in most cases you can just replace a classifier
>    instance with a regression. Here is an example on how to use any
>    regression algorithm implemented in scikit-learn within PyMVPA:
>
>    http://www.pymvpa.org/examples/skl_regression_demo.html
>
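Independent of PyMVPA, the core move in idea 1 can be sketched with plain
NumPy: fit the patterns against a continuous target instead of class labels
(the data and weights below are made up for illustration):

```python
import numpy as np

# Toy stand-in for fMRI patterns: 40 samples x 5 features, with a
# continuous target (e.g. a position along the linear "category").
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w  # noiseless targets, so least squares recovers true_w

# Ordinary least squares: the regression analogue of training a classifier.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
predictions = X @ w
```

A regression learner in PyMVPA plays the same role; the scikit-learn demo
linked above shows how to drop one into an otherwise unchanged analysis.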
> 2. Keep doing classification, but use a custom error function. So if the
>    "distance" from the target value within your linear "category" is
>    meaningful, you can assign an error function like this one:
>
>    import numpy as np
>
>    def eucd(targets, predictions):
>        return np.sqrt(np.sum((targets - predictions) ** 2))
>
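As a self-contained sketch of that second idea (`linear_error` is a
hypothetical name for illustration, not a PyMVPA built-in), a distance-based
error function scores near misses as smaller errors than distant ones, which
is exactly what a nominal mismatch error cannot do:

```python
import numpy as np

def linear_error(targets, predictions):
    # Scalar Euclidean distance: small when predictions land near the
    # true position on the linear "category".
    return np.sqrt(np.sum((np.asarray(targets) - np.asarray(predictions)) ** 2))

near = linear_error([4, 2], [3, 2])  # off by one step  -> error 1.0
far = linear_error([4, 2], [1, 2])   # off by three steps -> error 3.0
```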
> Does that make sense in your context?
>
> Michael
>
> --
> J.-Prof. Dr. Michael Hanke
> Psychoinformatik Labor,    Institut für  Psychologie II
> Otto-von-Guericke-Universität Magdeburg,  Universitätsplatz 2, Geb.24
> Tel.: +49(0)391-67-18481 Fax: +49(0)391-67-11947  GPG: 4096R/7FFB9E9B
>
> _______________________________________________
> Pkg-ExpPsy-PyMVPA mailing list
> [email protected]
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa
>
