On Wed, Apr 24, 2013 at 11:50 AM, Ronnie Ghose <[email protected]> wrote:
> sorry but -1 for neural net based things. neural nets are heavily based on
> structure, they're not blackbox afaik.
I am a bit skeptical too, but to be honest, an RBM is not much different from factor analysis except for the assumptions and the training algorithm. Similar cases can be made for other methods too. Convolutional nets and things with flexible structure (i.e. more flexible than just the number of units per layer) are where I would draw the line.

We can have networks with uniform structure and homogeneous node types (i.e. the same kind in all layers except maybe the last). Such a network can be configured with just a couple of parameters, and with good training algorithms it might work reasonably well for some problems. It would also be useful as a baseline before moving to, say, Theano and starting to tie weights and customize cost functions. I doubt we can offer more flexibility than this without moving into DSL territory, which goes against the scikit-learn API, as was simultaneously pointed out in a different thread.

I think it's plausible to merge RBMs and MLPs, and then even to have a simple wrapper that trains a deep network layer by layer as RBMs, implements predict, and then optionally does some discriminative backprop.

Space might be too tight for two out of three GSoC projects being dedicated to neural stuff, though; this is up for debate.

Cheers,
Vlad
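For concreteness, here is a rough sketch of what such a layer-wise wrapper might look like. This is only an illustration of the idea, not a proposed implementation: the class name `DBNClassifier`, its parameters, and the choice of a logistic regression on top (in place of the optional discriminative backprop step) are all assumptions. It uses scikit-learn's `BernoulliRBM` for the greedy pretraining.

```python
# Illustrative sketch (hypothetical class, not part of scikit-learn):
# greedily train a stack of RBMs layer by layer, then fit a supervised
# model on the top-level features so the wrapper can implement predict().

import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression


class DBNClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, layer_sizes=(64, 32), n_iter=10, random_state=None):
        self.layer_sizes = layer_sizes
        self.n_iter = n_iter
        self.random_state = random_state

    def fit(self, X, y):
        self.rbms_ = []
        H = X
        # Greedy layer-by-layer pretraining: each RBM models the hidden
        # representation produced by the previous one.
        for n_units in self.layer_sizes:
            rbm = BernoulliRBM(n_components=n_units, n_iter=self.n_iter,
                               random_state=self.random_state)
            H = rbm.fit_transform(H)
            self.rbms_.append(rbm)
        # Supervised layer on top; a real implementation could instead
        # fine-tune the whole stack with discriminative backprop here.
        self.clf_ = LogisticRegression().fit(H, y)
        return self

    def transform(self, X):
        # Propagate the data through the stack of trained RBMs.
        H = X
        for rbm in self.rbms_:
            H = rbm.transform(H)
        return H

    def predict(self, X):
        return self.clf_.predict(self.transform(X))
```

Note that this whole pipeline is expressible with the standard estimator interface (a couple of constructor parameters, `fit`, `predict`), which is the point: uniform structure keeps it inside the scikit-learn API without any DSL.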
