On 04/06/2012 08:04 AM, xinfan meng wrote:
On Fri, Apr 6, 2012 at 1:57 PM, David Warde-Farley <[email protected]> wrote:

    On 2012-04-05, at 5:17 PM, Vlad Niculae <[email protected]> wrote:

    > http://ufldl.stanford.edu/wiki/images/8/84/SelfTaughtFeatures.png
    >
    > It is easy to set up the skeleton of such an example, and when the implementation is good it will magically run :)

    I have stared at way too many MNIST weight visualizations over the years, and this one strikes me as "too good" to have been produced by supervised SGD. Based on the filename, it sounds like some form of "self-taught learning", which would imply an unsupervised architecture of some kind.

Interesting. Does good visualization lead to good prediction?
No ;) But it looks oh so good in the paper?!

Btw, I looked at the exercise sheet, and the visualisation of the weights there looks quite different and more like what I would expect.
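For anyone wanting to reproduce this kind of figure: the weight images in question are typically made by reshaping each learned weight vector back into a 28x28 image and tiling the results in a grid. Below is a minimal sketch of that tiling step (the `tile_weights` helper and the random stand-in weights are my own illustration, not code from scikit-learn or the UFLDL exercise):

```python
import numpy as np

def tile_weights(W, img_shape=(28, 28), pad=1):
    """Arrange each row of W as an image tile in a roughly square grid.

    W has shape (n_filters, h * w); returns a single 2D mosaic array
    suitable for plt.imshow(..., cmap='gray').
    """
    n, d = W.shape
    h, w = img_shape
    assert d == h * w, "each row must reshape to img_shape"
    cols = int(np.ceil(np.sqrt(n)))
    rows = int(np.ceil(n / cols))
    mosaic = np.zeros((rows * (h + pad) - pad, cols * (w + pad) - pad))
    for i in range(n):
        r, c = divmod(i, cols)
        img = W[i].reshape(h, w)
        # Normalize each filter to [0, 1] independently so tiles are comparable.
        img = (img - img.min()) / (np.ptp(img) + 1e-12)
        mosaic[r * (h + pad):r * (h + pad) + h,
               c * (w + pad):c * (w + pad) + w] = img
    return mosaic

# 100 random "weight vectors" standing in for learned MNIST filters.
W = np.random.randn(100, 784)
mosaic = tile_weights(W)
print(mosaic.shape)  # (289, 289): a 10x10 grid of 28x28 tiles with 1px padding
```

With real learned weights, supervised SGD on MNIST tends to give noisy, template-like tiles, while unsupervised feature learning (as in the linked self-taught-learning figure) tends to give cleaner edge- and stroke-like filters, which is the distinction being discussed above.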
_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
