[
https://issues.apache.org/jira/browse/MAHOUT-703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13036022#comment-13036022
]
Hector Yee commented on MAHOUT-703:
-----------------------------------
Yeah, I was planning to do L2 regularization first. L1 can be tricky due to edge
cases like crossing / following the simplex, so I'll enforce sparsity with
Andrew Ng's bias-tweaking trick first.
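
For reference, a minimal sketch in plain Java of the two ideas above: an SGD
step with L2 regularization (weight decay folded into the gradient update), and
the bias-tweaking heuristic that nudges each hidden unit's bias toward a target
firing rate. None of this is Mahout API; names like learningRate, lambda, and
targetSparsity are illustrative only.

public final class SgdSketch {

  /** One SGD step with L2 regularization: w -= eta * (grad + lambda * w). */
  static void updateWeights(double[] w, double[] grad,
                            double learningRate, double lambda) {
    for (int i = 0; i < w.length; i++) {
      w[i] -= learningRate * (grad[i] + lambda * w[i]);
    }
  }

  /**
   * Bias-tweaking sparsity heuristic: given a running mean of each hidden
   * unit's activation, push the bias down when the unit fires more often
   * than the target rate, and up when it fires less often.
   */
  static void adjustBiases(double[] bias, double[] meanActivation,
                           double targetSparsity, double biasRate) {
    for (int j = 0; j < bias.length; j++) {
      bias[j] -= biasRate * (meanActivation[j] - targetSparsity);
    }
  }
}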
> Implement Gradient machine
> --------------------------
>
> Key: MAHOUT-703
> URL: https://issues.apache.org/jira/browse/MAHOUT-703
> Project: Mahout
> Issue Type: New Feature
> Components: Classification
> Affects Versions: 0.6
> Reporter: Hector Yee
> Priority: Minor
> Labels: features
> Original Estimate: 72h
> Remaining Estimate: 72h
>
> Implement a gradient machine (aka 'neural network') that can be used for
> classification or auto-encoding.
> It will just have an input layer; an identity, sigmoid, or tanh hidden layer;
> and an output layer.
> Training is done by stochastic gradient descent (possibly mini-batch later).
> Sparsity will optionally be enforced by tweaking the biases of the hidden units.
> For now it will go in classifier/sgd, and the auto-encoder will wrap it in the
> filter unit later on.
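
For illustration, a minimal sketch of the forward pass described in the issue:
one hidden layer with a selectable identity / sigmoid / tanh activation,
followed by a linear output layer. Plain double arrays stand in for Mahout's
math types, and all names here are hypothetical, not the eventual
classifier/sgd implementation.

import java.util.function.DoubleUnaryOperator;

public final class GradientMachineSketch {
  static final DoubleUnaryOperator IDENTITY = x -> x;
  static final DoubleUnaryOperator SIGMOID = x -> 1.0 / (1.0 + Math.exp(-x));
  static final DoubleUnaryOperator TANH = Math::tanh;

  /** hidden = act(W1 * input + b1); output = W2 * hidden + b2. */
  static double[] forward(double[] input,
                          double[][] w1, double[] b1,
                          double[][] w2, double[] b2,
                          DoubleUnaryOperator act) {
    double[] hidden = affine(input, w1, b1);
    for (int j = 0; j < hidden.length; j++) {
      hidden[j] = act.applyAsDouble(hidden[j]);
    }
    return affine(hidden, w2, b2);
  }

  /** Computes out[j] = b[j] + sum_i w[j][i] * x[i]. */
  private static double[] affine(double[] x, double[][] w, double[] b) {
    double[] out = new double[b.length];
    for (int j = 0; j < out.length; j++) {
      double sum = b[j];
      for (int i = 0; i < x.length; i++) {
        sum += w[j][i] * x[i];
      }
      out[j] = sum;
    }
    return out;
  }
}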