[ https://issues.apache.org/jira/browse/SINGA-204?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351081#comment-15351081 ]

ASF subversion and git services commented on SINGA-204:
-------------------------------------------------------

Commit 97648a60a9ff9feb55ef1a8b86e5372837e0b4f8 in incubator-singa's branch 
refs/heads/dev from [~flytosky]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-singa.git;h=97648a6 ]

SINGA-204 Support the training of feed-forward neural nets

Implement Alexnet model for Cifar10 https://code.google.com/p/cuda-convnet/
The accuracy on the test data is 0.82 (the same as reported in the above 
link).
NOTE:
1. do not convert the raw byte data to float directly. Instead, cast
the data to uint8_t first and then to float (see the first sketch below).
2. it is necessary to subtract the max value before computing softmax;
numeric errors (nan) are likely to happen otherwise (see the softmax sketch below).
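For reference, a minimal sketch of note 1, assuming the raw CIFAR-10 bytes
arrive as plain char (the actual reader code in the example may differ; the
function name BytesToFloat is illustrative, not part of SINGA). Casting a
signed char straight to float maps pixel values above 127 to negatives, so
cast to uint8_t first:

#include <cstddef>
#include <cstdint>
#include <vector>

// Convert raw image bytes to floats. Casting a (possibly signed) char
// directly to float turns pixel values above 127 into negative numbers;
// casting to uint8_t first keeps the full 0..255 range.
std::vector<float> BytesToFloat(const char* raw, std::size_t n) {
  std::vector<float> out(n);
  for (std::size_t i = 0; i < n; ++i)
    out[i] = static_cast<float>(static_cast<std::uint8_t>(raw[i]));
  return out;
}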

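And a sketch of the max subtraction in note 2, written as a standalone CPU
softmax rather than the kernel used in SINGA:

#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Numerically stable softmax over one row of logits. Without the max
// subtraction, std::exp of a large logit overflows to inf and the final
// division produces nan; shifting by the max keeps every exponent <= 0.
std::vector<float> Softmax(const std::vector<float>& logits) {
  float max_v = *std::max_element(logits.begin(), logits.end());
  std::vector<float> prob(logits.size());
  float sum = 0.0f;
  for (std::size_t i = 0; i < logits.size(); ++i) {
    prob[i] = std::exp(logits[i] - max_v);
    sum += prob[i];
  }
  for (float& p : prob) p /= sum;
  return prob;
}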

> Support the training of feed-forward neural nets
> ------------------------------------------------
>
>                 Key: SINGA-204
>                 URL: https://issues.apache.org/jira/browse/SINGA-204
>             Project: Singa
>          Issue Type: New Feature
>            Reporter: wangwei
>            Assignee: wangwei
>
> For feed-forward neural nets, the layers form a directed acyclic graph. 
> For this ticket, we are going to add a FeedForwardNet class to train these 
> nets (a sketch of the intended interface follows after this description), which 
> 1. consists of a set of directly connected layers, 
> 2. provides functions for constructing the net by adding layers one by one, 
> 3. provides access functions for layers and parameters, and 
> 4. provides functions for forward and backward passes.
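
A rough sketch of that interface; the type and method names (Layer, Tensor,
Add, Forward, Backward) are illustrative assumptions and may not match the
FeedForwardNet API that was actually committed:

#include <memory>
#include <utility>
#include <vector>

using Tensor = std::vector<float>;  // stand-in for a real tensor type

class Layer {
 public:
  virtual ~Layer() = default;
  virtual Tensor Forward(const Tensor& in) = 0;         // compute layer output
  virtual Tensor Backward(const Tensor& grad_out) = 0;  // compute input gradient
};

class FeedForwardNet {
 public:
  // 2. construct the net by adding layers one by one (applied in order).
  void Add(std::shared_ptr<Layer> layer) { layers_.push_back(std::move(layer)); }
  // 3. access functions for the layers (parameters would be exposed similarly).
  const std::vector<std::shared_ptr<Layer>>& layers() const { return layers_; }
  // 4a. forward: feed the output of each layer into the next.
  Tensor Forward(Tensor x) {
    for (auto& l : layers_) x = l->Forward(x);
    return x;
  }
  // 4b. backward: propagate gradients in reverse layer order.
  Tensor Backward(Tensor grad) {
    for (auto it = layers_.rbegin(); it != layers_.rend(); ++it)
      grad = (*it)->Backward(grad);
    return grad;
  }

 private:
  std::vector<std::shared_ptr<Layer>> layers_;  // 1. directly connected layers
};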



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
