[ https://issues.apache.org/jira/browse/SYSTEMML-618?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15374046#comment-15374046 ]
Mike Dusenberry commented on SYSTEMML-618:
------------------------------------------

Update: LSTM and RNN layers have been added in [commit d926ea0 | https://github.com/apache/incubator-systemml/commit/d926ea0522af05aed2d3b1f01c5d76eb88c008b8].

> Deep Learning DML Library
> -------------------------
>
>                 Key: SYSTEMML-618
>                 URL: https://issues.apache.org/jira/browse/SYSTEMML-618
>             Project: SystemML
>          Issue Type: New Feature
>            Reporter: Mike Dusenberry
>            Assignee: Mike Dusenberry
>
> This issue tracks the creation of a layers-based *deep learning library* in pure DML.
>
> The library contains layers with simple {{forward}} (function evaluation) and {{backward}} (gradient computation) functions for affine, convolution (starting with 2D), max-pooling, non-linearities (ReLU, sigmoid, softmax, etc.), dropout, loss functions, other layers, optimizers, and gradient checks.
>
> *Examples*: Please see the example *scripts* and *notebooks* in the {{examples}} folder:
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN/examples]
>
> *SystemML-NN*:
> [https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN]
>
> * Layers:
> ** Core:
> *** Affine
> *** Spatial Convolution
> *** LSTM
> *** Max Pooling
> *** RNN
> ** Nonlinearities:
> *** ReLU
> *** Sigmoid
> *** Softmax
> *** Tanh
> ** Loss:
> *** Cross-entropy loss
> *** L1 loss
> *** L2 loss
> *** Log ("Logistic") loss
> ** Regularization:
> *** Dropout
> *** L1 reg
> *** L2 reg
> * Optimizers:
> ** Adagrad
> ** Adam
> ** RMSprop
> ** SGD
> ** SGD w/ Momentum
> ** SGD w/ Nesterov Momentum
> * Tests:
> ** Gradient Checks

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
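The {{forward}}/{{backward}}/gradient-check pattern described above can be sketched as follows. This is a minimal illustration in Python/NumPy (the library itself is written in pure DML); the function names, signatures, and shapes here are hypothetical and are not the library's actual API. It shows an affine layer's forward and backward functions, validated by a central-difference numerical gradient check of the kind the {{Tests}} section refers to.

```python
import numpy as np

def affine_forward(X, W, b):
    # Forward pass of an affine (fully-connected) layer: out = X @ W + b.
    return X @ W + b

def affine_backward(dout, X, W, b):
    # Backward pass: gradients of the loss w.r.t. X, W, and b,
    # given the upstream gradient dout.
    dX = dout @ W.T
    dW = X.T @ dout
    db = dout.sum(axis=0)
    return dX, dW, db

def grad_check(loss_fn, x, analytic_grad, eps=1e-5):
    # Compare a central-difference numerical gradient of loss_fn w.r.t. x
    # against the analytic gradient; return the max relative error.
    num_grad = np.zeros_like(x)
    it = np.nditer(x, flags=["multi_index"])
    while not it.finished:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps
        plus = loss_fn()
        x[idx] = orig - eps
        minus = loss_fn()
        x[idx] = orig  # restore
        num_grad[idx] = (plus - minus) / (2 * eps)
        it.iternext()
    return np.max(np.abs(num_grad - analytic_grad) /
                  np.maximum(1e-8, np.abs(num_grad) + np.abs(analytic_grad)))

rng = np.random.default_rng(42)
X = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
b = rng.standard_normal(2)

# Use a simple scalar loss (sum of outputs), so dout is all ones.
loss = lambda: affine_forward(X, W, b).sum()
dout = np.ones((4, 2))
dX, dW, db = affine_backward(dout, X, W, b)

rel_err = grad_check(loss, W, dW)
```

A small relative error (well below 1e-6 in double precision) indicates the backward pass is consistent with the forward pass, which is the property the library's gradient-check tests verify for each layer.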