Repository: systemml

Updated Branches:
  refs/heads/gh-pages a46f1fb68 -> 573427fb5
[SYSTEMML-540] Add support for RNN and LSTM in Caffe2DML and Keras2DML

Closes #707.


Project: http://git-wip-us.apache.org/repos/asf/systemml/repo
Commit: http://git-wip-us.apache.org/repos/asf/systemml/commit/d4b723eb
Tree: http://git-wip-us.apache.org/repos/asf/systemml/tree/d4b723eb
Diff: http://git-wip-us.apache.org/repos/asf/systemml/diff/d4b723eb

Branch: refs/heads/gh-pages
Commit: d4b723eb92256abcde904b2fa2e244ccb2924563
Parents: a46f1fb
Author: Niketan Pansare <npan...@us.ibm.com>
Authored: Tue Dec 12 11:45:16 2017 -0800
Committer: Niketan Pansare <npan...@us.ibm.com>
Committed: Tue Dec 12 11:46:11 2017 -0800

----------------------------------------------------------------------
 reference-guide-caffe2dml.md | 55 +++++++++++++++++++++++++++++++++++++++
 release-process.md           | 11 ++++++++
 2 files changed, 66 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/systemml/blob/d4b723eb/reference-guide-caffe2dml.md
----------------------------------------------------------------------
diff --git a/reference-guide-caffe2dml.md b/reference-guide-caffe2dml.md
index 24d5753..be8c078 100644
--- a/reference-guide-caffe2dml.md
+++ b/reference-guide-caffe2dml.md
@@ -168,6 +168,61 @@ layer {
 }
 ```
 
+## Recurrent Layers
+
+### RNN Layer
+
+In a simple RNN, the output of the previous timestep is fed back in as an additional input at the current timestep.
+
+Invokes the [nn/layers/rnn.dml](https://github.com/apache/systemml/blob/master/scripts/nn/layers/rnn.dml) layer.
+
+**Required Parameters:**
+
+- num_output: number of output units
+- return_sequences: Whether to return the output at all timesteps, or just for the final timestep.
+
+**Sample Usage:**
+```
+layer {
+  top: "rnn_1"
+  recurrent_param {
+    return_sequences: false
+    num_output: 32
+  }
+  type: "RNN"
+  name: "rnn_1"
+  bottom: "rnn_1_input"
+}
+```
+
+### LSTM Layer
+
+In an LSTM, an internal cell state is maintained, additive
+interactions operate over the cell state at each timestep, and
+some amount of this cell state is exposed as output at each
+timestep. Additionally, the output of the previous timestep is fed
+back in as an additional input at the current timestep.
+
+Invokes the [nn/layers/lstm.dml](https://github.com/apache/systemml/blob/master/scripts/nn/layers/lstm.dml) layer.
+
+**Required Parameters:**
+
+- num_output: number of output units
+- return_sequences: Whether to return the output at all timesteps, or just for the final timestep.
+
+**Sample Usage:**
+```
+layer {
+  top: "lstm_1"
+  recurrent_param {
+    return_sequences: false
+    num_output: 32
+  }
+  type: "LSTM"
+  name: "lstm_1"
+  bottom: "lstm_1_input"
+}
+```
+
 ## Common Layers
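For context, a minimal sketch of how a network definition containing one of these recurrent layers might be trained through the Caffe2DML Python API. This is not part of the diff above: the solver file name, input shape, and data variables are placeholders.

```
from systemml.mllearn import Caffe2DML

# Minimal sketch, assuming:
#  - `spark` is an active SparkSession (e.g. from the pyspark shell),
#  - `lstm_solver.proto` is a Caffe solver whose `net` field points to a network
#    definition that includes the "lstm_1" layer shown in the diff above,
#  - X_train, y_train, X_test are placeholder NumPy matrices or Spark DataFrames.
lstm_model = Caffe2DML(spark, solver='lstm_solver.proto', input_shape=(1, 28, 28))
lstm_model.fit(X_train, y_train)
predictions = lstm_model.predict(X_test)
```

The same call pattern applies when the network uses the RNN layer instead of the LSTM layer; only the network/solver prototxt changes.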
http://git-wip-us.apache.org/repos/asf/systemml/blob/d4b723eb/release-process.md
----------------------------------------------------------------------
diff --git a/release-process.md b/release-process.md
index e5d0737..3798ec7 100644
--- a/release-process.md
+++ b/release-process.md
@@ -382,6 +382,17 @@ file and remove all the `@Ignore` annotations from all the tests.
 Then run the NeuralNetworkOpTests:
 
 	mvn -Dit.test=org.apache.sysml.test.gpu.NeuralNetworkOpTests verify -PgpuTests
 
+# Run other GPU Unit Tests
+
+	rm result.txt
+	for t in AggregateUnaryOpTests BinaryOpTests MatrixMatrixElementWiseOpTests RightIndexingTests AppendTest MatrixMultiplicationOpTest ReorgOpTests ScalarMatrixElementwiseOpTests UnaryOpTests
+	do
+		mvn -Dit.test="org.apache.sysml.test.gpu."$t verify -PgpuTests &> tmp.txt
+		SUCCESS=`grep "BUILD SUCCESS" tmp.txt`
+		echo $t" => "$SUCCESS >> result.txt
+		rm tmp.txt
+	done
+
 # Voting
 
 Following a successful release candidate vote by SystemML PMC members on the SystemML
 mailing list, the release candidate