[incubator-mxnet] branch fit-api updated: move to gluon contrib (#14635)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new d4bf975  move to gluon contrib (#14635)
d4bf975 is described below

commit d4bf975c89061c027a7040e80da3b3f1631215b7
Author: Lai Wei 
AuthorDate: Fri Apr 5 13:58:38 2019 -0700

move to gluon contrib (#14635)
---
 python/mxnet/{ => gluon}/contrib/estimator/__init__.py  | 0
 python/mxnet/{ => gluon}/contrib/estimator/estimator.py | 6 +++---
 python/mxnet/{ => gluon}/contrib/estimator/event_handler.py | 0
 tests/nightly/estimator/test_estimator_cnn.py   | 2 +-
 tests/nightly/estimator/test_sentiment_rnn.py   | 2 +-
 tests/python/unittest/test_gluon_estimator.py   | 2 +-
 tests/python/unittest/test_gluon_event_handler.py   | 2 +-
 7 files changed, 7 insertions(+), 7 deletions(-)

diff --git a/python/mxnet/contrib/estimator/__init__.py 
b/python/mxnet/gluon/contrib/estimator/__init__.py
similarity index 100%
rename from python/mxnet/contrib/estimator/__init__.py
rename to python/mxnet/gluon/contrib/estimator/__init__.py
diff --git a/python/mxnet/contrib/estimator/estimator.py 
b/python/mxnet/gluon/contrib/estimator/estimator.py
similarity index 99%
rename from python/mxnet/contrib/estimator/estimator.py
rename to python/mxnet/gluon/contrib/estimator/estimator.py
index 5294991..f7c97c4 100644
--- a/python/mxnet/contrib/estimator/estimator.py
+++ b/python/mxnet/gluon/contrib/estimator/estimator.py
@@ -22,9 +22,9 @@
 import copy
 import warnings
 from .event_handler import EventHandler, LoggingHandler
-from ... import gluon, autograd
-from ...context import Context, cpu, gpu, num_gpus
-from ...metric import EvalMetric, Loss, Accuracy
+from .... import gluon, autograd
+from ....context import Context, cpu, gpu, num_gpus
+from ....metric import EvalMetric, Loss, Accuracy
 
 __all__ = ['Estimator']
 
diff --git a/python/mxnet/contrib/estimator/event_handler.py 
b/python/mxnet/gluon/contrib/estimator/event_handler.py
similarity index 100%
rename from python/mxnet/contrib/estimator/event_handler.py
rename to python/mxnet/gluon/contrib/estimator/event_handler.py
diff --git a/tests/nightly/estimator/test_estimator_cnn.py 
b/tests/nightly/estimator/test_estimator_cnn.py
index 92d7889..7d0018b 100644
--- a/tests/nightly/estimator/test_estimator_cnn.py
+++ b/tests/nightly/estimator/test_estimator_cnn.py
@@ -22,7 +22,7 @@ import numpy as np
 import mxnet as mx
 from mxnet import gluon, init, nd
 from mxnet.gluon import data
-from mxnet.contrib.estimator import estimator
+from mxnet.gluon.contrib.estimator import estimator
 from mxnet.gluon.model_zoo import vision
 
 def load_data_mnist(batch_size, resize=None, num_workers=4):
diff --git a/tests/nightly/estimator/test_sentiment_rnn.py 
b/tests/nightly/estimator/test_sentiment_rnn.py
index 89b6778..5fd93c1 100644
--- a/tests/nightly/estimator/test_sentiment_rnn.py
+++ b/tests/nightly/estimator/test_sentiment_rnn.py
@@ -29,7 +29,7 @@ import mxnet as mx
 from mxnet import nd, gluon
 from mxnet.contrib import text
 from mxnet.gluon import nn, rnn
-from mxnet.contrib.estimator import estimator
+from mxnet.gluon.contrib.estimator import estimator
 
 
 class TextCNN(nn.Block):
diff --git a/tests/python/unittest/test_gluon_estimator.py 
b/tests/python/unittest/test_gluon_estimator.py
index fd6693d..13fcd96 100644
--- a/tests/python/unittest/test_gluon_estimator.py
+++ b/tests/python/unittest/test_gluon_estimator.py
@@ -24,7 +24,7 @@ import warnings
 import mxnet as mx
 from mxnet import gluon
 from mxnet.gluon import nn
-from mxnet.contrib.estimator import Estimator, EventHandler
+from mxnet.gluon.contrib.estimator import Estimator, EventHandler
 from nose.tools import assert_raises
 
 
diff --git a/tests/python/unittest/test_gluon_event_handler.py 
b/tests/python/unittest/test_gluon_event_handler.py
index f3181f5..dd2e60d 100644
--- a/tests/python/unittest/test_gluon_event_handler.py
+++ b/tests/python/unittest/test_gluon_event_handler.py
@@ -20,7 +20,7 @@ import tempfile
 import mxnet as mx
 from mxnet import nd
 from mxnet.gluon import nn, loss
-from mxnet.contrib.estimator import estimator, event_handler
+from mxnet.gluon.contrib.estimator import estimator, event_handler
 
 def _get_test_network():
 net = nn.Sequential()
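
For readers tracking the import path: after this commit the Estimator lives under the Gluon contrib namespace (it sat under mxnet.contrib.estimator only briefly, per #14633 below). The updated test imports above show the new location; a minimal illustrative line, with the surrounding names taken from the diff:

    # Illustrative only -- old and new import locations on the fit-api branch.
    # before (#14633):  from mxnet.contrib.estimator import Estimator, EventHandler
    # after  (#14635):
    from mxnet.gluon.contrib.estimator import Estimator, EventHandler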



[incubator-mxnet] branch fit-api updated: move estimator to contrib (#14633)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new ffad48a  move estimator to contrib (#14633)
ffad48a is described below

commit ffad48a7b157ef70d4e35955d0b2f5e2f63287a7
Author: Lai Wei 
AuthorDate: Fri Apr 5 13:10:58 2019 -0700

move estimator to contrib (#14633)
---
 python/mxnet/{gluon => contrib}/estimator/__init__.py  | 0
 python/mxnet/{gluon => contrib}/estimator/estimator.py | 0
 python/mxnet/{gluon => contrib}/estimator/event_handler.py | 0
 tests/nightly/estimator/test_estimator_cnn.py  | 2 +-
 tests/nightly/estimator/test_sentiment_rnn.py  | 2 +-
 tests/python/unittest/test_gluon_estimator.py  | 2 +-
 tests/python/unittest/test_gluon_event_handler.py  | 2 +-
 7 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/python/mxnet/gluon/estimator/__init__.py 
b/python/mxnet/contrib/estimator/__init__.py
similarity index 100%
rename from python/mxnet/gluon/estimator/__init__.py
rename to python/mxnet/contrib/estimator/__init__.py
diff --git a/python/mxnet/gluon/estimator/estimator.py 
b/python/mxnet/contrib/estimator/estimator.py
similarity index 100%
rename from python/mxnet/gluon/estimator/estimator.py
rename to python/mxnet/contrib/estimator/estimator.py
diff --git a/python/mxnet/gluon/estimator/event_handler.py 
b/python/mxnet/contrib/estimator/event_handler.py
similarity index 100%
rename from python/mxnet/gluon/estimator/event_handler.py
rename to python/mxnet/contrib/estimator/event_handler.py
diff --git a/tests/nightly/estimator/test_estimator_cnn.py 
b/tests/nightly/estimator/test_estimator_cnn.py
index b4311b3..92d7889 100644
--- a/tests/nightly/estimator/test_estimator_cnn.py
+++ b/tests/nightly/estimator/test_estimator_cnn.py
@@ -22,7 +22,7 @@ import numpy as np
 import mxnet as mx
 from mxnet import gluon, init, nd
 from mxnet.gluon import data
-from mxnet.gluon.estimator import estimator
+from mxnet.contrib.estimator import estimator
 from mxnet.gluon.model_zoo import vision
 
 def load_data_mnist(batch_size, resize=None, num_workers=4):
diff --git a/tests/nightly/estimator/test_sentiment_rnn.py 
b/tests/nightly/estimator/test_sentiment_rnn.py
index c9dcbd2..89b6778 100644
--- a/tests/nightly/estimator/test_sentiment_rnn.py
+++ b/tests/nightly/estimator/test_sentiment_rnn.py
@@ -29,7 +29,7 @@ import mxnet as mx
 from mxnet import nd, gluon
 from mxnet.contrib import text
 from mxnet.gluon import nn, rnn
-from mxnet.gluon.estimator import estimator
+from mxnet.contrib.estimator import estimator
 
 
 class TextCNN(nn.Block):
diff --git a/tests/python/unittest/test_gluon_estimator.py 
b/tests/python/unittest/test_gluon_estimator.py
index c86f4ff..fd6693d 100644
--- a/tests/python/unittest/test_gluon_estimator.py
+++ b/tests/python/unittest/test_gluon_estimator.py
@@ -24,7 +24,7 @@ import warnings
 import mxnet as mx
 from mxnet import gluon
 from mxnet.gluon import nn
-from mxnet.gluon.estimator import Estimator, EventHandler
+from mxnet.contrib.estimator import Estimator, EventHandler
 from nose.tools import assert_raises
 
 
diff --git a/tests/python/unittest/test_gluon_event_handler.py 
b/tests/python/unittest/test_gluon_event_handler.py
index 023b046..f3181f5 100644
--- a/tests/python/unittest/test_gluon_event_handler.py
+++ b/tests/python/unittest/test_gluon_event_handler.py
@@ -20,7 +20,7 @@ import tempfile
 import mxnet as mx
 from mxnet import nd
 from mxnet.gluon import nn, loss
-from mxnet.gluon.estimator import estimator, event_handler
+from mxnet.contrib.estimator import estimator, event_handler
 
 def _get_test_network():
 net = nn.Sequential()



[incubator-mxnet] branch fit-api updated (c2e2f80 -> d76234b)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


 discard c2e2f80  [MXNET-1344, 1346][FIT API] Retrieve Batch size and Logging 
verbose support for Gluon fit() API (#14587)
 discard b1ef99a  [MXNet-1343][Fit API]Add CNN integration test for fit() API 
(#14405)
 discard 81ec379  [MXNet-1375][Fit API]Added RNN integration test for fit() API 
(#14547)
 discard ed7f6e5  [MXNet-1340][Fit API]Update train stats (#14494)
 discard 8186772  [MXNet-1349][Fit API]Add validation support and unit tests 
for fit() API (#14442)
 discard e6c63b1  Fixed issue where the estimator was printing beyond the 
dataset size … (#14464)
 discard 41392fa  [MXNet-1334][Fit API]base class for estimator and 
eventhandler (#14346)
 new 427b6d4  Fix shape inference pass (#14153)
 new d754da3  Relaxing type requirements for reshape_like op (#14325)
 new f2497aa  Updated news.md with the latest mkldnn submodule version 
(#14298)
 new 49d7fc6  Enhance gpu quantization (#14094)
 new d6eafca  Bypass ThreadedEngine in 
test_operator_gpu.py:test_convolution_multiple_streams. (#14338)
 new 111b881  Limit workspace for cudnnGet results (#14326)
 new 19d737f  [MXNET-1331] Removal of non-MXNET classes from JAR (#14303)
 new 184c2a5  fix render issue in NDArray linalg docs (#14258)
 new fccce20  Add more support for mxnet_to_coreml (#14222)
 new a0f3f92   Add default parameters for Scala NDArray.arange (#13816)
 new b486594  Register fake grad to subgraph and quantized operators 
(#14275)
 new 39412b3  corrected a spellign (#14247)
 new 83d2c2d  [MXNET-1324] Add NaiveRunGraph to imperative utils (#14192)
 new 8ab7998  Updates build_lib.sh to copy the cub library license (#14347)
 new efb8823  Add MKLDNN headers to pip package (#14339)
 new 7b8e3a9  compatibility with opencv4 (#14313)
 new f4ab2d7  [MXNET-1291] solve pylint errors in examples with issue 
no.12205 (#13848)
 new 49932fa  #14199: catch subprocess.CalledProcessError in get_gpus() 
(#14212)
 new 6caaa38  print error message for mxnet::cpp::Operator::Invoke when 
failed (#14318)
 new 8beea18  Bulked op segments to allow Variable nodes (#14200)
 new e703694  Fixes #14181, validate model output shape for ObjectDetector. 
(#14215)
 new 12c41e6  Optimizer MXKVStoreUpdater bug fix in serializeState method 
(#14337)
 new 2b7d57d  Installs qemu pip requirements from qemu requirements file 
(#14355)
 new 838e256  Optimize NMS part 2 (#14352)
 new 30b1cbc  add exception (#14362)
 new 8668db7  MKLDNN based Quantized FullyConnected Operator and its fusion 
(#14128)
 new ce9e3cf  add pos_weight for SigmoidBinaryCrossEntropyLoss (#13612)
 new ed83071  Julia: split symbolic-node.jl into several snippets (#14024)
 new c645591  Fix NaN value comparisons in relu, max and min ops (#14262)
 new 8be97d7  [clojure-package][wip] add `->nd-vec` function in 
`ndarray.clj` (#14308)
 new 35098b8  support leading dimension of -1 in ravel/unravel (#14356)
 new 4f5cba5  fix engine crash in shutdown phase (#14382)
 new 47d4d66  Flaky test 
https://github.com/apache/incubator-mxnet/issues/14189 (#14190)
 new af41af5  Julia: rename `mx.clip` to `clamp` for `NDArray` (#14027)
 new 2df5756  add backgroud class in box_nms (#14058)
 new 0e8c270  CI Changes for Codified Windows AMIs (#14336)
 new a4b9802  [Clojure] Helper function for n-dim vector to ndarray (#14305)
 new 89bebd1  [DOC] fix sym.arange doc (#14237)
 new 73b29fa  Julia: add binding for runtime feature detection (#13992)
 new ab0ca86  [MXNET-1093] Add python3 Docker images for each MXNet release 
(#12791)
 new 66c74cc  Enable bulking test on windows (#14392)
 new 6aa8c27  [MXNET-1327] Allow RNN Layers to be initialized to fp16 
(#14219)
 new c4cae6e  Disables flaky test_operator.test_sgld test (#14410)
 new 82504ad  Fix relative difference scala (#14417)
 new ce99e49  Cudnn conv dgrad algo filtering (#14310)
 new 4432af1  [MXNET-1226] add Docs update for MXNet Java (#14395)
 new ae55b75  fix Makefile (#14424)
 new 9fd3153  [MXNET-1291] solve pylint errors in examples with issue 
no.12205 (#13938)
 new 88b3741  Disables flaky TestStochasticTiming_2D test (#14412)
 new b077965  Add dtype visualization to plot_network (#14066)
 new 74c2274  Support multi-threading for Custom Operator (#14363)
 new d1fcda9  Fix entropy for uint8 (#14150)
 new d001eaf  what's new - add 1.4.0 release (#14435)
 new 43173f5  moveaxis operator now accepts negative indices and sequence 
of ints as well. (#14321)
 new 226212b  Add repr for SymbolBlock (#14423)
 new a091d36  temporarily disable integ tests with a dependency on origami 
repo (#14448)
 new f602b0d  fix OOM error during resource allocation (#1)
 new

[incubator-mxnet] branch fit-api updated (6b800e4 -> c2e2f80)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


omit 6b800e4  Add BERT QA Scala/Java example (#14592)
omit 736be66  added note about cuda9.2 requirement (#14140)
omit 943b734  GELU (#14449)
omit 996d2b1  fix quantize graph pass (#14605)
omit 49c1ccc  Add Gluon Transformer Crop (#14259)
omit d843a85  set _scale in Trainer using optimizer rescale_grad (#14593)
omit 3414b06  Updated documentation about nightly tests (#14493)
omit 8ddede0  [MXNET-1379] update reshape operator (#14600)
omit 3a5dc6e  Fixing unintentional variable overloading (#14438)
omit deeba00  Fix flaky test poisson generator & 
test_negative_binomial_generator (#14571)
omit 04979b4  fix build cpp examples option (#14562)
omit 821502a  Updates gpu tests to use CUDNN_VERSION supplied by the 
environment but default to 7.0.3 if not set (#14595)
omit b6a767d  [MXNET-1357] Fix the cpp-examples to add exception handling 
(#14441)
omit 984d70f  Disable Flaky Test test_poisson_generator (#14540)
omit a4b85a5  Support SyncBatchNorm5D (#14542)
omit e5aadca  example/ssd/evaluate/eval_metric.py (#14561)
omit e444341  Fixes static build script for cub directory rename (#14578)
omit ea4f571  Change CUB submodule to track Nvidia CUB project. (#13322)
omit ee341b8  Do not touch GPU 0 during ReleaseAll (#14550)
omit 25cdb1c  Enhance subgraph API (#14113)
omit b4583cb  fix tests (#14565)
omit e761e2f  [clojure][image] add draw-bounding-box interop (#14533)
omit c95afee  Chouffe/clojure fix tests (#14531)
omit 0f0b2dd  [clojure]: add comp-metric based on CompositeEvalMetric 
(#14553)
omit a3c11da  Remove unnecessary "also" in README.md (#14543)
omit 0c3d11d  Memory fixes. Resolves #10867, and resolves #14080 (#14372)
omit a310b44  Tidy up storage allocation and deallocation (#14480)
omit db55051  speedup SequenceMask on GPU (#14445)
omit c4b8d30  Performance improving for MKL-DNN Quantized FullyConnected 
(#14528)
omit a41a643  Adds context parameter to check_rnn_layer_forward calls in 
test_lstmp (#14529)
omit 890b186  add filter to warnings (#14532)
omit a350661  Fix script retrieval (#14519)
omit dee351b  reenable the test (#14483)
omit f1354b4  [MXNET-1285] Draw bounding box with Scala/Java Image API 
(#14474)
omit 0ab1da2  Enhance PartitionGraph (#14277)
omit ce5bc19  Fixes for CI downloads (#14504)
omit 308c4e6  Add examples of running MXNet with Horovod (#14286)
omit a211550  Fixed tutorial warnings (#14472)
omit 07901c3  [MXNET-949] Module API to Gluon API tutorial (#12542)
omit 199c3bb  Fixes test_operator_gpu.test_multinomial_generator (#14475)
omit 2e218fc  Added link to landing page for Java examples (#14481)
omit d0aab5e  Change Straight Dope to Dive into Deep Learning (#14465)
omit 4b256aa  fix custom operation in fork (#14451)
omit 708931a  Revert "Fix memory leak for size-zero ndarray (#14365)" 
(#14477)
omit 73fe286  Fixes the test_sgld (#14473)
omit adc0ce7  [MKL-DNN] Enable s8 support for inner product and 3d input 
with flatten=false (#14466)
omit 73b9890  Enforce determinism for backwards compatibility checker 
(#14463)
omit 56c0d8f  [Doc] Start the tutorials for MKL-DNN backend (#14202)
omit dbc86a7  Fix memory leak for size-zero ndarray (#14365)
omit ffc0708  begin=end not a valid input (#14403)
omit f1103ad  add contributors from intel (#14455)
omit 7ffa150  Fix crashes on visualization (#14425)
omit 48131ed  Speedup _contrib_index_copy (#14359)
omit a0c4177  Update MKL-DNN to v0.18 release (was: fix the Dense layer 
issue) (#13668)
omit 711b8b5  Correct update count with Gluon trainer and 
update_on_kvstore=False (#14377)
omit 09a4d64  fix OOM error during resource allocation (#1)
omit c47d8c8  temporarily disable integ tests with a dependency on origami 
repo (#14448)
omit 314a41a  Add repr for SymbolBlock (#14423)
omit 5809c2d  moveaxis operator now accepts negative indices and sequence 
of ints as well. (#14321)
omit c73a3c4  what's new - add 1.4.0 release (#14435)
omit 506bc77  Fix entropy for uint8 (#14150)
omit 207c6c1  Support multi-threading for Custom Operator (#14363)
omit 2b315c1  Add dtype visualization to plot_network (#14066)
omit 6599756  Disables flaky TestStochasticTiming_2D test (#14412)
omit 36d84c8  [MXNET-1291] solve pylint errors in examples with issue 
no.12205 (#13938)
omit 361acc9  fix Makefile (#14424)
omit 692f8fb  [MXNET-1226] add Docs update for MXNet Java (#14395)
omit 4f1e22b  Cudnn conv dgrad algo filtering (#14310)
omit dc1238b  Fix relative difference scala (#14417)
omit 86601d7  Disables flaky test_operator.test_sgld test (#14410)
omit 38d3151  [MXNET

[incubator-mxnet] branch fit-api updated (c2e2f80 -> 6b800e4)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from c2e2f80  [MXNET-1344, 1346][FIT API] Retrieve Batch size and Logging 
verbose support for Gluon fit() API (#14587)
 new c35a82c  Fix shape inference pass (#14153)
 new 7dc7db0  Relaxing type requirements for reshape_like op (#14325)
 new 778e495  Updated news.md with the latest mkldnn submodule version 
(#14298)
 new d014ff3  Enhance gpu quantization (#14094)
 new ae2dda2  Bypass ThreadedEngine in 
test_operator_gpu.py:test_convolution_multiple_streams. (#14338)
 new ae4703b  Limit workspace for cudnnGet results (#14326)
 new 8287656  [MXNET-1331] Removal of non-MXNET classes from JAR (#14303)
 new f0cd148  fix render issue in NDArray linalg docs (#14258)
 new e64cfaa  Add more support for mxnet_to_coreml (#14222)
 new f6dc492   Add default parameters for Scala NDArray.arange (#13816)
 new b5e0890  Register fake grad to subgraph and quantized operators 
(#14275)
 new d58764c  corrected a spellign (#14247)
 new 0f96e66  [MXNET-1324] Add NaiveRunGraph to imperative utils (#14192)
 new e14e8be  Updates build_lib.sh to copy the cub library license (#14347)
 new 7e14eb4  Add MKLDNN headers to pip package (#14339)
 new 1b4e080  compatibility with opencv4 (#14313)
 new 43d87b0  [MXNET-1291] solve pylint errors in examples with issue 
no.12205 (#13848)
 new 9a6cd58  #14199: catch subprocess.CalledProcessError in get_gpus() 
(#14212)
 new 2447f80  print error message for mxnet::cpp::Operator::Invoke when 
failed (#14318)
 new 6b2b44a  Bulked op segments to allow Variable nodes (#14200)
 new 8658a1b  Fixes #14181, validate model output shape for ObjectDetector. 
(#14215)
 new c9d57bf  Optimizer MXKVStoreUpdater bug fix in serializeState method 
(#14337)
 new 0b3a965  Installs qemu pip requirements from qemu requirements file 
(#14355)
 new d503bb4  Optimize NMS part 2 (#14352)
 new 1bb78eb  add exception (#14362)
 new f8eeab7  MKLDNN based Quantized FullyConnected Operator and its fusion 
(#14128)
 new a26b3fc  add pos_weight for SigmoidBinaryCrossEntropyLoss (#13612)
 new 0c51c69  Julia: split symbolic-node.jl into several snippets (#14024)
 new 16eb81a  Fix NaN value comparisons in relu, max and min ops (#14262)
 new 897bf59  [clojure-package][wip] add `->nd-vec` function in 
`ndarray.clj` (#14308)
 new 43b03ab  support leading dimension of -1 in ravel/unravel (#14356)
 new d5bf85b  fix engine crash in shutdown phase (#14382)
 new c152c39  Flaky test 
https://github.com/apache/incubator-mxnet/issues/14189 (#14190)
 new 25560b3  Julia: rename `mx.clip` to `clamp` for `NDArray` (#14027)
 new d0a2f8d  add backgroud class in box_nms (#14058)
 new 19f05b0  CI Changes for Codified Windows AMIs (#14336)
 new f981f4e  [Clojure] Helper function for n-dim vector to ndarray (#14305)
 new 3e94618  [DOC] fix sym.arange doc (#14237)
 new 95baafb  Julia: add binding for runtime feature detection (#13992)
 new c45e9ac  [MXNET-1093] Add python3 Docker images for each MXNet release 
(#12791)
 new 438aa6e  Enable bulking test on windows (#14392)
 new 38d3151  [MXNET-1327] Allow RNN Layers to be initialized to fp16 
(#14219)
 new 86601d7  Disables flaky test_operator.test_sgld test (#14410)
 new dc1238b  Fix relative difference scala (#14417)
 new 4f1e22b  Cudnn conv dgrad algo filtering (#14310)
 new 692f8fb  [MXNET-1226] add Docs update for MXNet Java (#14395)
 new 361acc9  fix Makefile (#14424)
 new 36d84c8  [MXNET-1291] solve pylint errors in examples with issue 
no.12205 (#13938)
 new 6599756  Disables flaky TestStochasticTiming_2D test (#14412)
 new 2b315c1  Add dtype visualization to plot_network (#14066)
 new 207c6c1  Support multi-threading for Custom Operator (#14363)
 new 506bc77  Fix entropy for uint8 (#14150)
 new c73a3c4  what's new - add 1.4.0 release (#14435)
 new 5809c2d  moveaxis operator now accepts negative indices and sequence 
of ints as well. (#14321)
 new 314a41a  Add repr for SymbolBlock (#14423)
 new c47d8c8  temporarily disable integ tests with a dependency on origami 
repo (#14448)
 new 09a4d64  fix OOM error during resource allocation (#1)
 new 711b8b5  Correct update count with Gluon trainer and 
update_on_kvstore=False (#14377)
 new a0c4177  Update MKL-DNN to v0.18 release (was: fix the Dense layer 
issue) (#13668)
 new 48131ed  Speedup _contrib_index_copy (#14359)
 new 7ffa150  Fix crashes on visualization (#14425)
 new f1103ad  add contributors from intel (#14455)
 new ffc0708  begin=end not a valid input (#14403)
 new dbc86a7  Fix memory leak for size-zero ndarray (#14365)
 new 56c0d8f  [Doc] Start the tutorials for MKL-DNN backend (#14202)

[incubator-mxnet] branch fit-api updated: [MXNET-1344, 1346][FIT API] Retrieve Batch size and Logging verbose support for Gluon fit() API (#14587)

2019-04-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new c2e2f80  [MXNET-1344, 1346][FIT API] Retrieve Batch size and Logging 
verbose support for Gluon fit() API (#14587)
c2e2f80 is described below

commit c2e2f80474652cae2eb52d3614ef00a05472a679
Author: Karan Jariwala 
AuthorDate: Fri Apr 5 11:04:04 2019 -0700

[MXNET-1344, 1346][FIT API] Retrieve Batch size and Logging verbose support 
for Gluon fit() API (#14587)

* Retrieve Batch size and Logging verbose support for Gluon fit() API

* NIT changes

* Addressed review comments: shifted the batch size code to a separate 
method, sentence correction

* Modified unittest

* removed redundant parameter

* Resolve CI test failure

* only support DataLoader for now, future PRs will include DataIter to 
DataLoader converter

* Get the number of samples from shape attribute instead of length due to 
low space complexity

* Simplified batch size retrieval code

* removed batch_size parameter from fit() method and fixed the tests

* Verbose exception handling

* Assigning constant to a verbose

* Modified exception message

* Resolved undefined class reference

* Addressed review comments: Modified verbose level names, docs, variable 
names

* Update estimator.py
---
 python/mxnet/gluon/estimator/estimator.py | 43 
 python/mxnet/gluon/estimator/event_handler.py | 61 +++
 tests/nightly/estimator/test_estimator_cnn.py | 12 ++---
 tests/nightly/estimator/test_sentiment_rnn.py |  6 +--
 tests/python/unittest/test_gluon_estimator.py | 45 +++--
 tests/python/unittest/test_gluon_event_handler.py |  5 +-
 6 files changed, 91 insertions(+), 81 deletions(-)

diff --git a/python/mxnet/gluon/estimator/estimator.py 
b/python/mxnet/gluon/estimator/estimator.py
index c5da0c0..5294991 100644
--- a/python/mxnet/gluon/estimator/estimator.py
+++ b/python/mxnet/gluon/estimator/estimator.py
@@ -21,11 +21,9 @@
 
 import copy
 import warnings
-
 from .event_handler import EventHandler, LoggingHandler
 from ... import gluon, autograd
 from ...context import Context, cpu, gpu, num_gpus
-from ...io import DataIter
 from ...metric import EvalMetric, Loss, Accuracy
 
 __all__ = ['Estimator']
@@ -168,7 +166,7 @@ class Estimator(object):
 
  Parameters
 ----------
- val_data : DataLoader or DataIter
+ val_data : DataLoader
  validation data with data and labels
  batch_fn : function
  custom batch function to extract data and label
@@ -182,13 +180,10 @@ class Estimator(object):
 if not batch_fn:
 if isinstance(val_data, gluon.data.DataLoader):
 data, label = self._batch_fn(batch, self.context)
-elif isinstance(val_data, DataIter):
-data, label = self._batch_fn(batch, self.context, 
is_iterator=True)
 else:
 raise ValueError("You are using a custom iteration, please 
also provide "
  "batch_fn to extract data and label. 
Alternatively, you "
- "can provide the data as 
gluon.data.DataLoader or "
- "mx.io.DataIter")
+ "can provide the data as 
gluon.data.DataLoader.")
 else:
 data, label = batch_fn(batch, self.context)
 pred = [self.net(x) for x in data]
@@ -208,16 +203,17 @@ class Estimator(object):
 def fit(self, train_data,
 val_data=None,
 epochs=1,
-batch_size=None,
 event_handlers=None,
 batch_fn=None):
-"""Main training loop
+"""Trains the model on a given dataset for a specified
+number of epochs. Also, the batch size is inferred from the
+DataLoader's batch_size.
 
 Parameters
 ----------
-train_data : DataLoader or DataIter
+train_data : DataLoader
 training data with data and labels
-val_data : DataLoader or DataIter
+val_data : DataLoader
 validation data with data and labels
 epochs : int, default 1
 number of epochs to iterate on the training data.
@@ -232,12 +228,8 @@ class Estimator(object):
 """
 
 self.max_epoch = epochs
-if not batch_size:
-self.batch_size = 32 * len(self.context)
-else:
-self.batch_size = batch_size
 self.stop_training = False
-se
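
The docstring and signature changes above drop the explicit batch_size argument and accept only gluon.data.DataLoader, with the batch size inferred from the loader itself. A rough sketch of a call after this change; the network, optimizer and data below are placeholders rather than anything from the diff:

    # Rough sketch, assuming a toy model and random data; note fit() takes no
    # batch_size -- it now comes from the DataLoader.
    import mxnet as mx
    from mxnet import gluon, metric
    from mxnet.gluon.estimator import Estimator

    net = gluon.nn.Dense(2)
    net.initialize()
    trainer = gluon.Trainer(net.collect_params(), 'adam')

    x = mx.nd.random.uniform(shape=(100, 10))
    y = mx.nd.random.randint(0, 2, shape=(100,)).astype('float32')
    train_data = gluon.data.DataLoader(gluon.data.ArrayDataset(x, y), batch_size=16)

    est = Estimator(net=net,
                    loss=gluon.loss.SoftmaxCrossEntropyLoss(),
                    metrics=metric.Accuracy(),
                    trainer=trainer)
    est.fit(train_data=train_data, epochs=1)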

[incubator-mxnet] branch fit-api updated: [MXNet-1343][Fit API]Add CNN integration test for fit() API (#14405)

2019-04-03 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new b1ef99a  [MXNet-1343][Fit API]Add CNN integration test for fit() API 
(#14405)
b1ef99a is described below

commit b1ef99ae4f84da2fbd9c25115040e612e7250323
Author: Abhinav Sharma 
AuthorDate: Wed Apr 3 15:12:56 2019 -0700

[MXNet-1343][Fit API]Add CNN integration test for fit() API (#14405)

* added cnn intg tests for fit api

* updated cnn intg tests

* added functions for nightly test

* updated runtime_function

* updated intg tests

* updated init, datapath, refs

* added validation data

* update cpu test

* refactor code

* updated context
---
 ci/docker/runtime_functions.sh|  14 +++
 tests/nightly/JenkinsfileForBinaries  |  16 +++
 tests/nightly/estimator/test_estimator_cnn.py | 153 ++
 3 files changed, 183 insertions(+)

diff --git a/ci/docker/runtime_functions.sh b/ci/docker/runtime_functions.sh
index 128ae2b..88cd972 100755
--- a/ci/docker/runtime_functions.sh
+++ b/ci/docker/runtime_functions.sh
@@ -1296,6 +1296,20 @@ nightly_scala_demo_test_cpu() {
 bash bin/run_im.sh
 }
 
+nightly_estimator_cnn_gpu() {
+set -ex
+cd /work/mxnet/tests/nightly/estimator
+export PYTHONPATH=/work/mxnet/python/
+python test_estimator_cnn.py --type gpu
+}
+
+nightly_estimator_cnn_cpu() {
+set -ex
+cd /work/mxnet/tests/nightly/estimator
+export PYTHONPATH=/work/mxnet/python/
+python test_estimator_cnn.py --type cpu
+}
+
 nightly_estimator_rnn_gpu() {
 set -ex
 cd /work/mxnet/tests/nightly/estimator
diff --git a/tests/nightly/JenkinsfileForBinaries 
b/tests/nightly/JenkinsfileForBinaries
index 53e1c30..53572c8 100755
--- a/tests/nightly/JenkinsfileForBinaries
+++ b/tests/nightly/JenkinsfileForBinaries
@@ -106,6 +106,22 @@ core_logic: {
   utils.docker_run('ubuntu_nightly_gpu', 
'nightly_tutorial_test_ubuntu_python3_gpu', true, '1500m')
 }
   }
+},
+'estimator: CNN GPU': {
+  node(NODE_LINUX_GPU) {
+ws('workspace/estimator-test-cnn-gpu') {
+  utils.unpack_and_init('gpu', mx_lib)
+  utils.docker_run('ubuntu_nightly_gpu', 
'nightly_estimator_test_cnn_gpu', true)
+}
+  }
+},
+'estimator: CNN CPU': {
+  node(NODE_LINUX_CPU) {
+ws('workspace/estimator-test-cnn-cpu') {
+  utils.unpack_and_init('cpu', mx_lib)
+  utils.docker_run('ubuntu_nightly_cpu', 
'nightly_estimator_test_cnn_cpu', true)
+}
+  }
 }
   }
 }
diff --git a/tests/nightly/estimator/test_estimator_cnn.py 
b/tests/nightly/estimator/test_estimator_cnn.py
new file mode 100644
index 0000000..b99e99a
--- /dev/null
+++ b/tests/nightly/estimator/test_estimator_cnn.py
@@ -0,0 +1,153 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# Test gluon estimator on CNN models
+
+import argparse
+import numpy as np
+import mxnet as mx
+from mxnet import gluon, init, nd
+from mxnet.gluon import data
+from mxnet.gluon.estimator import estimator
+from mxnet.gluon.model_zoo import vision
+
+def load_data_mnist(batch_size, resize=None, num_workers=4):
+'''
+Load MNIST dataset
+'''
+transformer = []
+if resize:
+transformer += [data.vision.transforms.Resize(resize)]
+transformer += [data.vision.transforms.ToTensor()]
+transformer = data.vision.transforms.Compose(transformer)
+mnist_train = data.vision.MNIST(train=True)
+mnist_test = data.vision.MNIST(train=False)
+train_iter = data.DataLoader(
+mnist_train.transform_first(transformer), batch_size, shuffle=True,
+num_workers=num_workers)
+test_iter = data.DataLoader(
+mnist_test.transform_first(transformer), batch_size, shuffle=False,
+num_workers=num_workers)
+return train_iter, te

[incubator-mxnet] branch fit-api updated: [MXNet-1375][Fit API]Added RNN integration test for fit() API (#14547)

2019-04-03 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new 81ec379  [MXNet-1375][Fit API]Added RNN integration test for fit() API 
(#14547)
81ec379 is described below

commit 81ec37970858ce746760eda0954d86ce55d627a7
Author: Karan Jariwala 
AuthorDate: Wed Apr 3 14:28:08 2019 -0700

[MXNet-1375][Fit API]Added RNN integration test for fit() API (#14547)

* Added RNN integration test for fit() API

* Addressed review comments: change in JenkinFile, tmp directory, ctx with 
condense if/else, renamed imports

* CPU test doesn't require nvidiadocker container

* Modified the structure by removing the redundant code
---
 ci/docker/runtime_functions.sh|  14 ++
 tests/nightly/Jenkinsfile |  16 ++
 tests/nightly/estimator/test_sentiment_rnn.py | 276 ++
 3 files changed, 306 insertions(+)

diff --git a/ci/docker/runtime_functions.sh b/ci/docker/runtime_functions.sh
index de1b779..128ae2b 100755
--- a/ci/docker/runtime_functions.sh
+++ b/ci/docker/runtime_functions.sh
@@ -1296,6 +1296,20 @@ nightly_scala_demo_test_cpu() {
 bash bin/run_im.sh
 }
 
+nightly_estimator_rnn_gpu() {
+set -ex
+cd /work/mxnet/tests/nightly/estimator
+export PYTHONPATH=/work/mxnet/python/
+python test_sentiment_rnn.py --type gpu
+}
+
+nightly_estimator_rnn_cpu() {
+set -ex
+cd /work/mxnet/tests/nightly/estimator
+export PYTHONPATH=/work/mxnet/python/
+python test_sentiment_rnn.py --type cpu
+}
+
 # Deploy
 
 deploy_docs() {
diff --git a/tests/nightly/Jenkinsfile b/tests/nightly/Jenkinsfile
index 758c864..a65da2d 100755
--- a/tests/nightly/Jenkinsfile
+++ b/tests/nightly/Jenkinsfile
@@ -136,6 +136,22 @@ core_logic: {
   utils.docker_run('ubuntu_nightly_cpu', 'nightly_test_javascript', 
false)
 }
   }
+},
+'estimator: RNN GPU': {
+  node(NODE_LINUX_GPU) {
+ws('workspace/estimator-test-rnn-gpu') {
+  utils.unpack_and_init('gpu', mx_lib)
+  utils.docker_run('ubuntu_nightly_gpu', 
'nightly_estimator_test_rnn_gpu', true)
+}
+  }
+},
+'estimator: RNN CPU': {
+  node(NODE_LINUX_CPU) {
+ws('workspace/estimator-test-rnn-cpu') {
+  utils.unpack_and_init('cpu', mx_lib)
+  utils.docker_run('ubuntu_nightly_cpu', 
'nightly_estimator_test_rnn_cpu', false)
+}
+  }
 }
   }
 }
diff --git a/tests/nightly/estimator/test_sentiment_rnn.py 
b/tests/nightly/estimator/test_sentiment_rnn.py
new file mode 100644
index 0000000..7e42831
--- /dev/null
+++ b/tests/nightly/estimator/test_sentiment_rnn.py
@@ -0,0 +1,276 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""Gluon Text Sentiment Classification Example using RNN/CNN
+Example modified from below link:
+https://github.com/d2l-ai/d2l-en/blob/master/chapter_natural-language-processing/sentiment-analysis-rnn.md
+https://github.com/d2l-ai/d2l-en/blob/master/chapter_natural-language-processing/sentiment-analysis-cnn.md"""
+
+import argparse
+import os
+import tarfile
+import random
+import collections
+import mxnet as mx
+from mxnet import nd, gluon
+from mxnet.contrib import text
+from mxnet.gluon import nn, rnn
+from mxnet.gluon.estimator import estimator
+
+
+class TextCNN(nn.Block):
+def __init__(self, vocab, embed_size, kernel_sizes, num_channels,
+ **kwargs):
+super(TextCNN, self).__init__(**kwargs)
+self.embedding = nn.Embedding(len(vocab), embed_size)
+# The embedding layer does not participate in training
+self.constant_embedding = nn.Embedding(len(vocab), embed_size)
+self.dropout = nn.Dropout(0.5)
+self.decoder = nn.Dense(2)
+# The max-over-time pooling layer has no weight, so it can share an
+# instance
+self.pool = nn.GlobalMaxPool1D()
+# Create mu
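
The comment in the model code above notes that the max-over-time pooling layer has no weights, so a single instance can be shared across all convolution branches. A small standalone sketch of that pattern (channel counts, kernel sizes and input shape here are made up, not the ones used in the test):

    # One parameter-free GlobalMaxPool1D instance pools every Conv1D branch;
    # shapes below are arbitrary illustrations.
    import mxnet as mx
    from mxnet.gluon import nn

    convs = [nn.Conv1D(channels=4, kernel_size=k, activation='relu') for k in (3, 4, 5)]
    pool = nn.GlobalMaxPool1D()                   # shared, since it holds no weights
    for c in convs:
        c.initialize()

    x = mx.nd.random.uniform(shape=(2, 8, 20))    # (batch, channels, time)
    feats = [pool(c(x)).flatten() for c in convs]
    print(mx.nd.concat(*feats, dim=1).shape)      # (2, 12): 3 branches x 4 channels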

[incubator-mxnet] branch fit-api updated: [MXNet-1340][Fit API]Update train stats (#14494)

2019-04-03 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new ed7f6e5  [MXNet-1340][Fit API]Update train stats (#14494)
ed7f6e5 is described below

commit ed7f6e56a4e372d5d460031186145065f5657893
Author: Lai Wei 
AuthorDate: Wed Apr 3 14:18:13 2019 -0700

[MXNet-1340][Fit API]Update train stats (#14494)

* add train history

* update history

* update test

* avoid calling empty methods

* remove train history object

* fix pylint

* add unit test

* fix test

* update categorize handlers
---
 python/mxnet/gluon/estimator/estimator.py | 147 +---
 python/mxnet/gluon/estimator/event_handler.py | 102 +++-
 python/mxnet/gluon/trainer.py |   7 +
 tests/python/unittest/test_gluon_estimator.py | 193 ++
 tests/python/unittest/test_gluon_event_handler.py |  12 +-
 5 files changed, 280 insertions(+), 181 deletions(-)

diff --git a/python/mxnet/gluon/estimator/estimator.py 
b/python/mxnet/gluon/estimator/estimator.py
index e759fa7..c5da0c0 100644
--- a/python/mxnet/gluon/estimator/estimator.py
+++ b/python/mxnet/gluon/estimator/estimator.py
@@ -22,7 +22,7 @@
 import copy
 import warnings
 
-from .event_handler import LoggingHandler
+from .event_handler import EventHandler, LoggingHandler
 from ... import gluon, autograd
 from ...context import Context, cpu, gpu, num_gpus
 from ...io import DataIter
@@ -39,27 +39,26 @@ class Estimator(object):
 
 Parameters
 ----------
-loss : Loss or list of Loss
+loss : gluon.loss.Loss or list of gluon.loss.Loss
 Loss(objective functions) to calculate during training
 metrics : EvalMetric or list of EvalMetric
 Metrics for evaluating models
 initializer : Initializer
 initializer to initialize the network
-trainers : Trainer or list of Trainer
-Trainers to apply optimizers on network parameters
+trainer : Trainer
+Trainer to apply optimizer on network parameters
 context : Context or list of Context
 devices to run the training on
 """
 
 def __init__(self, net,
- loss=None,
+ loss,
  metrics=None,
  initializer=None,
- trainers=None,
+ trainer=None,
  context=None):
 
 self.net = net
-self.stop_training = False
 
 if isinstance(loss, gluon.loss.Loss):
 self.loss = [loss]
@@ -86,27 +85,14 @@ class Estimator(object):
 
 # store training statistics
 self.train_stats = {}
-self.train_stats['epochs'] = []
-self.train_stats['learning_rate'] = []
-# current step of the epoch
-self.train_stats['step'] = ''
-for metric in self.train_metrics:
-# record a history of metrics over each epoch
-self.train_stats['train_' + metric.name] = []
-# only record the latest metric numbers after each batch
-self.train_stats['batch_' + metric.name] = 0.
-for metric in self.val_metrics:
-self.train_stats['val_' + metric.name] = []
+
+# separate train and validation
 self.train_loss_metrics = []
 self.val_loss_metrics = []
 # using the metric wrapper for loss to record loss value
 for l in self.loss:
 self.train_loss_metrics.append(Loss(l.name))
 self.val_loss_metrics.append(Loss(l.name))
-self.train_stats['train_' + l.name] = []
-self.train_stats['val_' + l.name] = []
-# only record the latest loss numbers after each batch
-self.train_stats['batch_' + l.name] = 0.
 
 # handle context
 if isinstance(context, Context):
@@ -127,7 +113,6 @@ class Estimator(object):
 raise ValueError("context must be a Context or a list of Context, "
  "refer to mxnet.Context:{}".format(context))
 
-
 # initialize the network
 self.initializer = initializer
 if self.initializer:
@@ -135,7 +120,7 @@ class Estimator(object):
 # if already initialized, re-init with user specified 
initializer
 warnings.warn("Network already initialized, re-initializing 
with %s. "
   "You don't need to pass initializer if you 
already "
-  "initialized your net."% 
type(self.initializer).__name__)
+  "initialized your net." % 
type(self.initializer).__name__

[incubator-mxnet] branch master updated: Memory fixes. Resolves #10867, and resolves #14080 (#14372)

2019-03-28 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 102b46f  Memory fixes. Resolves #10867, and resolves #14080 (#14372)
102b46f is described below

commit 102b46feb5bf1061545ac79e4b114a180250740e
Author: Andrew Ayres 
AuthorDate: Thu Mar 28 11:57:09 2019 -0700

Memory fixes. Resolves #10867, and resolves #14080 (#14372)

* Fixes for memory leak when reshaping executor

* Fixed Adam Optimizer memory leak

* Cleanup for PR

* Added unit test for new ResourceScope method

* Removing import that was added by overzealous ide

* Add back in an import

* Added flags for executor to know whether or not it owns NDArrays for 
disposal

* Moving to ResourceScope.using implementation

* Changes to make ResourceScope.using work with existing scope

* Updating ResourceScope to work with existing scopes via 
usingIfScopeExists method

* Fix clojure unit tests

* Fixes to be compatibile with how clojure is using ResourceScope

* Removing some unnecessary changes

* Adding scope assertion in unit test
---
 .../src/main/scala/org/apache/mxnet/Executor.scala |  47 --
 .../main/scala/org/apache/mxnet/Optimizer.scala|  20 ++--
 .../scala/org/apache/mxnet/ResourceScope.scala |  21 -
 .../src/main/scala/org/apache/mxnet/Symbol.scala   |  21 +++--
 .../mxnet/module/DataParallelExecutorGroup.scala   |  11 ++-
 .../scala/org/apache/mxnet/optimizer/Adam.scala| 101 ++---
 .../org/apache/mxnet/ResourceScopeSuite.scala  |  33 +++
 7 files changed, 165 insertions(+), 89 deletions(-)

diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala
index 85f45bc..aec4402 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala
@@ -45,29 +45,47 @@ object Executor {
  * @see Symbol.bind : to create executor
  */
 class Executor private[mxnet](private[mxnet] val handle: ExecutorHandle,
-  private[mxnet] val symbol: Symbol) extends 
NativeResource {
-  private[mxnet] var argArrays: Array[NDArray] = null
-  private[mxnet] var gradArrays: Array[NDArray] = null
-  private[mxnet] var auxArrays: Array[NDArray] = null
+  private[mxnet] val symbol: Symbol,
+  private[mxnet] var argArrays: Array[NDArray] = 
null,
+  private[mxnet] var gradArrays: Array[NDArray] = 
null,
+  private[mxnet] var auxArrays: Array[NDArray] = 
null,
+  private var _ctx: Context = null,
+  private var _gradsReq: Iterable[_] = null,
+  private var _group2ctx: Map[String, Context] = 
null
+ ) extends NativeResource {
+
   val outputs: Array[NDArray] = getOutputs
   protected var _argDict: Map[String, NDArray] = null
   protected var _gradDict: Map[String, NDArray] = null
   protected var _auxDict: Map[String, NDArray] = null
   protected var monitorCallback: MXMonitorCallback = null
-  private[mxnet] var _ctx: Context = null
-  private[mxnet] var _gradsReq: Iterable[_] = null
-  private[mxnet] var _group2ctx: Map[String, Context] = null
   private val logger: Logger = LoggerFactory.getLogger(classOf[Executor])
 
+  private[mxnet] var ownsArgArrays = false
+  private[mxnet] var ownsGradArrays = false
+  private[mxnet] var ownsAuxArrays = false
+
   override def nativeAddress: CPtrAddress = handle
   override def nativeDeAllocator: (CPtrAddress => Int) = _LIB.mxExecutorFree
   // cannot determine the off-heap size of this object
   override val bytesAllocated: Long = 0
   override val ref: NativeResourceRef = super.register()
+
   override def dispose(): Unit = {
 if (!super.isDisposed) {
   super.dispose()
   outputs.foreach(o => o.dispose())
+  // Symbol.bind clones symbol when creating the executor so we need to 
dispose of the clone
+  symbol.dispose()
+  if (ownsArgArrays && argArrays != null) {argArrays.foreach(a => 
a.dispose())}
+  if (ownsGradArrays && gradArrays != null) {gradArrays.foreach(
+// Symbol will sometimes fill this with nulls so we've got to check 
the elements too
+a => if (a != null) {a.dispose()})
+  }
+  if (ownsAuxArrays && auxArrays != null) {auxArrays.foreach(a => 
a.dispose())}
+  if (_argDict != null) {_argDict.foreach(a => a._2.dispose())}
+  if (_gradDict != null) {_gradDict.foreach(a => a._2.dispose())}
+  if (_auxDict != null) {_auxDict.fo

[incubator-mxnet] branch fit-api updated: [MXNet-1349][Fit API]Add validation support and unit tests for fit() API (#14442)

2019-03-25 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new 8186772  [MXNet-1349][Fit API]Add validation support and unit tests 
for fit() API (#14442)
8186772 is described below

commit 8186772f78c1a4e5c9412d1e860f31f05e31fcfd
Author: Abhinav Sharma 
AuthorDate: Mon Mar 25 11:17:31 2019 -0700

[MXNet-1349][Fit API]Add validation support and unit tests for fit() API 
(#14442)

* added estimator unittests

* add more tests for estimator

* added validation logic

* added error handlers, unittests

* improve val stats

* fix pylint

* fix pylint

* update unit test

* fix tests

* fix tests

* updated metrics, val logic

* trigger ci

* trigger ci

* update metric, batch_fn error handler

* update context logic, add default metric
---
 python/mxnet/gluon/estimator/estimator.py | 116 ---
 python/mxnet/gluon/estimator/event_handler.py |   2 +-
 tests/python/unittest/test_gluon_estimator.py | 277 ++
 3 files changed, 370 insertions(+), 25 deletions(-)

diff --git a/python/mxnet/gluon/estimator/estimator.py 
b/python/mxnet/gluon/estimator/estimator.py
index c160115..e759fa7 100644
--- a/python/mxnet/gluon/estimator/estimator.py
+++ b/python/mxnet/gluon/estimator/estimator.py
@@ -19,13 +19,14 @@
 # pylint: disable=wildcard-import
 """Gluon Estimator"""
 
+import copy
 import warnings
 
 from .event_handler import LoggingHandler
 from ... import gluon, autograd
 from ...context import Context, cpu, gpu, num_gpus
 from ...io import DataIter
-from ...metric import EvalMetric, Loss
+from ...metric import EvalMetric, Loss, Accuracy
 
 __all__ = ['Estimator']
 
@@ -62,44 +63,57 @@ class Estimator(object):
 
 if isinstance(loss, gluon.loss.Loss):
 self.loss = [loss]
+elif isinstance(loss, list) and all([isinstance(l, gluon.loss.Loss) 
for l in loss]):
+self.loss = loss
 else:
-self.loss = loss or []
-for l in self.loss:
-if not isinstance(loss, gluon.loss.Loss):
-raise ValueError("loss must be a Loss or a list of Loss, 
refer to gluon.loss.Loss")
+raise ValueError("loss must be a Loss or a list of Loss, "
+ "refer to gluon.loss.Loss:{}".format(loss))
 
 if isinstance(metrics, EvalMetric):
-self.metrics = [metrics]
+self.train_metrics = [metrics]
 else:
-self.metrics = metrics or []
-for metric in self.metrics:
-if not isinstance(metric, EvalMetric):
-raise ValueError("metrics must be a Metric or a list of 
Metric, refer to mxnet.metric.EvalMetric")
+self.train_metrics = metrics or []
+if not all([isinstance(metric, EvalMetric) for metric in 
self.train_metrics]):
+raise ValueError("metrics must be a Metric or a list of 
Metric, "
+ "refer to 
mxnet.metric.EvalMetric:{}".format(metrics))
+
+# Use default mx.metric.Accuracy() for 
gluon.loss.SoftmaxCrossEntropyLoss()
+if not self.train_metrics and any([isinstance(l, 
gluon.loss.SoftmaxCrossEntropyLoss) for l in self.loss]):
+self.train_metrics = [Accuracy()]
+
+# Use same metrics for validation
+self.val_metrics = copy.deepcopy(self.train_metrics)
 
-self.initializer = initializer
 # store training statistics
 self.train_stats = {}
 self.train_stats['epochs'] = []
 self.train_stats['learning_rate'] = []
 # current step of the epoch
 self.train_stats['step'] = ''
-for metric in self.metrics:
+for metric in self.train_metrics:
 # record a history of metrics over each epoch
 self.train_stats['train_' + metric.name] = []
 # only record the latest metric numbers after each batch
 self.train_stats['batch_' + metric.name] = 0.
-self.loss_metrics = []
+for metric in self.val_metrics:
+self.train_stats['val_' + metric.name] = []
+self.train_loss_metrics = []
+self.val_loss_metrics = []
 # using the metric wrapper for loss to record loss value
 for l in self.loss:
-self.loss_metrics.append(Loss(l.name))
+self.train_loss_metrics.append(Loss(l.name))
+self.val_loss_metrics.append(Loss(l.name))
 self.train_stats['train_' + l.name] = []
+self.train_stats['
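
The stricter argument checks above mean a loss that is not a gluon.loss.Loss (or a list of them) is rejected up front with a ValueError, and a SoftmaxCrossEntropyLoss with no metrics falls back to a default Accuracy metric. A short sketch of the rejection path (the network is a placeholder):

    # Sketch of the new loss validation: a non-Loss value raises ValueError.
    from mxnet import gluon
    from mxnet.gluon.estimator import estimator

    try:
        est = estimator.Estimator(net=gluon.nn.Dense(2), loss="not a loss")
    except ValueError as err:
        print("rejected:", err)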

[incubator-mxnet] branch fit-api updated: [MXNet-1334][Fit API]base class for estimator and eventhandler (#14346)

2019-03-15 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/fit-api by this push:
 new 41392fa  [MXNet-1334][Fit API]base class for estimator and 
eventhandler (#14346)
41392fa is described below

commit 41392fa1cdc4c4a49451678d9df7fdbad5b42faa
Author: Lai Wei 
AuthorDate: Fri Mar 15 22:16:42 2019 -0700

[MXNet-1334][Fit API]base class for estimator and eventhandler (#14346)

* base class for estimator and eventhandler

* add license

* add event handlers

* fix pylint

* improve arg check

* fix pylint

* add unit tests
---
 python/mxnet/gluon/estimator/__init__.py  |  21 ++
 python/mxnet/gluon/estimator/estimator.py | 267 +++
 python/mxnet/gluon/estimator/event_handler.py | 307 ++
 tests/python/unittest/test_gluon_event_handler.py |  92 +++
 4 files changed, 687 insertions(+)

diff --git a/python/mxnet/gluon/estimator/__init__.py 
b/python/mxnet/gluon/estimator/__init__.py
new file mode 100644
index 0000000..58600da
--- /dev/null
+++ b/python/mxnet/gluon/estimator/__init__.py
@@ -0,0 +1,21 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# pylint: disable=wildcard-import
+"""Gluon Estimator Module"""
+from .estimator import *
+from .event_handler import *
diff --git a/python/mxnet/gluon/estimator/estimator.py 
b/python/mxnet/gluon/estimator/estimator.py
new file mode 100644
index 0000000..159f7e2
--- /dev/null
+++ b/python/mxnet/gluon/estimator/estimator.py
@@ -0,0 +1,267 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# coding: utf-8
+# pylint: disable=wildcard-import
+"""Gluon Estimator"""
+
+import warnings
+
+from .event_handler import LoggingHandler
+from ... import gluon, autograd
+from ...context import Context, cpu, gpu, num_gpus
+from ...io import DataIter
+from ...metric import EvalMetric, Loss
+
+__all__ = ['Estimator']
+
+
+class Estimator(object):
+"""Estimator Class for easy model training
+
+:py:class:`Estimator` can be used to facilitate the training & validation 
process
+
+
+Parameters
+----------
+loss : Loss or list of Loss
+Loss(objective functions) to calculate during training
+metrics : EvalMetric or list of EvalMetric
+Metrics for evaluating models
+initializer : Initializer
+initializer to initialize the network
+trainers : Trainer or list of Trainer
+Trainers to apply optimizers on network parameters
+context : Context or list of Context
+devices to run the training on
+"""
+
+def __init__(self, net,
+ loss=None,
+ metrics=None,
+ initializer=None,
+ trainers=None,
+ context=None):
+
+self.net = net
+self.stop_training = False
+
+if isinstance(loss, gluon.loss.Loss):
+self.loss = [loss]
+else:
+self.loss = loss or []
+for l in self.loss:
+if not isinstance(loss, gluon.loss.Loss):
+raise ValueError("loss must be a Loss or a list of Loss, 
refer to gluon.loss.Loss")
+
+if 

[incubator-mxnet] branch fit-api created (now 0f88f61)

2019-03-05 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch fit-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


  at 0f88f61  [clojure-package] fix docstrings in `normal.clj` (#14295)

No new revisions were added by this update.



[incubator-mxnet] branch master updated: move choose_element_0index to operator (#14273)

2019-03-04 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 434a5fb  move choose_element_0index to operator (#14273)
434a5fb is described below

commit 434a5fbd3103b1a01f10aa7d51a10d15d286f432
Author: Lai Wei 
AuthorDate: Mon Mar 4 09:53:38 2019 -0800

move choose_element_0index to operator (#14273)
---
 src/ndarray/ndarray.cc   | 5 -
 src/operator/tensor/broadcast_reduce_op_index.cc | 1 +
 2 files changed, 1 insertion(+), 5 deletions(-)

diff --git a/src/ndarray/ndarray.cc b/src/ndarray/ndarray.cc
index b09d38a..3677127 100644
--- a/src/ndarray/ndarray.cc
+++ b/src/ndarray/ndarray.cc
@@ -2027,11 +2027,6 @@ MXNET_REGISTER_NDARRAY_FUN(_set_value)
 MXNET_REGISTER_NDARRAY_FUN(_onehot_encode)
 .set_function(BinaryOp);
 
-MXNET_REGISTER_NDARRAY_FUN(choose_element_0index)
-.set_function(BinaryOp)
-.describe("Choose one element from each line(row for python, column for 
R/Julia)"
-  " in lhs according to index indicated by rhs."
-  " This function assume rhs uses 0-based index.");
 
 MXNET_REGISTER_NDARRAY_FUN(fill_element_0index)
 .set_function(TernaryOp)
diff --git a/src/operator/tensor/broadcast_reduce_op_index.cc 
b/src/operator/tensor/broadcast_reduce_op_index.cc
index ed9a90d..f3d1013 100644
--- a/src/operator/tensor/broadcast_reduce_op_index.cc
+++ b/src/operator/tensor/broadcast_reduce_op_index.cc
@@ -109,6 +109,7 @@ Examples::
 .add_argument("data", "NDArray-or-Symbol", "The input array");
 
 NNVM_REGISTER_OP(pick)
+.add_alias("choose_element_0index")
 .describe(R"code(Picks elements from an input array according to the input 
indices along the given axis.
 
 Given an input array of shape ``(d0, d1)`` and indices of shape ``(i0,)``, the 
result will be



[incubator-mxnet] branch master updated: adding tolerance to flaky test (#13850)

2019-01-13 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f554835  adding tolerance to flaky test (#13850)
f554835 is described below

commit f554835f33633bfdc93a240a8415dc061d127583
Author: Roshani Nagmote 
AuthorDate: Sun Jan 13 23:06:30 2019 -0800

adding tolerance to flaky test (#13850)

* adding tolerance

* retrigger ci

* retrigger ci
---
 tests/python-pytest/onnx/test_node.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tests/python-pytest/onnx/test_node.py 
b/tests/python-pytest/onnx/test_node.py
index 18e..25fe9c9 100644
--- a/tests/python-pytest/onnx/test_node.py
+++ b/tests/python-pytest/onnx/test_node.py
@@ -204,7 +204,7 @@ class TestNode(unittest.TestCase):
 onnx_model = get_onnx_graph(test_name, names, input_tensors, 
onnx_name, output_shape, attrs)
 bkd_rep = backend.prepare(onnx_model, operation='import')
 mxnet_out = bkd_rep.run(inputs)
-npt.assert_almost_equal(np_out, mxnet_out)
+npt.assert_almost_equal(np_out, mxnet_out, decimal=4)
 
 # test_case = ("test_case_name", mxnet op, "ONNX_op_name", [input_list], 
attribute map, MXNet_specific=True/False,
 # fix_attributes = {'modify': {mxnet_attr_name: onnx_attr_name},
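For readers unfamiliar with the tolerance argument: `numpy.testing.assert_almost_equal(actual, desired, decimal=4)` accepts absolute differences up to roughly 1.5e-4, which is what lets the comparison above ride out small numerical noise. A tiny illustration, not part of the commit:

```python
import numpy.testing as npt

npt.assert_almost_equal(1.00001, 1.0, decimal=4)    # passes, diff is 1e-5
# npt.assert_almost_equal(1.001, 1.0, decimal=4)    # would raise AssertionError
```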



[incubator-mxnet] branch master updated (8ece68c -> ed92b8d)

2019-01-09 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 8ece68c  whitelist symbols for using MXNet error handling externally 
(#13812)
 add ed92b8d  fix for params with no dims in onnx (#13413)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/contrib/onnx/onnx2mx/import_onnx.py |  9 +++--
 tests/python-pytest/onnx/test_models.py  | 11 +++
 2 files changed, 18 insertions(+), 2 deletions(-)



[incubator-mxnet] branch master updated: Update CODEOWNERS, add Pedro Larroy. (#13579)

2018-12-18 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 5e46db4  Update CODEOWNERS, add Pedro Larroy. (#13579)
5e46db4 is described below

commit 5e46db4c84ebbbdba0d375adf134e7e608c54676
Author: Pedro Larroy 
AuthorDate: Tue Dec 18 17:04:26 2018 +

Update CODEOWNERS, add Pedro Larroy. (#13579)
---
 CODEOWNERS | 22 +-
 1 file changed, 13 insertions(+), 9 deletions(-)

diff --git a/CODEOWNERS b/CODEOWNERS
index 5a88e89..ce648ef 100644
--- a/CODEOWNERS
+++ b/CODEOWNERS
@@ -13,14 +13,14 @@
 
 # Language bindings
 /R-package/@thirdwing
-/scala-package/@yzhliu @nswamy
+/scala-package/@yzhliu @nswamy @pllarroy
 /perl-package/ @sergeykolychev
-/python/   @szha
+/python/   @szha @pllarroy
 /contrib/clojure-package/  @gigasquid
 
 # C++ base
 /src/kvstore/ @rahul003 @anirudh2290
-/include/ @anirudh2290
+/include/ @anirudh2290 @pllarroy
 /src/c_api/   @anirudh2290
 /src/common/  @anirudh2290
 /src/engine/  @anirudh2290
@@ -33,13 +33,17 @@
 /src/profiler/@anirudh2290
 /src/storage/ @anirudh2290
 /tests/cpp/   @anirudh2290
-/cpp-package/ @nswamy
+/cpp-package/ @nswamy @pllarroy
+/src/ @pllarroy
+/plugin/  @pllarroy
 
 # CMake
-CMakeLists.txt@szha @rahul003
-/cmake/   @szha @rahul003
+CMakeLists.txt@szha @rahul003 @pllarroy
+/cmake/   @szha @rahul003 @pllarroy
 
 # MXNet CI
+dev_menu.py @pllarroy
+/ci/@pllarroy
 /tests/ci_build/@marcoabreu
 Jenkinsfile @marcoabreu
 .travis.yml @marcoabreu
@@ -50,16 +54,16 @@ Makefile  @szha
 prepare_mkl.sh@szha
 
 # Docs
-/docs/@szha
+/docs/@szha @pllarroy
 
 # Submodules
 .gitmodules   @szha
 
 # Examples
-/example/ @szha
+/example/ @szha @pllarroy
 
 # Tools
-/tools/   @szha
+/tools/   @szha @pllarroy
 
 # Github templates
 /.github/ @szha



[incubator-mxnet] branch master updated: Fix warning in waitall doc (#13618)

2018-12-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 9ce7eab  Fix warning in waitall doc (#13618)
9ce7eab is described below

commit 9ce7eabcbc9575128240f71f79f9f7cce1a19aa7
Author: Anirudh Subramanian 
AuthorDate: Tue Dec 11 17:22:02 2018 -0800

Fix warning in waitall doc (#13618)
---
 python/mxnet/ndarray/ndarray.py | 10 ++
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/python/mxnet/ndarray/ndarray.py b/python/mxnet/ndarray/ndarray.py
index 4e6d0cd..9a62620 100644
--- a/python/mxnet/ndarray/ndarray.py
+++ b/python/mxnet/ndarray/ndarray.py
@@ -157,11 +157,13 @@ def waitall():
 """Wait for all async operations to finish in MXNet.
 
 This function is used for benchmarking only.
+
 .. warning::
-If your code has exceptions, `waitall` can cause silent failures.
-For this reason you should avoid `waitall` in your code.
-Use it only if you are confident that your code is error free.
-Then make sure you call `wait_to_read` on all outputs after `waitall`.
+
+   If your code has exceptions, `waitall` can cause silent failures.
+   For this reason you should avoid `waitall` in your code.
+   Use it only if you are confident that your code is error free.
+   Then make sure you call `wait_to_read` on all outputs after `waitall`.
 """
 check_call(_LIB.MXNDArrayWaitAll())
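A short hedged sketch of the pattern the warning recommends, waiting on specific outputs rather than relying on `waitall` (illustrative arrays, not from the commit):

```python
import mxnet as mx

a = mx.nd.ones((2, 3))
b = a * 2 + 1            # queued asynchronously on the MXNet engine

# Preferred: block on the concrete output so an error in the computation
# surfaces here instead of being hidden behind a bare waitall().
b.wait_to_read()
print(b.asnumpy())

# Only if you are confident the code is error free:
mx.nd.waitall()
```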
 



[incubator-mxnet] branch master updated: add cpp example inception to nightly test (#13534)

2018-12-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f2ca66f  add cpp example inception to nightly test (#13534)
f2ca66f is described below

commit f2ca66f2c537783aa60251080582793f42f395a7
Author: Lai Wei 
AuthorDate: Fri Dec 7 19:57:54 2018 -0800

add cpp example inception to nightly test (#13534)

* add inception test

* fix max iter for mlp

* rename and add comment

* rename epoch num
---
 cpp-package/example/mlp.cpp  | 10 +-
 cpp-package/tests/ci_test.sh |  3 +++
 2 files changed, 8 insertions(+), 5 deletions(-)

diff --git a/cpp-package/example/mlp.cpp b/cpp-package/example/mlp.cpp
index 595d75c..cc16f53 100644
--- a/cpp-package/example/mlp.cpp
+++ b/cpp-package/example/mlp.cpp
@@ -144,13 +144,13 @@ void MLP() {
grad_req_type, aux_states);
 
   std::cout << "Training" << std::endl;
-  int max_iters = 2;
+  int max_epoch = 15000;
   mx_float learning_rate = 0.0001;
-  for (int iter = 0; iter < max_iters; ++iter) {
+  for (int epoch_num = 0; epoch_num < max_epoch; ++epoch_num) {
 exe->Forward(true);
-
-if (iter % 100 == 0) {
-  std::cout << "epoch " << iter << std::endl;
+// print accuracy every 100 epoch
+if (epoch_num % 100 == 0) {
+  std::cout << "epoch " << epoch_num << std::endl;
   std::vector<NDArray>& out = exe->outputs;
   float* cptr = new float[128 * 10];
   out[0].SyncCopyToCPU(cptr, 128 * 10);
diff --git a/cpp-package/tests/ci_test.sh b/cpp-package/tests/ci_test.sh
index 7674e2d..4a17d8d 100755
--- a/cpp-package/tests/ci_test.sh
+++ b/cpp-package/tests/ci_test.sh
@@ -36,6 +36,9 @@ cp ../../build/cpp-package/example/lenet_with_mxdataiter .
 cp ../../build/cpp-package/example/resnet .
 ./resnet 5
 
+cp ../../build/cpp-package/example/inception_bn .
+./inception_bn 5
+
 cp ../../build/cpp-package/example/mlp .
 ./mlp
 



[incubator-mxnet] branch master updated (95f1e1c -> 7d2b804)

2018-12-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 95f1e1c  fix link for gluon model zoo (#13583)
 add 7d2b804  Fix exception handling api doc (#13519)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/ndarray/ndarray.py | 5 +
 1 file changed, 5 insertions(+)



[incubator-mxnet] branch master updated: fix link for gluon model zoo (#13583)

2018-12-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 95f1e1c  fix link for gluon model zoo (#13583)
95f1e1c is described below

commit 95f1e1c51a38d34e62baa08975c7fc3548ae82e0
Author: Steffen Rochel 
AuthorDate: Fri Dec 7 19:41:06 2018 -0800

fix link for gluon model zoo (#13583)
---
 docs/community/ecosystem.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/community/ecosystem.md b/docs/community/ecosystem.md
index 54f8c89..100ed97 100644
--- a/docs/community/ecosystem.md
+++ b/docs/community/ecosystem.md
@@ -62,7 +62,7 @@ Community contributions to MXNet have added many new valuable 
features and funct
 
 ## Model Zoos
 
-* [Gluon Model Zoo](https://github.com/awslabs/mxnet-model-server) - models 
trained in Gluon and available through Gluon's model zoo API.
+* [Gluon Model 
Zoo](https://mxnet.incubator.apache.org/api/python/gluon/model_zoo.html) - 
models trained in Gluon and available through Gluon's model zoo API.
 * [ONNX Model Zoo](https://github.com/onnx/models) - ONNX models from a 
variety of ONNX-supported frameworks.
 
 



[incubator-mxnet] branch master updated: ONNX import/export: Size (#13112)

2018-12-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 636933d  ONNX import/export: Size (#13112)
636933d is described below

commit 636933d424d789661d9e954ebfb569e1a2945a78
Author: Vandana Kannan 
AuthorDate: Fri Dec 7 19:39:47 2018 -0800

ONNX import/export: Size (#13112)
---
 python/mxnet/contrib/onnx/mx2onnx/_op_translations.py | 8 
 python/mxnet/contrib/onnx/onnx2mx/_import_helper.py   | 3 ++-
 python/mxnet/contrib/onnx/onnx2mx/_op_translations.py | 4 
 tests/python-pytest/onnx/export/onnx_backend_test.py  | 3 ++-
 tests/python-pytest/onnx/import/test_cases.py | 3 ++-
 5 files changed, 18 insertions(+), 3 deletions(-)

diff --git a/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py 
b/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
index 0f4b448..0d20c76 100644
--- a/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
+++ b/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
@@ -1647,3 +1647,11 @@ def convert_logical_not(node, **kwargs):
 and return the created node.
 """
 return create_basic_op_node('Not', node, kwargs)
+
+
+@mx_op.register("size_array")
+def convert_size(node, **kwargs):
+"""Map MXNet's size_array operator attributes to onnx's Size operator
+and return the created node.
+"""
+return create_basic_op_node('Size', node, kwargs)
diff --git a/python/mxnet/contrib/onnx/onnx2mx/_import_helper.py 
b/python/mxnet/contrib/onnx/onnx2mx/_import_helper.py
index f61910f..2ceabae 100644
--- a/python/mxnet/contrib/onnx/onnx2mx/_import_helper.py
+++ b/python/mxnet/contrib/onnx/onnx2mx/_import_helper.py
@@ -21,7 +21,7 @@
 from ._op_translations import identity, random_uniform, random_normal
 from ._op_translations import add, subtract, multiply, divide, absolute, 
negative, add_n
 from ._op_translations import tanh, arccos, arcsin, arctan, _cos, _sin, _tan
-from ._op_translations import softplus, shape, gather, lp_pooling
+from ._op_translations import softplus, shape, gather, lp_pooling, size
 from ._op_translations import ceil, floor, hardsigmoid, global_lppooling
 from ._op_translations import concat
 from ._op_translations import leaky_relu, _elu, _prelu, _selu, softmax, 
fully_connected
@@ -139,6 +139,7 @@ _convert_map = {
 'Softplus'  : softplus,
 'Tan'   : _tan,
 'Shape' : shape,
+'Size'  : size,
 'Gather': gather,
 'HardSigmoid'   : hardsigmoid,
 'LpPool': lp_pooling,
diff --git a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py 
b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
index 368b98d..7028325 100644
--- a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
+++ b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
@@ -642,6 +642,10 @@ def shape(attrs, inputs, proto_obj):
 """Returns shape of input array."""
 return 'shape_array', attrs, inputs
 
+def size(attrs, inputs, proto_obj):
+"""Returns array containing size of data."""
+return "size_array", attrs, inputs
+
 def reduce_l2(attrs, inputs, proto_obj):
 """Reduce input tensor by l2 normalization."""
 new_attrs = translation_utils._fix_attribute_names(attrs, {'axes':'axis'})
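Stepping back from the diff: the new `'Size'` entry maps the ONNX Size operator onto MXNet's `size_array`. A quick hedged illustration of what that operator returns, assuming an MXNet build that ships `nd.size_array`:

```python
import mxnet as mx

x = mx.nd.ones((3, 4, 5))
# size_array returns a one-element array holding the total element count,
# which is exactly what ONNX Size is defined to produce.
print(mx.nd.size_array(x).asnumpy())   # expected: [60]
```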
diff --git a/tests/python-pytest/onnx/export/onnx_backend_test.py 
b/tests/python-pytest/onnx/export/onnx_backend_test.py
index be9273e..c9926c4 100644
--- a/tests/python-pytest/onnx/export/onnx_backend_test.py
+++ b/tests/python-pytest/onnx/export/onnx_backend_test.py
@@ -97,7 +97,8 @@ IMPLEMENTED_OPERATORS_TEST = [
 'test_depthtospace',
 'test_hardsigmoid',
 'test_instancenorm',
-'test_shape'
+'test_shape',
+'test_size'
 ]
 
 BASIC_MODEL_TESTS = [
diff --git a/tests/python-pytest/onnx/import/test_cases.py 
b/tests/python-pytest/onnx/import/test_cases.py
index f41fe92..e0b26cc 100644
--- a/tests/python-pytest/onnx/import/test_cases.py
+++ b/tests/python-pytest/onnx/import/test_cases.py
@@ -85,7 +85,8 @@ IMPLEMENTED_OPERATORS_TEST = [
 'test_operator_maxpool',
 'test_operator_params',
 'test_operator_permute2',
-'test_depthtospace'
+'test_depthtospace',
+'test_size'
 ]
 
 BASIC_MODEL_TESTS = [



[incubator-mxnet] branch master updated (4f61c32 -> 7d74452)

2018-12-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 4f61c32  License update  (#13565)
 add 7d74452  Fix use-before-assignment in convert_dot (#13511)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/contrib/onnx/mx2onnx/_op_translations.py | 2 ++
 1 file changed, 2 insertions(+)



[incubator-mxnet] branch v1.4.x updated: Add resiliency to onnx export code (#13426) (#13567)

2018-12-06 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.4.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.4.x by this push:
 new 2d08816  Add resiliency to onnx export code (#13426) (#13567)
2d08816 is described below

commit 2d08816c393e4172e89aa265493fb2a40111d39f
Author: Sina Afrooze 
AuthorDate: Thu Dec 6 18:06:36 2018 -0800

Add resiliency to onnx export code (#13426) (#13567)

* Added resiliency to onnx export code

- With previous infer-shape implementation, if input shape was list instead 
of tuple or if extra non-existent parameters were provided, the code would 
still work. The fixes in this commit make sure that behavior is restored to 
prevent any compatibility issues with existing export code.

* Fixed name of net in unittest

* Fix pylint
---
 python/mxnet/contrib/onnx/mx2onnx/export_onnx.py|  5 +++--
 .../python-pytest/onnx/export/mxnet_export_test.py  | 21 +++--
 2 files changed, 22 insertions(+), 4 deletions(-)

diff --git a/python/mxnet/contrib/onnx/mx2onnx/export_onnx.py 
b/python/mxnet/contrib/onnx/mx2onnx/export_onnx.py
index 14c674f..84db5de 100644
--- a/python/mxnet/contrib/onnx/mx2onnx/export_onnx.py
+++ b/python/mxnet/contrib/onnx/mx2onnx/export_onnx.py
@@ -134,9 +134,10 @@ class MXNetGraph(object):
 # remove any input listed in params from sym.list_inputs() and bind 
them to the input shapes provided
 # by user. Also remove in_label, which is the name of the label symbol 
that may have been used
 # as the label for loss during training.
-inputs = {n: s for n, s in zip([n for n in sym.list_inputs() if n not 
in params and n != in_label], in_shape)}
+inputs = {n: tuple(s) for n, s in zip([n for n in sym.list_inputs() if 
n not in params and n != in_label],
+  in_shape)}
 # Add params and their shape to list of inputs
-inputs.update({n: v.shape for n, v in params.items()})
+inputs.update({n: v.shape for n, v in params.items() if n in 
sym.list_inputs()})
 # Provide input data as well as input params to infer_shape()
 _, out_shapes, _ = sym.infer_shape(**inputs)
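As a minimal illustration of the normalization in the hunk above, shown with a hypothetical two-line symbol that is not part of the commit: caller-supplied list shapes are converted to tuples before `Symbol.infer_shape` is called, so the exporter behaves the same whether the user passed lists or tuples:

```python
import mxnet as mx

data = mx.sym.Variable('data')
net = mx.sym.FullyConnected(data, num_hidden=10, name='fc1')

in_shape = [[1, 1024]]                     # user passed a list, not a tuple
inputs = {'data': tuple(in_shape[0])}      # normalize, as the fix above does
_, out_shapes, _ = net.infer_shape(**inputs)
print(out_shapes)                           # expected: [(1, 10)]
```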
 
diff --git a/tests/python-pytest/onnx/export/mxnet_export_test.py 
b/tests/python-pytest/onnx/export/mxnet_export_test.py
index f4144fd6..964d0e7 100644
--- a/tests/python-pytest/onnx/export/mxnet_export_test.py
+++ b/tests/python-pytest/onnx/export/mxnet_export_test.py
@@ -286,18 +286,19 @@ def _optional_group(symbols, group=False):
 return symbols
 
 
-def _check_onnx_export(net, group_outputs=False):
+def _check_onnx_export(net, group_outputs=False, shape_type=tuple, 
extra_params={}):
 net.initialize()
 data = nd.random.uniform(0, 1, (1, 1024))
 output = _force_list(net(data))  # initialize weights
 net_sym = _optional_group(net(sym.Variable('data')), group_outputs)
 net_params = {name:param._reduce() for name, param in 
net.collect_params().items()}
+net_params.update(extra_params)
 with tempfile.TemporaryDirectory() as tmpdirname:
 onnx_file_path = os.path.join(tmpdirname, 'net.onnx')
 export_path = onnx_mxnet.export_model(
 sym=net_sym,
 params=net_params,
-input_shape=[data.shape],
+input_shape=[shape_type(data.shape)],
 onnx_file_path=onnx_file_path)
 assert export_path == onnx_file_path
 # Try importing the model to symbol
@@ -340,6 +341,22 @@ def test_onnx_export_multi_output():
 _check_onnx_export(net, group_outputs=True)
 
 
+@with_seed()
+def test_onnx_export_list_shape():
+net = nn.HybridSequential(prefix='list_shape_net')
+with net.name_scope():
+net.add(nn.Dense(100, activation='relu'), nn.Dense(10))
+_check_onnx_export(net, shape_type=list)
+
+
+@with_seed()
+def test_onnx_export_extra_params():
+net = nn.HybridSequential(prefix='extra_params_net')
+with net.name_scope():
+net.add(nn.Dense(100, activation='relu'), nn.Dense(10))
+_check_onnx_export(net, extra_params={'extra_param': nd.array([1, 2])})
+
+
 if __name__ == '__main__':
 test_models("bvlc_googlenet", (1, 3, 224, 224), (1, 1000))
 test_models("bvlc_reference_caffenet", (1, 3, 224, 224), (1, 1000))



[incubator-mxnet] branch master updated (f390f0c -> 9c0d173)

2018-12-06 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from f390f0c  Adding test for softmaxoutput (#13116)
 add 9c0d173  Add workspace cleaning after job finished (#13490)

No new revisions were added by this update.

Summary of changes:
 ci/Jenkinsfile_utils.groovy | 5 -
 1 file changed, 4 insertions(+), 1 deletion(-)



[incubator-mxnet] branch master updated (b684c65 -> 7d44deb)

2018-12-02 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from b684c65  #13453 [Clojure] - Add Spec Validations to the Optimizer 
namespace (#13499)
 add 7d44deb  ONNX export: Logical operators (#12852)

No new revisions were added by this update.

Summary of changes:
 .../mxnet/contrib/onnx/mx2onnx/_op_translations.py | 32 ++
 .../python-pytest/onnx/export/mxnet_export_test.py | 39 ++
 tests/python-pytest/onnx/import/test_cases.py  |  1 -
 3 files changed, 71 insertions(+), 1 deletion(-)



[incubator-mxnet] branch v1.4.x updated: [MXNET-1158] JVM Memory Management Documentation (#13105) (#13494)

2018-11-30 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.4.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.4.x by this push:
 new 5a2b7cf  [MXNET-1158] JVM Memory Management Documentation (#13105) 
(#13494)
5a2b7cf is described below

commit 5a2b7cf7f4bc16e82f34ae25cb9adc42c8696953
Author: Naveen Swamy 
AuthorDate: Fri Nov 30 12:06:32 2018 -0800

[MXNET-1158] JVM Memory Management Documentation (#13105) (#13494)

* update train_mnist

* Add documentation for JVM Memory Management

* update doc

* address nit picks

* address nit picks

* Grammar and clarity edits for memory management doc

* Edits for scala memory management

* Update memory-management.md

* Update memory-management.md

* Update memory-management.md

* capitalization fix
---
 scala-package/examples/scripts/run_train_mnist.sh |  24 -
 scala-package/memory-management.md| 118 ++
 2 files changed, 138 insertions(+), 4 deletions(-)

diff --git a/scala-package/examples/scripts/run_train_mnist.sh 
b/scala-package/examples/scripts/run_train_mnist.sh
index ea53c1a..d27b7cb 100755
--- a/scala-package/examples/scripts/run_train_mnist.sh
+++ b/scala-package/examples/scripts/run_train_mnist.sh
@@ -19,15 +19,31 @@
 
 set -e
 
+hw_type=cpu
+if [[ $1 = gpu ]]
+then
+hw_type=gpu
+fi
+
+platform=linux-x86_64
+
+if [[ $OSTYPE = [darwin]* ]]
+then
+platform=osx-x86_64
+hw_type=cpu
+fi
+
 MXNET_ROOT=$(cd "$(dirname $0)/../../.."; pwd)
 echo $MXNET_ROOT
-CLASS_PATH=$MXNET_ROOT/scala-package/assembly/linux-x86_64-cpu/target/*:$MXNET_ROOT/scala-package/examples/target/*:$MXNET_ROOT/scala-package/examples/target/classes/lib/*:$MXNET_ROOT/scala-package/infer/target/*
+CLASS_PATH=$MXNET_ROOT/scala-package/assembly/$platform-$hw_type/target/*:$MXNET_ROOT/scala-package/examples/target/*:$MXNET_ROOT/scala-package/examples/target/classes/lib/*
 
 # model dir
 DATA_PATH=$2
 
-java -XX:+PrintGC -Xms256M -Xmx512M -Dmxnet.traceLeakedObjects=false -cp 
$CLASS_PATH \
-org.apache.mxnetexamples.imclassification.TrainMnist \
---data-dir /home/ubuntu/mxnet_scala/scala-package/examples/mnist/ \
+java -XX:+PrintGC -Dmxnet.traceLeakedObjects=false -cp $CLASS_PATH \
+org.apache.mxnetexamples.imclassification.TrainModel \
+--data-dir $MXNET_ROOT/scala-package/examples/mnist/ \
+--network mlp \
+--num-layers 50 \
 --num-epochs 1000 \
 --batch-size 1024
\ No newline at end of file
diff --git a/scala-package/memory-management.md 
b/scala-package/memory-management.md
new file mode 100644
index 000..33c36b6
--- /dev/null
+++ b/scala-package/memory-management.md
@@ -0,0 +1,118 @@
+# JVM Memory Management
+The Scala and Java bindings of Apache MXNet use native memory (memory from the 
C++ heap in either RAM or GPU memory) for most of the MXNet objects such as 
NDArray, Symbol, Executor, KVStore, Data Iterators, etc.
+The associated Scala classes act only as wrappers. The operations done on 
these wrapper objects are then directed to the high performance MXNet C++ 
backend via the Java Native Interface (JNI). Therefore, the bytes are stored in 
the C++ native heap which allows for fast access.
+
+However, the JVM Garbage Collector only manages objects allocated in the JVM 
Heap and is not aware of the memory footprint of these objects in the native 
memory. Hence, the allocation/deallocation of native memory must be managed by 
MXNet Scala.
+Allocating native memory is straight forward and is done during the 
construction of the object by calling the associated C++ API through JNI. 
However, since JVM languages do not have destructors, the deallocation of these 
objects must be done explicitly.
+MXNet Scala provides a few easy modes of operation which are explained in 
detail below.
+
+## Memory Management in Scala 
+### 1.  
[ResourceScope.using](https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/org/apache/mxnet/ResourceScope.scala#L106)
 (Recommended)
+`ResourceScope.using` provides the familiar Java try-with-resources primitive 
in Scala and will automatically manage the memory of all the MXNet objects 
created in the associated code block (`body`). It works by tracking the 
allocations performed inside the code block deallocating when exiting the 
block. 
+Passing MXNet objects out of a using block can be easily accomplished by 
simply returning an object or an iterable containing multiple MXNet objects. If 
you have nested using blocks, then the returned objects will be moved into the 
parent scope as well.
+
+**Usage** 
+```scala
+ResourceScope.using() {
+ResourceScope.using() {
+val r1 = NDArray.ones(Shape(2, 2))
+val r2 = NDArray.ones(Shape(3, 4))
+val r3 = NDArray.on

[incubator-mxnet] branch master updated: [MXNET-1158] JVM Memory Management Documentation (#13105)

2018-11-30 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 55acf56  [MXNET-1158] JVM Memory Management Documentation (#13105)
55acf56 is described below

commit 55acf569da0eddef61ff7d7b0a042b7e3781847e
Author: Naveen Swamy 
AuthorDate: Fri Nov 30 10:54:12 2018 -0800

[MXNET-1158] JVM Memory Management Documentation (#13105)

* update train_mnist

* Add documentation for JVM Memory Management

* update doc

* address nit picks

* address nit picks

* Grammar and clarity edits for memory management doc

* Edits for scala memory management

* Update memory-management.md

* Update memory-management.md

* Update memory-management.md

* capitalization fix
---
 scala-package/examples/scripts/run_train_mnist.sh |  24 -
 scala-package/memory-management.md| 118 ++
 2 files changed, 138 insertions(+), 4 deletions(-)

diff --git a/scala-package/examples/scripts/run_train_mnist.sh 
b/scala-package/examples/scripts/run_train_mnist.sh
index ea53c1a..d27b7cb 100755
--- a/scala-package/examples/scripts/run_train_mnist.sh
+++ b/scala-package/examples/scripts/run_train_mnist.sh
@@ -19,15 +19,31 @@
 
 set -e
 
+hw_type=cpu
+if [[ $1 = gpu ]]
+then
+hw_type=gpu
+fi
+
+platform=linux-x86_64
+
+if [[ $OSTYPE = [darwin]* ]]
+then
+platform=osx-x86_64
+hw_type=cpu
+fi
+
 MXNET_ROOT=$(cd "$(dirname $0)/../../.."; pwd)
 echo $MXNET_ROOT
-CLASS_PATH=$MXNET_ROOT/scala-package/assembly/linux-x86_64-cpu/target/*:$MXNET_ROOT/scala-package/examples/target/*:$MXNET_ROOT/scala-package/examples/target/classes/lib/*:$MXNET_ROOT/scala-package/infer/target/*
+CLASS_PATH=$MXNET_ROOT/scala-package/assembly/$platform-$hw_type/target/*:$MXNET_ROOT/scala-package/examples/target/*:$MXNET_ROOT/scala-package/examples/target/classes/lib/*
 
 # model dir
 DATA_PATH=$2
 
-java -XX:+PrintGC -Xms256M -Xmx512M -Dmxnet.traceLeakedObjects=false -cp 
$CLASS_PATH \
-org.apache.mxnetexamples.imclassification.TrainMnist \
---data-dir /home/ubuntu/mxnet_scala/scala-package/examples/mnist/ \
+java -XX:+PrintGC -Dmxnet.traceLeakedObjects=false -cp $CLASS_PATH \
+org.apache.mxnetexamples.imclassification.TrainModel \
+--data-dir $MXNET_ROOT/scala-package/examples/mnist/ \
+--network mlp \
+--num-layers 50 \
 --num-epochs 1000 \
 --batch-size 1024
\ No newline at end of file
diff --git a/scala-package/memory-management.md 
b/scala-package/memory-management.md
new file mode 100644
index 000..33c36b6
--- /dev/null
+++ b/scala-package/memory-management.md
@@ -0,0 +1,118 @@
+# JVM Memory Management
+The Scala and Java bindings of Apache MXNet use native memory (memory from the 
C++ heap in either RAM or GPU memory) for most of the MXNet objects such as 
NDArray, Symbol, Executor, KVStore, Data Iterators, etc.
+The associated Scala classes act only as wrappers. The operations done on 
these wrapper objects are then directed to the high performance MXNet C++ 
backend via the Java Native Interface (JNI). Therefore, the bytes are stored in 
the C++ native heap which allows for fast access.
+
+However, the JVM Garbage Collector only manages objects allocated in the JVM 
Heap and is not aware of the memory footprint of these objects in the native 
memory. Hence, the allocation/deallocation of native memory must be managed by 
MXNet Scala.
+Allocating native memory is straight forward and is done during the 
construction of the object by calling the associated C++ API through JNI. 
However, since JVM languages do not have destructors, the deallocation of these 
objects must be done explicitly.
+MXNet Scala provides a few easy modes of operation which are explained in 
detail below.
+
+## Memory Management in Scala 
+### 1.  
[ResourceScope.using](https://github.com/apache/incubator-mxnet/blob/master/scala-package/core/src/main/scala/org/apache/mxnet/ResourceScope.scala#L106)
 (Recommended)
+`ResourceScope.using` provides the familiar Java try-with-resources primitive 
in Scala and will automatically manage the memory of all the MXNet objects 
created in the associated code block (`body`). It works by tracking the 
allocations performed inside the code block deallocating when exiting the 
block. 
+Passing MXNet objects out of a using block can be easily accomplished by 
simply returning an object or an iterable containing multiple MXNet objects. If 
you have nested using blocks, then the returned objects will be moved into the 
parent scope as well.
+
+**Usage** 
+```scala
+ResourceScope.using() {
+ResourceScope.using() {
+val r1 = NDArray.ones(Shape(2, 2))
+val r2 = NDArray.ones(Shape(3, 4))
+val r3 = NDArray.ones(Shape(5, 6))
+

svn commit: r31177 - /dev/incubator/mxnet/1.3.1.rc0/

2018-11-27 Thread nswamy
Author: nswamy
Date: Tue Nov 27 22:55:01 2018
New Revision: 31177

Log:
remove 1.3.1.rc0 and move to releases repo

Removed:
dev/incubator/mxnet/1.3.1.rc0/



svn commit: r31176 - in /release/incubator/mxnet/1.3.1: ./ apache-mxnet-src-1.3.1-incubating.tar.gz apache-mxnet-src-1.3.1-incubating.tar.gz.asc apache-mxnet-src-1.3.1-incubating.tar.gz.sha512

2018-11-27 Thread nswamy
Author: nswamy
Date: Tue Nov 27 22:53:58 2018
New Revision: 31176

Log:
mxnet 1.3.1

Added:
release/incubator/mxnet/1.3.1/
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz   
(with props)
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.asc

release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.sha512

Added: release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz
==
Binary file - no diff available.

Propchange: 
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.asc
==
--- release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.asc 
(added)
+++ release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.asc 
Tue Nov 27 22:53:58 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEEqj68w+Zadorj0qZLjvR7hyDoxUkFAlv9yrgACgkQjvR7hyDo
+xUl6TA/7BUi/XrAl/tInEF1vZANmNVpG8Z8fsosAMy4t/Gk+ZAUqQM7Cx5KPwMLZ
+Z9TUNvN+6sxUKCiMpNmn4G+6vgIKjLqdPKawtb/dUxdWFQNalPTx4ktPS/cUUuDy
+XlpuKgUhUT+/2Ay/+S+lzVumbIY2DoA/E9NZwc7cAEdvHvHYhK8OZXIgJc49O360
+Tmkpl1bB91wS+sK+6L0SOac5VkvQNf+TLceKY7LD9q/lAEq1ZKQICi4hjfdiJJKE
+ouT9gWnj9DX6MF/P9zdPCelJrl6M/aMPtC/3ossZE/0dUaMTFHfzEXr7/l8tiBZ4
+zelzu3nSzDtiebv/Decy6DKL654aAMnjECmVuTfFGxjE/Ud+CMZFGLhl5ZqBK9Hl
+cJCJFxdVl+DeV6j2NmvKs342k+3kRWZUilPHekYtsUDGcjzpJaOh20po4X/HKcD8
+ok8UCqo6U5qAMGc1gkEedfWAGNHtVojatEqEINRF4TLdORZvO/rh+vpCgDJW1PHe
+2hEXUTdQNyjO/VoKTr9o5MQSbi50sgdfrDVy69y4eZTgMQXSuKJo5AhZCY2gJwzO
+OHgl2qbGlXlLJ01FqbrYf9ZfcIxxZnfj6m0ww1XiIAq+xD9Bt46lwfr15K15ecF0
+HCulX0hKWtr44j+PczSbSAHmurlY8HJpRrId6e/szdC95N3VEX4=
+=GVZj
+-END PGP SIGNATURE-

Added: 
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.sha512
==
--- 
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.sha512 
(added)
+++ 
release/incubator/mxnet/1.3.1/apache-mxnet-src-1.3.1-incubating.tar.gz.sha512 
Tue Nov 27 22:53:58 2018
@@ -0,0 +1 @@
+916b27684c1fad611c3212c434c87219c7eeaf41b9fb3eaf26284ac6c51e633a73f81e0e6a60a256b68bb77503efd1b7c81ff8ae8a14da32dbae3dfd2b1d703d
  apache-mxnet-src-1.3.1-incubating.tar.gz




[incubator-mxnet] branch master updated (7542b2b -> 2d1c627)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 7542b2b  Java demo file-path fix (#13358)
 add 2d1c627  Updated README and NEWS with 1.3.1 release information 
(#13423)

No new revisions were added by this update.

Summary of changes:
 NEWS.md   | 141 --
 README.md |   1 +
 2 files changed, 120 insertions(+), 22 deletions(-)



[incubator-mxnet] branch master updated (8a94dbd -> 7542b2b)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 8a94dbd  [MXNET-1029] Feature request: randint operator (#12749)
 add 7542b2b  Java demo file-path fix (#13358)

No new revisions were added by this update.

Summary of changes:
 scala-package/mxnet-demo/java-demo/Makefile|  9 ++--
 scala-package/mxnet-demo/java-demo/README.md   | 52 +-
 .../mxnet-demo/java-demo/bin/java_sample.sh|  4 +-
 scala-package/mxnet-demo/java-demo/bin/run_od.sh   |  5 +--
 scala-package/mxnet-demo/java-demo/pom.xml | 15 ++-
 .../main/java/{sample => mxnet}/HelloWorld.java|  2 +-
 .../java/{sample => mxnet}/ObjectDetection.java|  4 +-
 7 files changed, 53 insertions(+), 38 deletions(-)
 rename scala-package/mxnet-demo/java-demo/src/main/java/{sample => 
mxnet}/HelloWorld.java (98%)
 rename scala-package/mxnet-demo/java-demo/src/main/java/{sample => 
mxnet}/ObjectDetection.java (98%)



[incubator-mxnet] branch master updated: [Example]Fix mlp_csv example (#13273)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 4f8aa09  [Example]Fix mlp_csv example (#13273)
4f8aa09 is described below

commit 4f8aa092a6fdaa98fe6f1afac24eabb2c98f5e2a
Author: Jake Lee 
AuthorDate: Tue Nov 27 10:32:17 2018 -0800

[Example]Fix mlp_csv example (#13273)

* add instruction to get the data and fix typo

* fix typo

* update file name

* trigger CI

* add unit_test for unit_test_mlp_csv

* add mlp_csv to jenkinsfile

* revert jenkinsfile to another PR

* trigger CI

* trigger CI
---
 cpp-package/example/README.md   | 8 +++-
 cpp-package/example/mlp_csv.cpp | 2 +-
 cpp-package/tests/ci_test.sh| 2 ++
 3 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 06ea17b..c7223e9 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -69,7 +69,13 @@ build/mlp_gpu
 The code implements a multilayer perceptron to train the MNIST data. The code 
demonstrates the use of the "SimpleBind"  C++ API and CSVIter. The CSVIter can 
iterate data that is in CSV format. The example can be run on CPU or GPU. The 
example usage is as follows:
 
 ```
-build/mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv 
--epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
+build/mlp_csv --train data/mnist_data/mnist_train.csv --test 
data/mnist_data/mnist_test.csv --epochs 10 --batch_size 100 --hidden_units "128 
64 64" --gpu
+```
+* To get the `mnist_training_set.csv` and `mnist_test_set.csv` please run the 
following command:
+```python
+# in incubator-mxnet/cpp-package/example directory
+python mnist_to_csv.py ./data/mnist_data/train-images-idx3-ubyte 
./data/mnist_data/train-labels-idx1-ubyte ./data/mnist_data/mnist_train.csv 
6
+python mnist_to_csv.py ./data/mnist_data/t10k-images-idx3-ubyte 
./data/mnist_data/t10k-labels-idx1-ubyte ./data/mnist_data/mnist_test.csv 1
 ```
 
 ### 
[resnet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/resnet.cpp>)
diff --git a/cpp-package/example/mlp_csv.cpp b/cpp-package/example/mlp_csv.cpp
index 8aec4b7..43a14c8 100644
--- a/cpp-package/example/mlp_csv.cpp
+++ b/cpp-package/example/mlp_csv.cpp
@@ -72,7 +72,7 @@ std::vector getLayers(const std::string& 
hidden_units_string) {
 void printUsage() {
 std::cout << "Usage:" << std::endl;
 std::cout << "mlp_csv --train mnist_training_set.csv --test 
mnist_test_set.csv --epochs 10 "
-<< "--batch_size 100 --hidden_units \"128 64 64\" [--gpu]" << std::endl;
+<< "--batch_size 100 --hidden_units \"128 64 64\" --gpu" << std::endl;
 std::cout << "The example uses mnist data in CSV format. The MNIST data in 
CSV format assumes "
 << "the column 0 to be label and the rest 784 column to be data." << 
std::endl;
 std::cout << "By default, the example uses 'cpu' context. If '--gpu' is 
specified, "
diff --git a/cpp-package/tests/ci_test.sh b/cpp-package/tests/ci_test.sh
index 57007f3..7674e2d 100755
--- a/cpp-package/tests/ci_test.sh
+++ b/cpp-package/tests/ci_test.sh
@@ -50,3 +50,5 @@ cp ../../build/cpp-package/example/mlp_gpu .
 
 cp ../../build/cpp-package/example/test_score .
 ./test_score 0.93
+
+sh unittests/unit_test_mlp_csv.sh



[incubator-mxnet] branch v1.3.x updated: Updated README and NEWS with 1.3.1 release information (#13422)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.3.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.3.x by this push:
 new 96b4b6e  Updated README and NEWS with 1.3.1 release information 
(#13422)
96b4b6e is described below

commit 96b4b6ef3c60c63644a7c4d672109b97561b839d
Author: Anton Chernov 
AuthorDate: Tue Nov 27 19:17:21 2018 +0100

Updated README and NEWS with 1.3.1 release information (#13422)
---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 23b9d32..369df9b 100644
--- a/README.md
+++ b/README.md
@@ -33,6 +33,7 @@ How to Contribute
 
 What's New
 --
+* [Version 1.3.1 
Release](https://github.com/apache/incubator-mxnet/releases/tag/1.3.1) - MXNet 
1.3.1 Patch Release.
 * [Version 1.3.0 
Release](https://github.com/apache/incubator-mxnet/releases/tag/1.3.0) - MXNet 
1.3.0 Release.
 * [Version 1.2.0 
Release](https://github.com/apache/incubator-mxnet/releases/tag/1.2.0) - MXNet 
1.2.0 Release.
 * [Version 1.1.0 
Release](https://github.com/apache/incubator-mxnet/releases/tag/1.1.0) - MXNet 
1.1.0 Release.



[incubator-mxnet] branch master updated: [Example] fix cpp example inception-bn and training acc issue (#13284)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new ab71205  [Example] fix cpp example inception-bn and training acc issue 
(#13284)
ab71205 is described below

commit ab712056b6f84ecb0f645e37ea8699df38965255
Author: Lai Wei 
AuthorDate: Tue Nov 27 09:48:25 2018 -0800

[Example] fix cpp example inception-bn and training acc issue (#13284)

* fix inception-bn and training acc issue

* add parameter initialization, fix lint

* fix comparison

* change optimizer to sgd

* update sgd and update model name

* add inception_bn in jenkins build

* make max epoch an argument

* remove inception_bn test

* trigger ci

* remove ci test

* trigger ci
---
 cpp-package/example/alexnet.cpp   |  2 +-
 cpp-package/example/charRNN.cpp   |  2 +-
 cpp-package/example/googlenet.cpp |  2 +-
 cpp-package/example/inception_bn.cpp  | 26 +++---
 cpp-package/example/lenet_with_mxdataiter.cpp |  2 +-
 cpp-package/example/resnet.cpp| 19 +++
 cpp-package/include/mxnet-cpp/symbol.hpp  |  4 ++--
 7 files changed, 40 insertions(+), 17 deletions(-)

diff --git a/cpp-package/example/alexnet.cpp b/cpp-package/example/alexnet.cpp
index a5f4952..7564d43 100644
--- a/cpp-package/example/alexnet.cpp
+++ b/cpp-package/example/alexnet.cpp
@@ -249,7 +249,7 @@ int main(int argc, char const *argv[]) {
   auto val_iter = MXDataIter("MNISTIter");
   setDataIter(&val_iter, "Label", data_files, batch_size);
 
-  Optimizer* opt = OptimizerRegistry::Find("ccsgd");
+  Optimizer* opt = OptimizerRegistry::Find("sgd");
   opt->SetParam("momentum", 0.9)
  ->SetParam("rescale_grad", 1.0 / batch_size)
  ->SetParam("clip_gradient", 10)
diff --git a/cpp-package/example/charRNN.cpp b/cpp-package/example/charRNN.cpp
index ad564f6..54b8eea 100644
--- a/cpp-package/example/charRNN.cpp
+++ b/cpp-package/example/charRNN.cpp
@@ -465,7 +465,7 @@ void train(const std::string file, int batch_size, int 
max_epoch, int start_epoc
 
   mx_float learning_rate = 0.0002;
   mx_float weight_decay = 0.02;
-  Optimizer* opt = OptimizerRegistry::Find("ccsgd");
+  Optimizer* opt = OptimizerRegistry::Find("sgd");
   opt->SetParam("lr", learning_rate)
  ->SetParam("wd", weight_decay);
 //  opt->SetParam("momentum", 0.9)->SetParam("rescale_grad", 1.0 / batch_size)
diff --git a/cpp-package/example/googlenet.cpp 
b/cpp-package/example/googlenet.cpp
index ad9212c..4bd3be2 100644
--- a/cpp-package/example/googlenet.cpp
+++ b/cpp-package/example/googlenet.cpp
@@ -144,7 +144,7 @@ int main(int argc, char const *argv[]) {
   auto val_iter = MXDataIter("MNISTIter");
   setDataIter(&val_iter, "Label", data_files, batch_size);
 
-  Optimizer* opt = OptimizerRegistry::Find("ccsgd");
+  Optimizer* opt = OptimizerRegistry::Find("sgd");
   opt->SetParam("momentum", 0.9)
  ->SetParam("rescale_grad", 1.0 / batch_size)
  ->SetParam("clip_gradient", 10)
diff --git a/cpp-package/example/inception_bn.cpp 
b/cpp-package/example/inception_bn.cpp
index c499df7..5b444e4 100644
--- a/cpp-package/example/inception_bn.cpp
+++ b/cpp-package/example/inception_bn.cpp
@@ -91,7 +91,8 @@ Symbol InceptionFactoryB(Symbol data, int num_3x3red, int 
num_3x3,
 Shape(1, 1), name + "_double_3x3_1");
   Symbol pooling = Pooling("max_pool_" + name + "_pool", data,
Shape(3, 3), PoolingPoolType::kMax,
-   false, false, PoolingPoolingConvention::kValid, 
Shape(2, 2));
+   false, false, PoolingPoolingConvention::kValid,
+   Shape(2, 2), Shape(1, 1));
   std::vector<Symbol> lst;
   lst.push_back(c3x3);
   lst.push_back(cd3x3);
@@ -143,8 +144,8 @@ Symbol InceptionSymbol(int num_classes) {
 
 int main(int argc, char const *argv[]) {
   int batch_size = 40;
-  int max_epoch = 100;
-  float learning_rate = 1e-4;
+  int max_epoch = argc > 1 ? strtol(argv[1], NULL, 10) : 100;
+  float learning_rate = 1e-2;
   float weight_decay = 1e-4;
 
   auto ctx = Context::gpu();
@@ -172,7 +173,13 @@ int main(int argc, char const *argv[]) {
   auto val_iter = MXDataIter("MNISTIter");
   setDataIter(&val_iter, "Label", data_files, batch_size);
 
-  Optimizer* opt = OptimizerRegistry::Find("ccsgd");
+  // initialize parameters
+  Xavier xavier = Xavier(Xavier::gaussian, Xavier::in, 2);
+  for (auto &arg

[incubator-mxnet] tag 1.3.1 created (now 19c5016)

2018-11-27 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to tag 1.3.1
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


  at 19c5016  (commit)
No new revisions were added by this update.



[incubator-mxnet] branch master updated: fix broken links and reorganize build from source page (#12962)

2018-11-21 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f6317c9  fix broken links and reorganize build from source page 
(#12962)
f6317c9 is described below

commit f6317c9a92fe2b7dddaecf7891f5ff7f215ae329
Author: Aaron Markham 
AuthorDate: Wed Nov 21 09:02:43 2018 -0800

fix broken links and reorganize build from source page (#12962)
---
 docs/install/build_from_source.md   | 231 +---
 docs/tutorials/unsupervised_learning/gan.md |   7 +-
 2 files changed, 145 insertions(+), 93 deletions(-)

diff --git a/docs/install/build_from_source.md 
b/docs/install/build_from_source.md
index eff..b28fca3 100644
--- a/docs/install/build_from_source.md
+++ b/docs/install/build_from_source.md
@@ -1,93 +1,83 @@
 # Build MXNet from Source
 
-This document explains how to build MXNet from source code. Building MXNet 
from source is a two step process.
-
-1. Build the MXNet shared library, `libmxnet.so`, from [C++ source 
files](#build-the-shared-library)
-2. Install the [language bindings](#installing-mxnet-language-bindings) for 
MXNet. MXNet supports the following languages:
-- Python
-- C++
-- Clojure
-- Julia
-- Perl
-- R
-- Scala
+This document explains how to build MXNet from source code.
+
+
+## Overview
+
+Building from source follows this general two-step flow of building the shared 
library, then installing your preferred language binding. Use the following 
links to jump to the different sections of this guide.
+
+1. Build the MXNet shared library, `libmxnet.so`.
+* [Clone the repository](#clone-the-mxnet-project)
+* [Prerequisites](#prerequisites)
+* [Math library selection](#math-library-selection)
+* [Install GPU software](#install-gpu-software)
+* [Install optional software](#install-optional-software)
+* [Adjust your build configuration](#build-configurations)
+* [Build MXNet](#build-mxnet)
+* [with NCCL](#build-mxnet-with-nccl) (optional)
+* [for C++](#build-mxnet-with-c++) (optional)
+* [Usage Examples](#usage-examples)
+* [systems with GPUs and Intel 
CPUs](#recommended-for-Systems-with-NVIDIA-GPUs-and-Intel-CPUs)
+* [GPUs with non-Intel 
CPUs](#recommended-for-Systems-with-Intel-CPUs)
+* [Intel CPUs](#recommended-for-Systems-with-Intel-CPUs)
+* [non-Intel CPUs](#recommended-for-Systems-with-non-Intel-CPUs)
+2. [Install the language API binding(s)](#installing-mxnet-language-bindings) 
you would like to use for MXNet.
+MXNet's newest and most popular API is Gluon. Gluon is built into the Python 
binding. If Python isn't your preference, you still have more options. MXNet 
supports several other language APIs:
+- [Python (includes Gluon)](../api/python/index.html)
+- [C++](../api/c++/index.html)
+- [Clojure](../api/clojure/index.html)
+- Java (coming soon)
+- [Julia](../api/julia/index.html)
+- [Perl](../api/perl/index.html)
+- [R](../api/r/index.html)
+- [Scala](../api/scala/index.html)
+
+
 
-## Prerequisites
-
-You need C++ build tools and a BLAS library to build the MXNet shared library. 
If you want to run MXNet with GPUs, you will need to install [NVDIA CUDA and 
cuDNN](https://developer.nvidia.com/cuda-downloads) first.
+## Build Instructions by Operating System
 
-You may use [GNU Make](https://www.gnu.org/software/make/) to build the 
library but [cmake](https://cmake.org/) is required when building with MKLDNN
+Detailed instructions are provided per operating system. Each of these guides 
also covers how to install the specific [Language 
Bindings](#installing-mxnet-language-bindings) you require.
+You may jump to those, but it is recommended that you continue reading to 
understand more general "build from source" options.
 
+* [Amazon Linux / CentOS / RHEL](centos_setup.html)
+* [macOS](osx_setup.html)
+* [Raspbian](raspian_setup.html)
+* [TX2](tx2_setup.html)
+* [Ubuntu](ubuntu_setup.html)
+* [Windows](windows_setup.html)
 
-### C++ build tools
 
-1. A C++ compiler that supports C++ 11.
-[G++ (4.8 or later)](https://gcc.gnu.org/gcc-4.8/) or
-[Clang](http://clang.llvm.org/) is required.
+
 
-2. [Git](https://git-scm.com/downloads) for downloading the sources from 
Github repository.
+## Clone the MXNet Project
 
+1. Clone or fork the MXNet project.
+```bash
+git clone --recursive https://github.com/apache/incubator-mxnet mxnet
+cd mxnet
+```
 
+
 
+## Prerequisites
 
-### BLAS library
+The following sections will help you decide which specific prerequisites you 
need to install.
 
+ Math Library Selection
+It is useful to consider your math library selection prior to your other 
prerequisites.
 MXNet relies on the
 [BLAS](https://e

[incubator-mxnet] branch master updated: CMake: Do not touch CMAKE_GENERATOR_TOOLSET (#13321)

2018-11-20 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new afc4703  CMake: Do not touch CMAKE_GENERATOR_TOOLSET (#13321)
afc4703 is described below

commit afc4703306bdb7918227237c7b45ea19a42339d4
Author: Ruslan Baratov 
AuthorDate: Wed Nov 21 02:22:45 2018 +

CMake: Do not touch CMAKE_GENERATOR_TOOLSET (#13321)

From documentation:

  The value of this variable should never be modified by project code.
  ... changing the value has undefined behavior.

* https://cmake.org/cmake/help/latest/variable/CMAKE_GENERATOR_TOOLSET.html
---
 CMakeLists.txt | 1 -
 1 file changed, 1 deletion(-)

diff --git a/CMakeLists.txt b/CMakeLists.txt
index 42f6bff..2b8dda2 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -253,7 +253,6 @@ if(USE_CUDA)
 if(NOT CUDA_TOOLSET)
   set(CUDA_TOOLSET "${CUDA_VERSION_STRING}")
 endif()
-set(CMAKE_GENERATOR_TOOLSET "cuda=${CUDA_TOOLSET},host=x64")
   else()
 set(FIRST_CUDA FALSE)
   endif()



[incubator-mxnet] branch master updated: [Example] Update C++ tutorials (#13316) (#13317)

2018-11-20 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 8e888c1  [Example] Update C++ tutorials (#13316) (#13317)
8e888c1 is described below

commit 8e888c124149158d20a2b65fa2583ded55839087
Author: zhaoyao73 
AuthorDate: Tue Nov 20 21:10:20 2018 -0500

[Example] Update C++ tutorials (#13316) (#13317)

Update C++ tutorials up to date
---
 docs/tutorials/c++/basics.md | 12 ++--
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/docs/tutorials/c++/basics.md b/docs/tutorials/c++/basics.md
index d3231e7..aa73a73 100644
--- a/docs/tutorials/c++/basics.md
+++ b/docs/tutorials/c++/basics.md
@@ -8,9 +8,9 @@ The following contents assume that the working directory is 
`/path/to/mxnet/cpp-
 
 Load Data
 
-Before going into codes, we need to fetch MNIST data. You can either use the 
script `get_mnist.sh`,
+Before going into codes, we need to fetch MNIST data. You can either use the 
script `/path/to/mxnet/cpp-package/example/get_data.sh`,
 or download mnist data by yourself from Lecun's 
[website](http://yann.lecun.com/exdb/mnist/)
-and decompress them into `mnist_data` folder.
+and decompress them into `data/mnist_data` folder.
 
 Except linking the MXNet shared library, the C++ package itself is a 
header-only package,
 which means all you need to do is to include the header files. Among the 
header files,
@@ -36,14 +36,14 @@ The digits in MNIST are 2-dimension arrays, so we should 
set `flat` to true to f
 
 ```cpp
 auto train_iter = MXDataIter("MNISTIter")
-.SetParam("image", "./mnist_data/train-images-idx3-ubyte")
-.SetParam("label", "./mnist_data/train-labels-idx1-ubyte")
+.SetParam("image", "./data/mnist_data/train-images-idx3-ubyte")
+.SetParam("label", "./data/mnist_data/train-labels-idx1-ubyte")
 .SetParam("batch_size", batch_size)
 .SetParam("flat", 1)
 .CreateDataIter();
 auto val_iter = MXDataIter("MNISTIter")
-.SetParam("image", "./mnist_data/t10k-images-idx3-ubyte")
-.SetParam("label", "./mnist_data/t10k-labels-idx1-ubyte")
+.SetParam("image", "./data/mnist_data/t10k-images-idx3-ubyte")
+.SetParam("label", "./data/mnist_data/t10k-labels-idx1-ubyte")
 .SetParam("batch_size", batch_size)
 .SetParam("flat", 1)
 .CreateDataIter();



[incubator-mxnet] branch master updated: [Example]Refactor alexnet cpp example (#13278)

2018-11-20 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 4f481f5  [Example]Refactor alexnet cpp example (#13278)
4f481f5 is described below

commit 4f481f523b9b66b6fcc5d57d2a272ab829c659d2
Author: Jake Lee 
AuthorDate: Tue Nov 20 18:09:02 2018 -0800

[Example]Refactor alexnet cpp example (#13278)

* delete print to make the log more clean

* delete the log user don't need
---
 cpp-package/example/alexnet.cpp | 10 --
 1 file changed, 10 deletions(-)

diff --git a/cpp-package/example/alexnet.cpp b/cpp-package/example/alexnet.cpp
index 3d6e685..a5f4952 100644
--- a/cpp-package/example/alexnet.cpp
+++ b/cpp-package/example/alexnet.cpp
@@ -234,15 +234,6 @@ int main(int argc, char const *argv[]) {
  * initializer to call*/
 xavier(arg.first, &arg.second);
   }
-  /*print out to check the shape of the net*/
-  for (const auto &s : Net.ListArguments()) {
-LG << s;
-const auto &k = args_map[s].GetShape();
-for (const auto &i : k) {
-  std::cout << i << " ";
-}
-std::cout << std::endl;
-  }
 
   /*these binary files should be generated using im2rc tools, which can be 
found
* in mxnet/bin*/
@@ -275,7 +266,6 @@ int main(int argc, char const *argv[]) {
 train_iter.Reset();
 while (train_iter.Next()) {
   auto batch = train_iter.GetDataBatch();
-  LG << train_iter.GetDataBatch().index.size();
   /*use copyto to feed new data and label to the executor*/
   batch.data.CopyTo(&args_map["data"]);
   batch.label.CopyTo(&args_map["label"]);



[incubator-mxnet] branch master updated: modify code for working in gpu context. (#13302)

2018-11-16 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 64657c2  modify code for working in gpu context. (#13302)
64657c2 is described below

commit 64657c2c46f38ac3e87db8a342d3c2bfe7786f28
Author: pilhoon 
AuthorDate: Sat Nov 17 14:36:52 2018 +0900

modify code for working in gpu context. (#13302)
---
 docs/tutorials/python/predict_image.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/tutorials/python/predict_image.md 
b/docs/tutorials/python/predict_image.md
index a9a0d29..8be98d9 100644
--- a/docs/tutorials/python/predict_image.md
+++ b/docs/tutorials/python/predict_image.md
@@ -69,6 +69,7 @@ def get_image(url, show=False):
 img = mx.image.imresize(img, 224, 224) # resize
 img = img.transpose((2, 0, 1)) # Channel first
 img = img.expand_dims(axis=0) # batchify
+img = img.astype('float32') # for gpu context
 return img
 
 def predict(url):



[incubator-mxnet] branch master updated: fix file lock issue (#13296)

2018-11-16 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 98830d5  fix file lock issue (#13296)
98830d5 is described below

commit 98830d53a0decc6f3998b27e8aa6688abf3bd749
Author: Ankit Khedia <36249596+ankkhe...@users.noreply.github.com>
AuthorDate: Fri Nov 16 21:29:07 2018 -0800

fix file lock issue (#13296)
---
 Makefile | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Makefile b/Makefile
index b123117..ad7f0ff 100644
--- a/Makefile
+++ b/Makefile
@@ -597,7 +597,7 @@ rpkg:
 
 rpkgtest:
Rscript -e 
'require(testthat);res<-test_dir("R-package/tests/testthat");if(!testthat:::all_passed(res)){stop("Test
 failures", call. = FALSE)}'
-   Rscript -e 
'res<-covr:::package_coverage("R-package");fileConn<-file("r-package_coverage.json");writeLines(covr:::to_codecov(res),
 fileConn);close(fileConn)'
+   Rscript -e 
'res<-covr:::package_coverage("R-package");fileConn<-file(paste("r-package_coverage_",toString(runif(1)),".json"));writeLines(covr:::to_codecov(res),
 fileConn);close(fileConn)'
 
 scalaclean:
(cd $(ROOTDIR)/scala-package; \



[incubator-mxnet] branch master updated: [MXNET-1213] add Cent OS build for Scala (#13279)

2018-11-16 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f87db9e  [MXNET-1213] add Cent OS build for Scala (#13279)
f87db9e is described below

commit f87db9eed45c208f9c13098205193c4cd41d04ee
Author: Lanking 
AuthorDate: Fri Nov 16 21:10:38 2018 -0800

[MXNET-1213] add Cent OS build for Scala (#13279)

* add centos build for Scala

* migrate the build portion to docker

* update build script and chmod +x

* address Jenkins change

* allow CentOS provide all depdencies
---
 Jenkinsfile| 13 -
 ci/docker/Dockerfile.build.centos7_cpu |  2 ++
 .../centos7_scala.sh}  | 34 --
 ci/docker/install/ubuntu_scala.sh  |  6 
 ci/docker/runtime_functions.sh |  7 +
 5 files changed, 33 insertions(+), 29 deletions(-)

diff --git a/Jenkinsfile b/Jenkinsfile
index 3f72843..fca8539 100644
--- a/Jenkinsfile
+++ b/Jenkinsfile
@@ -152,7 +152,7 @@ core_logic: {
   timeout(time: max_time, unit: 'MINUTES') {
 utils.init_git()
 utils.docker_run('centos7_cpu', 'build_centos7_cpu', false)
-utils.pack_lib('centos7_cpu', mx_lib, true)
+utils.pack_lib('centos7_cpu', mx_dist_lib, true)
   }
 }
   }
@@ -698,6 +698,17 @@ core_logic: {
 }
   }
 },
+'Scala: CentOS CPU': {
+  node(NODE_LINUX_CPU) {
+ws('workspace/ut-scala-centos7-cpu') {
+  timeout(time: max_time, unit: 'MINUTES') {
+utils.unpack_and_init('centos7_cpu', mx_dist_lib, true)
+utils.docker_run('centos7_cpu', 'unittest_centos7_cpu_scala', 
false)
+utils.publish_test_coverage()
+  }
+}
+  }
+},
 'Clojure: CPU': {
   node(NODE_LINUX_CPU) {
 ws('workspace/ut-clojure-cpu') {
diff --git a/ci/docker/Dockerfile.build.centos7_cpu 
b/ci/docker/Dockerfile.build.centos7_cpu
index 076ef5d..e2802aa 100644
--- a/ci/docker/Dockerfile.build.centos7_cpu
+++ b/ci/docker/Dockerfile.build.centos7_cpu
@@ -28,6 +28,8 @@ COPY install/centos7_ccache.sh /work/
 RUN /work/centos7_ccache.sh
 COPY install/centos7_python.sh /work/
 RUN /work/centos7_python.sh
+COPY install/centos7_scala.sh /work/
+RUN /work/centos7_scala.sh
 COPY install/ubuntu_mklml.sh /work/
 RUN /work/ubuntu_mklml.sh
 
diff --git a/ci/docker/Dockerfile.build.centos7_cpu 
b/ci/docker/install/centos7_scala.sh
old mode 100644
new mode 100755
similarity index 59%
copy from ci/docker/Dockerfile.build.centos7_cpu
copy to ci/docker/install/centos7_scala.sh
index 076ef5d..ea46de9
--- a/ci/docker/Dockerfile.build.centos7_cpu
+++ b/ci/docker/install/centos7_scala.sh
@@ -1,4 +1,5 @@
-# -*- mode: dockerfile -*-
+#!/usr/bin/env bash
+
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -15,27 +16,16 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-#
-# Dockerfile to build and run MXNet on CentOS 7 for CPU
-
-FROM centos:7
-
-WORKDIR /work/deps
-
-COPY install/centos7_core.sh /work/
-RUN /work/centos7_core.sh
-COPY install/centos7_ccache.sh /work/
-RUN /work/centos7_ccache.sh
-COPY install/centos7_python.sh /work/
-RUN /work/centos7_python.sh
-COPY install/ubuntu_mklml.sh /work/
-RUN /work/ubuntu_mklml.sh
 
-ARG USER_ID=0
-COPY install/centos7_adduser.sh /work/
-RUN /work/centos7_adduser.sh 
+# build and install are separated so changes to build don't invalidate
+# the whole docker cache for the image
 
-ENV PYTHONPATH=./python/
-WORKDIR /work/mxnet
+set -ex
 
-COPY runtime_functions.sh /work/
+yum install -y java-1.8.0-openjdk-devel
+# Build from source with Maven
+wget 
http://www.eu.apache.org/dist/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
+tar xzf apache-maven-3.3.9-bin.tar.gz
+mkdir /usr/local/maven
+mv apache-maven-3.3.9/ /usr/local/maven/
+alternatives --install /usr/bin/mvn mvn 
/usr/local/maven/apache-maven-3.3.9/bin/mvn 1
diff --git a/ci/docker/install/ubuntu_scala.sh 
b/ci/docker/install/ubuntu_scala.sh
index c71c751..6ecb8d8 100755
--- a/ci/docker/install/ubuntu_scala.sh
+++ b/ci/docker/install/ubuntu_scala.sh
@@ -30,13 +30,7 @@ apt-get update || true
 apt-get install -y openjdk-8-jdk
 apt-get install -y openjdk-8-jre
 
-echo "deb https://dl.bintray.com/sbt/debian /" | tee -a 
/etc/apt/sources.list.d/sbt.list
-# ubuntu keyserver is very flaky
-#apt-k

[incubator-mxnet] branch master updated (96a2a09 -> 0e9a1ff)

2018-11-16 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 96a2a09  [Example]update NER example readme on module prediction 
(#13184)
 add 0e9a1ff  [MXNET-1198] MXNet Java API (#13162)

No new revisions were added by this update.

Summary of changes:
 docs/tutorials/index.md|   7 +
 docs/tutorials/java/mxnet_java_on_intellij.md  | 171 +
 docs/tutorials/java/ssd_inference.md   | 186 ++
 scala-package/.gitignore   |   3 +
 scala-package/core/pom.xml |  15 +-
 .../scala/org/apache/mxnet/javaapi/Context.scala   |  15 +-
 .../main/scala/org/apache/mxnet/javaapi/IO.scala   |  11 +-
 .../scala/org/apache/mxnet/javaapi/NDArray.scala   | 397 +
 .../scala/org/apache/mxnet/javaapi/Shape.scala |   2 +-
 .../java/org/apache/mxnet/javaapi/NDArrayTest.java |  85 +
 .../mxnet/javaapi/ResourceScopeTestSuite.java  | 110 ++
 scala-package/examples/pom.xml |  10 +-
 .../run_java_inference_bm.sh}  |  32 +-
 .../infer/objectdetector/run_ssd_example.sh|  14 +-
 .../objectdetector/run_ssd_java_example.sh}|  36 +-
 .../predictor/run_predictor_java_example.sh}   |  35 +-
 .../javaapi/benchmark/InferBase.java}  |  40 +--
 .../javaapi/benchmark/JavaBenchmark.java   | 129 +++
 .../benchmark/ObjectDetectionBenchmark.java|  64 
 .../javaapi}/infer/objectdetector/README.md|  35 +-
 .../infer/objectdetector/SSDClassifierExample.java | 199 +++
 .../javaapi/infer/predictor/PredictorExample.java  | 200 +++
 .../javaapi/infer/predictor/README.md  |  61 
 .../mxnetexamples/infer/objectdetector/README.md   |  20 +-
 .../objectdetector/SSDClassifierExample.scala  |   4 +-
 scala-package/infer/pom.xml|  10 +-
 .../mxnet/infer/javaapi/ObjectDetector.scala   | 128 +++
 .../infer/javaapi/ObjectDetectorOutput.scala}  |  19 +-
 .../org/apache/mxnet/infer/javaapi/Predictor.scala |  99 +
 .../scala/org/apache/mxnet/APIDocGenerator.scala   |  84 +
 .../scala/org/apache/mxnet/GeneratorBase.scala |  20 +-
 .../apache/mxnet/javaapi/JavaNDArrayMacro.scala| 125 +++
 .../org/apache/mxnet/utils/CToScalaUtils.scala |  21 +-
 .../test/scala/org/apache/mxnet/MacrosSuite.scala  |   2 +-
 scala-package/mxnet-demo/{ => java-demo}/Makefile  |   6 +-
 scala-package/mxnet-demo/{ => java-demo}/README.md |  39 +-
 .../{bin/demo.sh => java-demo/bin/java_sample.sh}  |   4 +-
 .../{bin/demo.sh => java-demo/bin/run_od.sh}   |   5 +-
 scala-package/mxnet-demo/java-demo/pom.xml |  25 ++
 .../src/main/java/sample/HelloWorld.java}  |  15 +-
 .../src/main/java/sample/ObjectDetection.java  | 101 ++
 scala-package/mxnet-demo/{ => scala-demo}/Makefile |   2 +-
 .../mxnet-demo/{ => scala-demo}/README.md  |  12 +-
 .../mxnet-demo/{ => scala-demo}/bin/demo.sh|   0
 .../mxnet-demo/{ => scala-demo}/bin/run_im.sh  |   0
 scala-package/mxnet-demo/{ => scala-demo}/pom.xml  |   0
 .../src/main/scala/sample/HelloWorld.scala |   0
 .../scala/sample/ImageClassificationExample.scala  |   0
 scala-package/pom.xml  |   4 +-
 tests/tutorials/test_sanity_tutorials.py   |   5 +-
 50 files changed, 2387 insertions(+), 220 deletions(-)
 create mode 100644 docs/tutorials/java/mxnet_java_on_intellij.md
 create mode 100644 docs/tutorials/java/ssd_inference.md
 create mode 100644 
scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
 create mode 100644 
scala-package/core/src/test/java/org/apache/mxnet/javaapi/NDArrayTest.java
 create mode 100644 
scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
 copy scala-package/examples/scripts/{neuralstyle_end2end/run_test_end2end.sh 
=> benchmark/run_java_inference_bm.sh} (67%)
 copy scala-package/examples/scripts/{benchmark/run_image_inference_bm.sh => 
infer/objectdetector/run_ssd_java_example.sh} (71%)
 copy scala-package/examples/scripts/{benchmark/run_image_inference_bm.sh => 
infer/predictor/run_predictor_java_example.sh} (69%)
 copy scala-package/{core/src/test/scala/org/apache/mxnet/TestUtil.scala => 
examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/InferBase.java}
 (56%)
 create mode 100644 
scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/JavaBenchmark.java
 create mode 100644 
scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/ObjectDetectionBenchmark.java
 copy scala-package/examples/src/main/{scala/org/apache/mxnetexamples => 
java/org/apache/mxnetexamples/javaapi}/infer/objectdetector/README.md (65%)
 create mode 100644

[incubator-mxnet] branch java-api updated (bb7bbaf -> ab8772c)

2018-11-16 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from bb7bbaf  [MXNET-1182] Predictor example (#13237)
 add ab8772c  Reducing the length of setup tutorial (#13306)

No new revisions were added by this update.

Summary of changes:
 docs/tutorials/java/mxnet_java_on_intellij.md | 78 +--
 1 file changed, 13 insertions(+), 65 deletions(-)



[incubator-mxnet] branch master updated: fix train mnist for inception-bn and resnet (#13239)

2018-11-15 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 1ef83c9  fix train mnist for inception-bn and resnet (#13239)
1ef83c9 is described below

commit 1ef83c953ff58bf8c444ad84926a1e84745aa00f
Author: Lai Wei 
AuthorDate: Thu Nov 15 11:46:15 2018 -0800

fix train mnist for inception-bn and resnet (#13239)
---
 example/image-classification/train_mnist.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/example/image-classification/train_mnist.py 
b/example/image-classification/train_mnist.py
index 2bc4289..17a5a37 100644
--- a/example/image-classification/train_mnist.py
+++ b/example/image-classification/train_mnist.py
@@ -72,6 +72,7 @@ if __name__ == '__main__':
 help='the number of training examples')
 
 parser.add_argument('--add_stn',  action="store_true", default=False, 
help='Add Spatial Transformer Network Layer (lenet only)')
+parser.add_argument('--image_shape', default='1, 28, 28', help='shape of 
training images')
 
 fit.add_fit_args(parser)
 parser.set_defaults(
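As a side note, a small self-contained sketch (assumed, not taken from the example script) of how the new `--image_shape` string would typically be turned into an integer tuple before building data iterators:

```python
# Hedged illustration only: parse the comma-separated --image_shape string
# (e.g. '1, 28, 28') into a tuple of ints.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--image_shape', default='1, 28, 28',
                    help='shape of training images')
args = parser.parse_args(['--image_shape', '3, 28, 28'])

image_shape = tuple(int(dim) for dim in args.image_shape.split(','))
print(image_shape)  # (3, 28, 28)
```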



[incubator-mxnet] branch java-api updated (6f940cf -> 218a7a9)

2018-11-14 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 6f940cf  Java Benchmark failure (#13258)
 add 218a7a9  Addressing PR feedback for merging Java API into master 
(#13277)

No new revisions were added by this update.

Summary of changes:
 .../scala/org/apache/mxnet/javaapi/Context.scala   |   2 +-
 .../main/scala/org/apache/mxnet/javaapi/IO.scala   |   2 +-
 .../scala/org/apache/mxnet/javaapi/NDArray.scala   |   2 +-
 .../scala/org/apache/mxnet/javaapi/Shape.scala |   2 +-
 .../infer/objectdetector/run_ssd_example.sh|   2 +-
 .../javaapi/infer/objectdetector/README.md |  31 +-
 .../infer/objectdetector/SSDClassifierExample.java | 316 ++---
 .../mxnetexamples/infer/objectdetector/README.md   |  16 +-
 .../objectdetector/SSDClassifierExample.scala  |   4 +-
 .../mxnet/infer/javaapi/ObjectDetector.scala   |   2 +-
 .../org/apache/mxnet/infer/javaapi/Predictor.scala |   2 +-
 11 files changed, 175 insertions(+), 206 deletions(-)



[incubator-mxnet] branch java-api updated (6b39c6b -> 6f940cf)

2018-11-14 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 6b39c6b  Fixed missing break statement (#13257)
 add 6f940cf  Java Benchmark failure (#13258)

No new revisions were added by this update.

Summary of changes:
 scala-package/.gitignore  |  3 +++
 .../apache/mxnetexamples/javaapi/benchmark/JavaBenchmark.java | 11 ++-
 2 files changed, 5 insertions(+), 9 deletions(-)



[incubator-mxnet] branch master updated: Fix scaladoc build errors (#13189)

2018-11-14 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 8cb73ef  Fix scaladoc build errors (#13189)
8cb73ef is described below

commit 8cb73efb521f3653bd262022c5840afb854b98e2
Author: Zach Kimberg 
AuthorDate: Wed Nov 14 15:01:05 2018 -0800

Fix scaladoc build errors (#13189)

* Fix scaladoc errors from missing classpath

Remove duplicate scalastyle plugin

* Fix scaladoc warnings

Also enable and fix all feature and deprecation warnings
---
 docs/mxdoc.py| 9 +++--
 scala-package/core/pom.xml   | 4 
 scala-package/core/src/main/scala/org/apache/mxnet/Context.scala | 2 ++
 .../core/src/main/scala/org/apache/mxnet/Executor.scala  | 5 -
 scala-package/core/src/main/scala/org/apache/mxnet/IO.scala  | 7 ---
 scala-package/core/src/main/scala/org/apache/mxnet/KVStore.scala | 2 +-
 scala-package/core/src/main/scala/org/apache/mxnet/NDArray.scala | 1 +
 .../core/src/main/scala/org/apache/mxnet/Optimizer.scala | 2 +-
 .../core/src/main/scala/org/apache/mxnet/ResourceScope.scala | 6 +++---
 scala-package/core/src/main/scala/org/apache/mxnet/Symbol.scala  | 1 +
 .../core/src/main/scala/org/apache/mxnet/Visualization.scala | 1 +
 .../core/src/main/scala/org/apache/mxnet/io/MXDataIter.scala | 4 ++--
 .../core/src/main/scala/org/apache/mxnet/io/NDArrayIter.scala| 4 ++--
 .../src/main/scala/org/apache/mxnet/io/PrefetchingIter.scala | 4 ++--
 .../core/src/main/scala/org/apache/mxnet/io/ResizeIter.scala | 4 ++--
 .../core/src/main/scala/org/apache/mxnet/javaapi/Context.scala   | 1 +
 .../core/src/main/scala/org/apache/mxnet/javaapi/IO.scala| 2 ++
 .../core/src/main/scala/org/apache/mxnet/javaapi/Shape.scala | 1 +
 .../core/src/main/scala/org/apache/mxnet/module/BaseModule.scala | 6 +++---
 .../src/main/scala/org/apache/mxnet/module/BucketingModule.scala | 4 ++--
 .../org/apache/mxnet/module/DataParallelExecutorGroup.scala  | 4 ++--
 .../core/src/main/scala/org/apache/mxnet/module/Module.scala | 4 ++--
 .../main/scala/org/apache/mxnet/module/SequentialModule.scala| 4 ++--
 23 files changed, 44 insertions(+), 38 deletions(-)

diff --git a/docs/mxdoc.py b/docs/mxdoc.py
index 8570cae..8b26c89 100644
--- a/docs/mxdoc.py
+++ b/docs/mxdoc.py
@@ -110,8 +110,13 @@ def build_scala(app):
 def build_scala_docs(app):
 """build scala doc and then move the outdir"""
 scala_path = app.builder.srcdir + '/../scala-package'
-# scaldoc fails on some apis, so exit 0 to pass the check
-_run_cmd('cd ' + scala_path + '; scaladoc `find . -type f -name "*.scala" 
| egrep \"\/core|\/infer\" | egrep -v \"Suite|javaapi\"`; exit 0')
+scala_doc_sources = 'find . -type f -name "*.scala" | egrep 
\"\.\/core|\.\/infer\" | egrep -v \"Suite\"'
+scala_doc_classpath = ':'.join([
+'`find native -name "*.jar" | grep "target/lib/" | tr "\\n" ":" `',
+'`find macros -name "*-SNAPSHOT.jar" | tr "\\n" ":" `'
+])
+_run_cmd('cd {}; scaladoc `{}` -classpath {} -feature -deprecation'
+ .format(scala_path, scala_doc_sources, scala_doc_classpath))
 dest_path = app.builder.outdir + '/api/scala/docs'
 _run_cmd('rm -rf ' + dest_path)
 _run_cmd('mkdir -p ' + dest_path)
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index e93169f..56ff4db 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -93,10 +93,6 @@
 org.scalastyle
 scalastyle-maven-plugin
   
-  
-org.scalastyle
-scalastyle-maven-plugin
-  
 
   
   
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/Context.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/Context.scala
index beeb430..ab44f43 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/Context.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/Context.scala
@@ -17,6 +17,8 @@
 
 package org.apache.mxnet
 
+import scala.language.implicitConversions
+
 object Context {
   val devtype2str = Map(1 -> "cpu", 2 -> "gpu", 3 -> "cpu_pinned")
   val devstr2type = Map("cpu" -> 1, "gpu" -> 2, "cpu_pinned" -> 3)
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/Executor.scala
index 19fb6fe..b342a96 100644
--- a/scala-package/co

[incubator-mxnet] branch java-api updated (efd925e -> 6b39c6b)

2018-11-13 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from efd925e  Merge branch 'master' into java-api
 add 6b39c6b  Fixed missing break statement (#13257)

No new revisions were added by this update.

Summary of changes:
 .../java/org/apache/mxnetexamples/javaapi/benchmark/JavaBenchmark.java   | 1 +
 1 file changed, 1 insertion(+)



[incubator-mxnet] branch master updated: Addressed sphinx build issue (#13246)

2018-11-13 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f79bb18  Addressed sphinx build issue (#13246)
f79bb18 is described below

commit f79bb18a060f796ebe3cb46d0c607b6576e77cc0
Author: vdantu <36211508+vda...@users.noreply.github.com>
AuthorDate: Tue Nov 13 15:46:14 2018 -0800

Addressed sphinx build issue (#13246)
---
 docs/_static/js/auto_module_index.js | 16 +---
 docs/api/python/ndarray/ndarray.md   |  7 ++-
 docs/api/python/ndarray/random.md|  1 +
 3 files changed, 16 insertions(+), 8 deletions(-)

diff --git a/docs/_static/js/auto_module_index.js 
b/docs/_static/js/auto_module_index.js
index 7f4e185..8df9a20 100644
--- a/docs/_static/js/auto_module_index.js
+++ b/docs/_static/js/auto_module_index.js
@@ -10,15 +10,17 @@ function auto_index(module) {
 var html = "";
 
 for (var i = 0; i < targets.length; ++i) {
-  var id = $(targets[i]).attr('id');
-  // remove 'mxnet.' prefix to make menus shorter
-  var id_simple = id.replace(/^mxnet\./, '');
-  html += "" + id_simple + "";
+   var id = $(targets[i]).attr('id');
+   if ( id ) {
+   // remove 'mxnet.' prefix to make menus shorter
+   var id_simple = id.replace(/^mxnet\./, '');
+   html += "" + id_simple + "";
+   }
 }
 
 html += "";
 li_node.append(html);
   });
-}
\ No newline at end of file
+}
diff --git a/docs/api/python/ndarray/ndarray.md 
b/docs/api/python/ndarray/ndarray.md
index 01a1544..37965e9 100644
--- a/docs/api/python/ndarray/ndarray.md
+++ b/docs/api/python/ndarray/ndarray.md
@@ -706,9 +706,14 @@ The `ndarray` package provides several classes:
 :members:
 :imported-members:
 :special-members:
-:exclude-members: CachedOp, NDArray
+:exclude-members: CachedOp, NDArray, save, load
+
+.. automodule:: mxnet.ndarray
+:noindex:
+:members: save, load
 
 .. automodule:: mxnet.random
+:noindex:
 :members:
 
 ```
diff --git a/docs/api/python/ndarray/random.md 
b/docs/api/python/ndarray/random.md
index 4341a3c..3ea611f 100644
--- a/docs/api/python/ndarray/random.md
+++ b/docs/api/python/ndarray/random.md
@@ -51,6 +51,7 @@ In the rest of this document, we list routines provided by 
the `ndarray.random`
 
 .. automodule:: mxnet.random
 :members:
+:noindex:
 
 ```
 



[incubator-mxnet] branch java-api updated (fb4cad9 -> efd925e)

2018-11-13 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from fb4cad9  [MXNET-918] [Introduce Random module / Refact code generation 
(#13038)][Cherry pick]  (#13242)
 add c90d16c  Fix mismatch shapes (#12793)
 add 3eff8e8  Make Gluon download function to be atomic (#12572)
 add 5c74e3a  Re-enables test_dropout (#12717)
 add efa7d3a  [MXNET -1004] Poisson NegativeLog Likelihood loss (#12697)
 add c9a9db6  Update osx.mk - Added "apple" to USE_BLAS comment (#12819)
 add 698bbec  [MXNet-1002] Add GluonCV and NLP toolkits, Keras, and 
developer wiki to navigation (#12704)
 add b89a36d  fixed symbols naming in RNNCell, LSTMCell, GRUCell (#12794)
 add 5961dce  simplify mac mkldnn build (#12724)
 add 13030b6  Change the way NDArrayIter handle the last batch (#12545)
 add 815f36c  [MXNET-707] Add unit test for mxnet to coreml converter 
(#11952)
 add d096aa5  Add embedding to print_summary (#12796)
 add 527e6a0  Scala Docs - Replace old Symbol api usages (#12759)
 add eee72d9  [MXNET-892] ONNX export/import: DepthToSpace, SpaceToDepth 
operators (#12731)
 add 3b5b2b2  R install instructions update for macOS (#12832)
 add fe2c4d8  Fixed __setattr__ method of _MXClassPropertyMetaClass (#12811)
 add 5a52374  Fixed regex for matching platform type in Scala Benchmark 
scripts (#12826)
 add 8271005  Added context object to run TestCharRnn example (#12841)
 add 5a680fc  [MXNET-703] Show perf info for TensorRT during tests (#12656)
 add 89eb24b  Update Operator Implementation Tutorial (#12230)
 add 7463810  Fix broken links (#12856)
 add 1ebbf94  Fix Flaky Topk (#12798)
 add 775870f  Add Psroipooling CPU implementation (#12738)
 add 6376c86  ONNX export: Fully connected operator w/o bias, ReduceSum, 
Square (#12646)
 add 97e86ff  Undefined name: load_model() --> utils.load_model() (#12867)
 add 673e31f  ONNX export/import: Selu (#12785)
 add 2e04aab  Sparse support for logic ops (#12860)
 add c5a7331  add a tutorial for the subgraph API. (#12698)
 add 441fdb7  MKL-DNN Quantization Examples and README (#12808)
 add 42e7110  [MXNET-1033] Fix a bug in MultiboxTarget GPU implementation 
(#12840)
 add 3154ec3  [MXNET-1107] Fix CPUPinned unexpected behaviour (#12031)
 add 137b6f5  NativeResource Management in Scala (#12647)
 add 76d5197  add/update infer_range docs (#12879)
 add daada21  Fix __all__ in optimizer/optimizer.py (#12886)
 add 9c2810e  Add index_copy() operator (#12810)
 add 0ba259f  sparse support for take(csr, axis=0)  (#12889)
 add 524d01f  Add more models to benchmark_score (#12780)
 add d8c7375  [MXNET-1025] Add Jetpack 3.3 support to Jetson (#12735)
 add 58f4117  Fix Batch input issue with Scala Benchmark (#12848)
 add d3d343c  fix type inference in index_copy. (#12890)
 add 0137483  Extending the DCGAN example implemented by gluon API to 
provide a more straight-forward evaluation on the generated image (#12790)
 add d1234a4  [MXNET-674] Speed up GPU builds in CI (#12782)
 add 5b86701  [MXNET-793] ★ Virtualized testing in CI with QEMU ★ (#12094)
 add 3c81b3f  [MXNET-1017] Updating the readme file for cpp-package and 
adding readme file for example directory. (#12773)
 add be9ca1b  Fail the broken link job when broken links are found (#12905)
 add fce5154  Fix typo in formula in docstring for GRU cell and layer and 
add clarification to description (gluon.rnn) (#12896)
 add 38e32bd  fix the paths issue for downloading script (#12913)
 add 0874677  Ignore generated scala files. (#12928)
 add 6b4df85  use ResourceScope in Model/Trainer/FeedForward.scala (#12882)
 add 7d0f7d6  Disabled flaky test: 
test_gluon_gpu.test_slice_batchnorm_reshape_batchnorm (#12768)
 add af55104  Fix the operator API documentation (#12942)
 add 57176cd  fix indpt[0] for take(csr) (#12927)
 add 1b96fc9  getnnz operator  for CSR matrix (#12908)
 add 50028e9  fix broken docs (#12871)
 add 96df2c5  Add bytearray support back to imdecode (#12855, #12868) 
(#12912)
 add d93467e  Update tree lstm example (#12960)
 add a39152b  Update bilstm integer array sorting example (#12929)
 add ffe551e  Fix the bug of assigning large integer to NDArray (#12921)
 add fef9b5c  Refactor mkldnn test files (#12410)
 add ee5f699  CudnnFind() usage improvements (#12804)
 add ffeaf31  fix mac r install and windows python build from source docs 
(#12919)
 add 2f6d224  enable batchnorm unit tests (#12986)
 add 148819b  Update CONTRIBUTORS.md (#12996)
 add 1555735  fix Sphinx errors for tutorials and install ToCs (#12945)
 add 1f971c2  [MXNET -1030] Cosine Embedding Loss (#12750)
 add a362df1  [MXNET-1173] Debug operators - isfinite, isinf and isnan 
(#12967)
   

[incubator-mxnet] branch master updated (ea6ee0d -> 7baad6f)

2018-11-13 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from ea6ee0d  Add Java API docs generation (#13071)
 add 7baad6f  Fix Sphinx error in ONNX file (#13251)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/contrib/onnx/onnx2mx/import_model.py | 10 --
 1 file changed, 4 insertions(+), 6 deletions(-)



[incubator-mxnet] branch master updated: update log4j version of Scala package (#13131)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new e1f221b  update log4j version of Scala package (#13131)
e1f221b is described below

commit e1f221bbd430f4ea9d2d9521212d18becd6675e7
Author: Lanking 
AuthorDate: Mon Nov 12 16:03:29 2018 -0800

update log4j version of Scala package (#13131)
---
 scala-package/pom.xml | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index 34a3f60..be28f0f 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -332,9 +332,9 @@
   1.10
 
 
-  log4j
-  log4j
-  1.2.17
+  org.apache.logging.log4j
+  log4j-core
+  2.11.1
   provided
 
 



[incubator-mxnet] branch java-api updated (3664a7c -> 1bb5b7f)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 3664a7c  [MXNET-1202] Change Builder class into a better way (#13159)
 add 1bb5b7f  [MXNET-1041] Add Java benchmark (#13095)

No new revisions were added by this update.

Summary of changes:
 .../run_java_inference_bm.sh}  |  32 +++--
 .../infer/objectdetector/run_ssd_java_example.sh   |   2 +-
 .../javaapi/benchmark/InferBase.java}  |  33 +++--
 .../javaapi/benchmark/JavaBenchmark.java   | 135 +
 .../benchmark/ObjectDetectionBenchmark.java|  64 ++
 .../infer}/objectdetector/README.md|   0
 .../objectdetector/SSDClassifierExample.java   |   2 +-
 .../mxnet/infer/javaapi/ObjectDetector.scala   |  10 +-
 8 files changed, 243 insertions(+), 35 deletions(-)
 copy scala-package/examples/scripts/{neuralstyle_end2end/run_test_end2end.sh 
=> benchmark/run_java_inference_bm.sh} (67%)
 copy 
scala-package/{spark/src/main/scala/org/apache/mxnet/spark/MXNDArray.scala => 
examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/InferBase.java}
 (56%)
 create mode 100644 
scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/JavaBenchmark.java
 create mode 100644 
scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/benchmark/ObjectDetectionBenchmark.java
 rename 
scala-package/examples/src/main/java/org/apache/mxnetexamples/{infer/javapi => 
javaapi/infer}/objectdetector/README.md (100%)
 rename 
scala-package/examples/src/main/java/org/apache/mxnetexamples/{infer/javapi => 
javaapi/infer}/objectdetector/SSDClassifierExample.java (99%)



[incubator-mxnet] branch java-api updated (149ea17 -> 3664a7c)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 149ea17  [MXNET-1187] Added Tutorial for Java under 
mxnet.io/docs/tutorials (#13183)
 add 3664a7c  [MXNET-1202] Change Builder class into a better way (#13159)

No new revisions were added by this update.

Summary of changes:
 .../scala/org/apache/mxnet/javaapi/NDArray.scala   | 14 -
 .../java/org/apache/mxnet/javaapi/NDArrayTest.java |  6 +-
 .../scala/org/apache/mxnet/APIDocGenerator.scala   | 56 +
 .../apache/mxnet/javaapi/JavaNDArrayMacro.scala| 73 +++---
 4 files changed, 82 insertions(+), 67 deletions(-)



[incubator-mxnet] tag 1.3.1 deleted (was c1327f3)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to tag 1.3.1
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


*** WARNING: tag 1.3.1 was deleted! ***

 was c1327f3  news, readme update for v1.3.1 release (#13225)

The revisions that were on this tag are still contained in
other references; therefore, this change does not discard any commits
from the repository.



[incubator-mxnet] tag 1.3.1.rc0 created (now c1327f3)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to tag 1.3.1.rc0
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


  at c1327f3  (commit)
No new revisions were added by this update.



[incubator-mxnet] tag 1.3.1 created (now c1327f3)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to tag 1.3.1
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


  at c1327f3  (commit)
No new revisions were added by this update.



svn commit: r30861 - in /dev/incubator/mxnet/1.3.1.rc0: ./ apache-mxnet-src-1.3.1.rc0-incubating.tar.gz apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.asc apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.sh

2018-11-12 Thread nswamy
Author: nswamy
Date: Mon Nov 12 22:46:28 2018
New Revision: 30861

Log:
Add mxnet-1.3.1.rc0

Added:
dev/incubator/mxnet/1.3.1.rc0/
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz  
 (with props)

dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.asc

dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.sha512

Added: 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz
==
Binary file - no diff available.

Propchange: 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz
--
svn:mime-type = application/octet-stream

Added: 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.asc
==
--- 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.asc 
(added)
+++ 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.asc 
Mon Nov 12 22:46:28 2018
@@ -0,0 +1,16 @@
+-BEGIN PGP SIGNATURE-
+
+iQIzBAABCAAdFiEEqj68w+Zadorj0qZLjvR7hyDoxUkFAlvqATwACgkQjvR7hyDo
+xUkjIA//Q1cjC0mAlGOAVXQiW8BjpWgJAPCnuFrTGiu3vQUg/ajYEUwRPfesfcmZ
+UfFlOhh9g4v0ZLjb8HevgzgM668sDqbZvhM7Ti5KI5ich9WYVSscsNGnn1O20X6C
+IahSGA4Ly0Ks5L6GunqiGAtsqzMp4Hf87T1P70s49PKPnqLptK5shu/uWn/E8hUg
+AvIKBKaM4J2A/O/kGD2U9HxOHFDhjEEWXOfEletZ7SiNuJXXMqk5OT2fmnLv2/oj
+AdWJW2ps2uFrwjNuJ9jW4uuOZMNlTaagG7KX3IuoPxY7iXfYqDxBd3BRteiuy+3S
+tT+w709qMFRuhvW7EInHRspTTDQBzIGNqKEzdrEhRYVM8BlW9I2/kQHROzuEIJP+
+U5ivkefBDcN/Kim/b+Z7VJY214sbzP0p4x6ULXGDC7ZC+nMItxCyGa3xgyDhHb+9
+jmYz+ad+fHwsjhcF/LUd9v9eGy1JgoMIwqh/TgKkChlmqEVYN+fAXz+V1sz71lQG
+ALsSj9N6NIi3OLzdegxJCjHEEIYFBIy8OQ17cBBolgx7u2zFD+wOx03O0OSBwgeO
+22RZXZ/N31l6lXUs3TkaRmZ3UrMmax7Pt+yLI2ieFKJK7Flh+zAAqSporsdYE9TQ
+Ppa2Cud4KiAUeIkFfyVwOSi6z+XS7mU3gmt93yq1+esJ4cvkxlk=
+=VMny
+-END PGP SIGNATURE-

Added: 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.sha512
==
--- 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.sha512
 (added)
+++ 
dev/incubator/mxnet/1.3.1.rc0/apache-mxnet-src-1.3.1.rc0-incubating.tar.gz.sha512
 Mon Nov 12 22:46:28 2018
@@ -0,0 +1 @@
+1b581839e0924e98b4215dee0a8553c2f6bce6ca1d00901b4df79c15bb9df0cd6700608b03f2a3e53ac4f7445416f2b62bde66c97434d1656574e4e68926a6a6
  apache-mxnet-src-1.3.1.rc0-incubating.tar.gz




[incubator-mxnet] branch master updated (6c82829 -> 3c5fa16)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 6c82829  remove unused variable rotateM_ (#10803)
 add 3c5fa16  Revert "Sphinx failure fixes" (#13230)

No new revisions were added by this update.

Summary of changes:
 docs/api/python/ndarray/ndarray.md | 2 --
 python/mxnet/ndarray/utils.py  | 4 ++--
 2 files changed, 2 insertions(+), 4 deletions(-)



[incubator-mxnet] branch v1.3.x updated: news, readme update for v1.3.1 release (#13225)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.3.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.3.x by this push:
 new c1327f3  news, readme update for v1.3.1 release (#13225)
c1327f3 is described below

commit c1327f317a7f371a9803a3f27be6ff90dc6af707
Author: Anton Chernov 
AuthorDate: Mon Nov 12 23:03:12 2018 +0100

news, readme update for v1.3.1 release (#13225)

* news, readme update for v1.3.1 release

* Added release notes
---
 NEWS.md| 97 ++
 R-package/DESCRIPTION  |  2 +-
 contrib/clojure-package/README.md  | 10 +--
 .../examples/cnn-text-classification/project.clj   |  2 +-
 contrib/clojure-package/examples/gan/project.clj   |  2 +-
 .../examples/imclassification/project.clj  |  2 +-
 .../clojure-package/examples/module/project.clj|  2 +-
 .../examples/multi-label/project.clj   |  2 +-
 .../examples/neural-style/project.clj  |  2 +-
 .../examples/pre-trained-models/project.clj|  2 +-
 .../clojure-package/examples/profiler/project.clj  |  2 +-
 contrib/clojure-package/examples/rnn/project.clj   |  2 +-
 .../clojure-package/examples/tutorial/project.clj  |  2 +-
 .../examples/visualization/project.clj |  2 +-
 contrib/clojure-package/project.clj|  4 +-
 include/mxnet/base.h   |  2 +-
 python/mxnet/libinfo.py|  2 +-
 scala-package/assembly/linux-x86_64-cpu/pom.xml|  8 +-
 scala-package/assembly/linux-x86_64-gpu/pom.xml|  8 +-
 scala-package/assembly/osx-x86_64-cpu/pom.xml  |  8 +-
 scala-package/assembly/pom.xml |  2 +-
 scala-package/core/pom.xml |  6 +-
 scala-package/examples/pom.xml |  6 +-
 scala-package/infer/pom.xml|  4 +-
 scala-package/init-native/linux-x86_64/pom.xml |  4 +-
 scala-package/init-native/osx-x86_64/pom.xml   |  4 +-
 scala-package/init-native/pom.xml  |  2 +-
 scala-package/init/pom.xml |  2 +-
 scala-package/macros/pom.xml   |  6 +-
 scala-package/native/linux-x86_64-cpu/pom.xml  |  4 +-
 scala-package/native/linux-x86_64-gpu/pom.xml  |  4 +-
 scala-package/native/osx-x86_64-cpu/pom.xml|  4 +-
 scala-package/native/pom.xml   |  2 +-
 scala-package/pom.xml  |  2 +-
 scala-package/spark/pom.xml|  4 +-
 snapcraft.yaml |  2 +-
 36 files changed, 159 insertions(+), 62 deletions(-)

diff --git a/NEWS.md b/NEWS.md
index b9770ca..68cb2b0 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,5 +1,102 @@
 MXNet Change Log
 
+
+## 1.3.1
+
+### Bug fixes
+
+* [MXNET-953] Fix oob memory read (v1.3.x) / 
[#13118](https://github.com/apache/incubator-mxnet/pull/13118)  
+Simple bugfix addressing an out-of-bounds memory read.
+
+
+* [MXNET-969] Fix buffer overflow in RNNOp (v1.3.x) / 
[#13119](https://github.com/apache/incubator-mxnet/pull/13119)  
+This fixes a buffer overflow detected by ASAN.
+
+
+* CudnnFind() usage improvements (v1.3.x) / 
[#13123](https://github.com/apache/incubator-mxnet/pull/13123)  
+  This PR improves MXNet's use of cudnnFind() to address a few issues:
+  1. With the gluon imperative style, cudnnFind() is called during forward(), 
and so might have its timings perturbed by other GPU activity (including 
potentially other cudnnFind() calls).
+  2. With some CUDA driver versions, care is needed to ensure that the large 
I/O and workspace cudaMallocs() performed by cudnnFind() are immediately 
released and available to MXNet.
+  3. cudnnFind() makes both conv I/O and workspace allocations that must be 
covered by the GPU global memory headroom defined by 
MXNET_GPU_MEM_POOL_RESERVE. Per issue #12662, large convolutions can result in 
out-of-memory errors, even when MXNet's storage allocator has free memory in 
its pool.  
+  
+  This PR addresses these issues, providing the following benefits:
+  1. Consistent algo choice for a given convolution type in a model, both for 
instances in the same GPU and in other GPUs in a multi-GPU training setting.
+  2. Consistent algo choice from run to run, based on eliminating sources of 
interference of the cudnnFind() timing process.
+  3. Consistent model global memory footprint, both because of the consistent 
algo choice (algos can have markedly different workspace requirements) and 
changes to MXNet's use of cudaMalloc.
+  4. Increased training performance based on being able to consistently run 
with models that approach the GPU's full global memory footprint.
+  5. Adds a unittest for and solves issue #12662.
+
+

[incubator-mxnet] branch v1.3.x updated (7fc344c -> 0cb2ad6)

2018-11-12 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch v1.3.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 7fc344c  Disable flaky test test_operator.test_dropout (#13200)
 add 0cb2ad6  Revert "Set correct update on kvstore flag in 
dist_device_sync mode (v1.3.x) (#13121)" (#13228)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/gluon/trainer.py | 17 +++--
 tests/nightly/dist_device_sync_kvstore.py | 19 ---
 2 files changed, 3 insertions(+), 33 deletions(-)



[incubator-mxnet] branch master updated: Sphinx failure fixes (#13213)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 2e4d6c8  Sphinx failure fixes (#13213)
2e4d6c8 is described below

commit 2e4d6c8c1064b74d4e1c1b3441c2ecf12b81c6e2
Author: vdantu <36211508+vda...@users.noreply.github.com>
AuthorDate: Sun Nov 11 23:50:38 2018 -0800

Sphinx failure fixes (#13213)
---
 docs/api/python/ndarray/ndarray.md | 2 ++
 python/mxnet/ndarray/utils.py  | 4 ++--
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/docs/api/python/ndarray/ndarray.md 
b/docs/api/python/ndarray/ndarray.md
index 01a1544..9d3059f 100644
--- a/docs/api/python/ndarray/ndarray.md
+++ b/docs/api/python/ndarray/ndarray.md
@@ -704,12 +704,14 @@ The `ndarray` package provides several classes:
 
 .. automodule:: mxnet.ndarray
 :members:
+:noindex:
 :imported-members:
 :special-members:
 :exclude-members: CachedOp, NDArray
 
 .. automodule:: mxnet.random
 :members:
+:noindex:
 
 ```
 
diff --git a/python/mxnet/ndarray/utils.py b/python/mxnet/ndarray/utils.py
index ff93d0b..b366dfd 100644
--- a/python/mxnet/ndarray/utils.py
+++ b/python/mxnet/ndarray/utils.py
@@ -244,9 +244,9 @@ def save(fname, data):
 >>> mx.nd.save('my_list', [x,y])
 >>> mx.nd.save('my_dict', {'x':x, 'y':y})
 >>> mx.nd.load('my_list')
-[, ]
+``[, ]``
 >>> mx.nd.load('my_dict')
-{'y': , 'x': }
+``{'y': , 'x': }``
 """
 if isinstance(data, NDArray):
 data = [data]



[incubator-mxnet] branch master updated: Fix Sphinx python docstring error: text contrib module (#12949) (#13149)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 042472a  Fix Sphinx python docstring error: text contrib module 
(#12949) (#13149)
042472a is described below

commit 042472aa27711f293b0262826ed1790aed434fa7
Author: Frank Liu 
AuthorDate: Sun Nov 11 23:46:33 2018 -0800

Fix Sphinx python docstring error: text contrib module (#12949) (#13149)
---
 python/mxnet/contrib/text/embedding.py | 78 +-
 python/mxnet/contrib/text/vocab.py | 12 +++---
 2 files changed, 8 insertions(+), 82 deletions(-)

diff --git a/python/mxnet/contrib/text/embedding.py 
b/python/mxnet/contrib/text/embedding.py
index 277f782..e2a05c8 100644
--- a/python/mxnet/contrib/text/embedding.py
+++ b/python/mxnet/contrib/text/embedding.py
@@ -161,7 +161,7 @@ class _TokenEmbedding(vocab.Vocabulary):
 pre-trained token embedding file, are taken as the indexed tokens of the 
embedding.
 
 
-Properties
+Attributes
 --
 token_to_idx : dict mapping str to int
 A dict mapping each token to its index integer.
@@ -506,25 +506,6 @@ class GloVe(_TokenEmbedding):
 embedding vectors, such as loaded from a pre-trained token embedding 
file. If None, all the
 tokens from the loaded embedding vectors, such as loaded from a 
pre-trained token embedding
 file, will be indexed.
-
-
-Properties
---
-token_to_idx : dict mapping str to int
-A dict mapping each token to its index integer.
-idx_to_token : list of strs
-A list of indexed tokens where the list indices and the token indices 
are aligned.
-unknown_token : hashable object
-The representation for any unknown token. In other words, any unknown 
token will be indexed
-as the same representation.
-reserved_tokens : list of strs or None
-A list of reserved tokens that will always be indexed.
-vec_len : int
-The length of the embedding vector for each token.
-idx_to_vec : mxnet.ndarray.NDArray
-For all the indexed tokens in this embedding, this NDArray maps each 
token's index to an
-embedding vector. The largest valid index maps to the initialized 
embedding vector for every
-reserved token, such as an unknown_token token and a padding token.
 """
 
 # Map a pre-trained token embedding archive file and its SHA-1 hash.
@@ -610,25 +591,6 @@ class FastText(_TokenEmbedding):
 embedding vectors, such as loaded from a pre-trained token embedding 
file. If None, all the
 tokens from the loaded embedding vectors, such as loaded from a 
pre-trained token embedding
 file, will be indexed.
-
-
-Properties
---
-token_to_idx : dict mapping str to int
-A dict mapping each token to its index integer.
-idx_to_token : list of strs
-A list of indexed tokens where the list indices and the token indices 
are aligned.
-unknown_token : hashable object
-The representation for any unknown token. In other words, any unknown 
token will be indexed
-as the same representation.
-reserved_tokens : list of strs or None
-A list of reserved tokens that will always be indexed.
-vec_len : int
-The length of the embedding vector for each token.
-idx_to_vec : mxnet.ndarray.NDArray
-For all the indexed tokens in this embedding, this NDArray maps each 
token's index to an
-embedding vector. The largest valid index maps to the initialized 
embedding vector for every
-reserved token, such as an unknown_token token and a padding token.
 """
 
 # Map a pre-trained token embedding archive file and its SHA-1 hash.
@@ -687,25 +649,6 @@ class CustomEmbedding(_TokenEmbedding):
 embedding vectors, such as loaded from a pre-trained token embedding 
file. If None, all the
 tokens from the loaded embedding vectors, such as loaded from a 
pre-trained token embedding
 file, will be indexed.
-
-
-Properties
---
-token_to_idx : dict mapping str to int
-A dict mapping each token to its index integer.
-idx_to_token : list of strs
-A list of indexed tokens where the list indices and the token indices 
are aligned.
-unknown_token : hashable object
-The representation for any unknown token. In other words, any unknown 
token will be indexed
-as the same representation.
-reserved_tokens : list of strs or None
-A list of reserved tokens that will always be indexed.
-vec_len : int
-The length of the embedding vector for each token.
-idx_to_vec : mxnet.ndarray.NDArray
-For all the indexed tokens in this embedding, this NDArray maps each 
token'

[incubator-mxnet] branch master updated: Fix Sphinx python docstring error: initializer.InitDesc (#12939) (#13148)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f971d64  Fix Sphinx python docstring error: initializer.InitDesc 
(#12939) (#13148)
f971d64 is described below

commit f971d64bfc262d064dcad55dc4d1172ae0f36cd8
Author: Frank Liu 
AuthorDate: Sun Nov 11 23:44:50 2018 -0800

Fix Sphinx python docstring error: initializer.InitDesc (#12939) (#13148)
---
 python/mxnet/initializer.py | 9 +
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/python/mxnet/initializer.py b/python/mxnet/initializer.py
index 357e75b..b67ab62 100755
--- a/python/mxnet/initializer.py
+++ b/python/mxnet/initializer.py
@@ -32,10 +32,11 @@ from . import ndarray
 
 # inherit str for backward compatibility
 class InitDesc(str):
-"""Descriptor for the initialization pattern.
+"""
+Descriptor for the initialization pattern.
 
-Parameter
--
+Parameters
+--
 name : str
 Name of variable.
 attrs : dict of str to str
@@ -67,7 +68,7 @@ class Initializer(object):
 print_func : function
 A function that computes statistics of initialized arrays.
 Takes an `NDArray` and returns an `str`. Defaults to mean
-absolute value str((|x|/size(x)).asscalar()).
+absolute value str((abs(x)/size(x)).asscalar()).
 """
 self._verbose = verbose
 if print_func is None:



[incubator-mxnet] branch master updated: Fix Sphinx doc errors (#13170)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new e402242  Fix Sphinx doc errors (#13170)
e402242 is described below

commit e4022426ed0ea5b1dd8e4e3f8dd47f01cdb5e800
Author: Vandana Kannan 
AuthorDate: Sun Nov 11 23:43:19 2018 -0800

Fix Sphinx doc errors (#13170)
---
 docs/api/python/image/image.md|  2 +-
 docs/api/python/module/module.md  |  2 +-
 docs/api/python/symbol/symbol.md  |  6 +++---
 python/mxnet/contrib/svrg_optimization/svrg_module.py |  1 +
 python/mxnet/rnn/rnn.py   |  6 +++---
 python/mxnet/rnn/rnn_cell.py  |  2 +-
 python/mxnet/symbol/symbol.py |  2 +-
 python/mxnet/symbol_doc.py| 13 -
 src/operator/contrib/adaptive_avg_pooling.cc  |  4 ++--
 9 files changed, 13 insertions(+), 25 deletions(-)

diff --git a/docs/api/python/image/image.md b/docs/api/python/image/image.md
index d5adaea..0622a30 100644
--- a/docs/api/python/image/image.md
+++ b/docs/api/python/image/image.md
@@ -57,7 +57,7 @@ Iterators support loading image from binary `Record IO` and 
raw image files.
 
 We use helper function to initialize augmenters
 ```eval_rst
-.. currentmodule:: mxnet
+.. currentmodule:: mxnet
 .. autosummary::
 :nosignatures:
 
diff --git a/docs/api/python/module/module.md b/docs/api/python/module/module.md
index 5a874ac..7a86ecc 100644
--- a/docs/api/python/module/module.md
+++ b/docs/api/python/module/module.md
@@ -176,7 +176,7 @@ additional functionality. We summarize them in this section.
 .. autosummary::
 :nosignatures:
 
-BucketModule.switch_bucket
+BucketingModule.switch_bucket
 ```
 
 ### Class `SequentialModule`
diff --git a/docs/api/python/symbol/symbol.md b/docs/api/python/symbol/symbol.md
index 7c78cbd..583f174 100644
--- a/docs/api/python/symbol/symbol.md
+++ b/docs/api/python/symbol/symbol.md
@@ -297,8 +297,8 @@ Composite multiple symbols into a new one by an operator.
 Symbol.take
 Symbol.one_hot
 Symbol.pick
-Symbol.ravel_multi_index
-Symbol.unravel_index
+ravel_multi_index
+unravel_index
 ```
 
 ### Get internal and output symbol
@@ -577,7 +577,7 @@ Composite multiple symbols into a new one by an operator.
 broadcast_logical_and
 broadcast_logical_or
 broadcast_logical_xor
-broadcast_logical_not
+logical_not
 ```
 
 ### Random sampling
diff --git a/python/mxnet/contrib/svrg_optimization/svrg_module.py 
b/python/mxnet/contrib/svrg_optimization/svrg_module.py
index 5d6b5dd..47d0e57 100644
--- a/python/mxnet/contrib/svrg_optimization/svrg_module.py
+++ b/python/mxnet/contrib/svrg_optimization/svrg_module.py
@@ -401,6 +401,7 @@ class SVRGModule(Module):
 force_rebind=False, force_init=False, begin_epoch=0, 
num_epoch=None,
 validation_metric=None, monitor=None, sparse_row_id_fn=None):
 """Trains the module parameters.
+
 Parameters
 --
 train_data : DataIter
diff --git a/python/mxnet/rnn/rnn.py b/python/mxnet/rnn/rnn.py
index 47307c5..0255c55 100644
--- a/python/mxnet/rnn/rnn.py
+++ b/python/mxnet/rnn/rnn.py
@@ -35,7 +35,7 @@ def save_rnn_checkpoint(cells, prefix, epoch, symbol, 
arg_params, aux_params):
 
 Parameters
 --
-cells : RNNCell or list of RNNCells
+cells : mxnet.rnn.RNNCell or list of RNNCells
 The RNN cells used by this symbol.
 prefix : str
 Prefix of model name.
@@ -65,7 +65,7 @@ def load_rnn_checkpoint(cells, prefix, epoch):
 
 Parameters
 --
-cells : RNNCell or list of RNNCells
+cells : mxnet.rnn.RNNCell or list of RNNCells
 The RNN cells used by this symbol.
 prefix : str
 Prefix of model name.
@@ -100,7 +100,7 @@ def do_rnn_checkpoint(cells, prefix, period=1):
 
 Parameters
 --
-cells : RNNCell or list of RNNCells
+cells : mxnet.rnn.RNNCell or list of RNNCells
 The RNN cells used by this symbol.
 prefix : str
 The file prefix to checkpoint to
diff --git a/python/mxnet/rnn/rnn_cell.py b/python/mxnet/rnn/rnn_cell.py
index 9097cba..3f8a459 100644
--- a/python/mxnet/rnn/rnn_cell.py
+++ b/python/mxnet/rnn/rnn_cell.py
@@ -716,7 +716,7 @@ class FusedRNNCell(BaseRNNCell):
 
 Returns
 ---
-cell : SequentialRNNCell
+cell : mxnet.rnn.SequentialRNNCell
 unfused cell that can be used for stepping, and can run on CPU.
 """
 stack = SequentialRNNCell()
diff --git a/python/mxnet/symbol/symbol.py b/python/mxnet/symbol/symbol.py
index c657507..530d727 100644
--- a/python/mxnet/symbol/symbol.py
+++ b/python/mxnet/symbol/symbol.py
@@ -

[incubator-mxnet] branch master updated: update the doc (#13205)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new c18ad7d  update the doc (#13205)
c18ad7d is described below

commit c18ad7decdae712d4681d0bb2a0eec38d1d8f627
Author: Jake Lee 
AuthorDate: Sun Nov 11 23:36:12 2018 -0800

update the doc (#13205)
---
 example/rcnn/README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/example/rcnn/README.md b/example/rcnn/README.md
index ab3c8fb..b528418 100644
--- a/example/rcnn/README.md
+++ b/example/rcnn/README.md
@@ -32,6 +32,7 @@ Make a directory `data` and follow `py-faster-rcnn` for data 
preparation instruc
 ### Training and evaluation
 Use `python3 train.py --dataset $Dataset$ --network $Network$ --pretrained 
$IMAGENET_MODEL_FILE$ --gpus $GPUS$` to train,
 for example, `python3 train.py --dataset voc --network vgg16 --pretrained 
model/vgg16-.params --gpus 0,1`.
+Use `python3 train.py --dataset voc --imageset 2007_trainval+2012_trainval 
--network vgg16 --pretrained model/vgg16-.params --gpus 0,1` to train on 
both VOC2007 and VOC2012.
 Use `python3 test.py --dataset $Dataset$ --network $Network$ --params 
$MODEL_FILE$ --gpu $GPU$` to evaluate,
 for example, `python3 test.py --dataset voc --network vgg16 --params 
model/vgg16-0010.params --gpu 0`.
 



[incubator-mxnet] branch v1.3.x updated: Disable flaky test test_operator.test_dropout (#13200)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch v1.3.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/v1.3.x by this push:
 new 7fc344c  Disable flaky test test_operator.test_dropout (#13200)
7fc344c is described below

commit 7fc344c76930e13c3ae2ab49121262216db3306a
Author: Anton Chernov 
AuthorDate: Mon Nov 12 07:08:43 2018 +0100

Disable flaky test test_operator.test_dropout (#13200)
---
 tests/python/unittest/test_operator.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/tests/python/unittest/test_operator.py 
b/tests/python/unittest/test_operator.py
index c385c57..f58f0e4 100644
--- a/tests/python/unittest/test_operator.py
+++ b/tests/python/unittest/test_operator.py
@@ -5736,6 +5736,7 @@ def test_stack():
 check_numeric_gradient(out, inputs)
 
 
+@unittest.skip("Flaky test 
https://github.com/apache/incubator-mxnet/issues/12329";)
 @with_seed()
 def test_dropout():
 def zero_count(array, ratio):



[incubator-mxnet] branch master updated (b441628 -> 517ede1)

2018-11-11 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from b441628  Port of scala Image API to clojure (#13107)
 add 517ede1  Fixed Documentation issues (#13215)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/metric.py   |  2 +-
 python/mxnet/module/sequential_module.py | 15 ---
 2 files changed, 9 insertions(+), 8 deletions(-)



[incubator-mxnet] branch master updated: Fixed Sparse astype doc string formatting error (#13171)

2018-11-09 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 6ab06e6  Fixed Sparse astype doc string formatting error (#13171)
6ab06e6 is described below

commit 6ab06e6d0cecbe6ba8cb510ce65864bdb59b0e45
Author: Rakesh Vasudevan 
AuthorDate: Fri Nov 9 22:57:56 2018 -0800

Fixed Sparse astype doc string formatting error (#13171)
---
 python/mxnet/ndarray/sparse.py | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/python/mxnet/ndarray/sparse.py b/python/mxnet/ndarray/sparse.py
index 7b4cc90..3d18a59 100644
--- a/python/mxnet/ndarray/sparse.py
+++ b/python/mxnet/ndarray/sparse.py
@@ -195,7 +195,8 @@ class BaseSparseNDArray(NDArray):
 return self.tostype('default').asnumpy()
 
 def astype(self, dtype, copy=True):
-"""Returns a copy of the array after casting to a specified type.
+"""Return a copy of the array after casting to a specified type.
+
 Parameters
 --
 dtype : numpy.dtype or str
@@ -205,6 +206,7 @@ class BaseSparseNDArray(NDArray):
 allocated ndarray on the same context. If this is set to
 `False`, and the dtype requested is the same as the ndarray's
 dtype, the ndarray is returned instead of a copy.
+
 Examples
 
 >>> x = mx.nd.sparse.zeros('row_sparse', (2,3), dtype='float32')



[incubator-mxnet] branch master updated: Fix #13013, Fix Sphinx python docstring error. (#13173)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 24b8de0  Fix #13013, Fix Sphinx python docstring error. (#13173)
24b8de0 is described below

commit 24b8de08251eb8a08650744e022deb6378f41f1a
Author: Frank Liu 
AuthorDate: Thu Nov 8 21:13:00 2018 -0800

Fix #13013, Fix Sphinx python docstring error. (#13173)
---
 python/mxnet/ndarray/contrib.py | 22 +++---
 1 file changed, 11 insertions(+), 11 deletions(-)

diff --git a/python/mxnet/ndarray/contrib.py b/python/mxnet/ndarray/contrib.py
index b663e58..6bbee8a 100644
--- a/python/mxnet/ndarray/contrib.py
+++ b/python/mxnet/ndarray/contrib.py
@@ -140,9 +140,9 @@ def foreach(body, data, init_states):
 NDArrays.
 
 body takes two arguments as input and outputs a tuple of two elements,
-as illustrated below:
+as illustrated below::
 
-out, states = body(data1, states)
+out, states = body(data1, states)
 
 data1 can be either an NDArray or a list of NDArrays. If data is an 
NDArray,
 data1 is an NDArray. Otherwise, data1 is a list of NDArrays and has the 
same
@@ -152,15 +152,15 @@ def foreach(body, data, init_states):
 are the second output of foreach.
 
 The computation done by this operator is equivalent to the pseudo code 
below
-when the input data is NDArray:
-
-states = init_states
-outs = []
-for i in data.shape[0]:
-s = data[i]
-out, states = body(s, states)
-outs.append(out)
-outs = stack(*outs)
+when the input data is NDArray::
+
+states = init_states
+outs = []
+for i in data.shape[0]:
+s = data[i]
+out, states = body(s, states)
+outs.append(out)
+outs = stack(*outs)
 
 
 Parameters



[incubator-mxnet] branch master updated: Fix #12944, Fix Sphinx python docstring formatting error. (#13174)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 34d24b4  Fix #12944, Fix Sphinx python docstring formatting error. 
(#13174)
34d24b4 is described below

commit 34d24b4af3bb3128f72de067c829fdd9472f4988
Author: Frank Liu 
AuthorDate: Thu Nov 8 21:11:38 2018 -0800

Fix #12944, Fix Sphinx python docstring formatting error. (#13174)
---
 python/mxnet/gluon/parameter.py | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/python/mxnet/gluon/parameter.py b/python/mxnet/gluon/parameter.py
index f53eeb0..b57defa 100644
--- a/python/mxnet/gluon/parameter.py
+++ b/python/mxnet/gluon/parameter.py
@@ -740,9 +740,9 @@ class ParameterDict(object):
 return param
 
 def get_constant(self, name, value=None):
-"""Retrieves a :py:class:`Constant` with name ``self.prefix+name``. If 
not found,
+"""Retrieves a :py:class:`.Constant` with name ``self.prefix+name``. 
If not found,
 :py:func:`get` will first try to retrieve it from "shared" dict. If 
still not
-found, :py:func:`get` will create a new :py:class:`Constant` with 
key-word
+found, :py:func:`get` will create a new :py:class:`.Constant` with 
key-word
 arguments and insert it to self.
 
 Parameters
@@ -756,7 +756,7 @@ class ParameterDict(object):
 Returns
 ---
 Constant
-The created or retrieved :py:class:`Constant`.
+The created or retrieved :py:class:`.Constant`.
 """
 name = self.prefix + name
 param = self._get_impl(name)


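A small usage sketch of the `get_constant` call whose cross-references are fixed above (not from the commit; the names are illustrative and assume the Gluon 1.x `ParameterDict` API):

```python
from mxnet import nd
from mxnet.gluon import ParameterDict

params = ParameterDict(prefix='model_')
# the first call creates 'model_offset'; later calls retrieve the same Constant
offset = params.get_constant('offset', nd.array([1.0, 2.0, 3.0]))
same = params.get_constant('offset')
print(offset is same)  # True
```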

[incubator-mxnet] branch master updated: Fix Sphinx docstring formatting error. (#13004, #13005, #13006) (#13175)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 0166793  Fix Sphinx docstring formatting error. (#13004, #13005, 
#13006) (#13175)
0166793 is described below

commit 0166793dbb680ce07c77f7e213bc4b4b246c3591
Author: Frank Liu 
AuthorDate: Thu Nov 8 21:08:57 2018 -0800

Fix Sphinx docstring formatting error. (#13004, #13005, #13006) (#13175)
---
 python/mxnet/gluon/rnn/rnn_cell.py |  6 +++---
 python/mxnet/rnn/rnn_cell.py   | 18 +-
 2 files changed, 12 insertions(+), 12 deletions(-)

diff --git a/python/mxnet/gluon/rnn/rnn_cell.py 
b/python/mxnet/gluon/rnn/rnn_cell.py
index b57dc93..98e96fc 100644
--- a/python/mxnet/gluon/rnn/rnn_cell.py
+++ b/python/mxnet/gluon/rnn/rnn_cell.py
@@ -333,7 +333,7 @@ class RNNCell(HybridRecurrentCell):
 Initializer for the bias vector.
 h2h_bias_initializer : str or Initializer, default 'zeros'
 Initializer for the bias vector.
-prefix : str, default 'rnn_'
+prefix : str, default ``'rnn_'``
 Prefix for name of `Block`s
 (and name of weight if params is `None`).
 params : Parameter or None
@@ -440,7 +440,7 @@ class LSTMCell(HybridRecurrentCell):
 Initializer for the bias vector.
 h2h_bias_initializer : str or Initializer, default 'zeros'
 Initializer for the bias vector.
-prefix : str, default 'lstm_'
+prefix : str, default ``'lstm_'``
 Prefix for name of `Block`s
 (and name of weight if params is `None`).
 params : Parameter or None, default None
@@ -565,7 +565,7 @@ class GRUCell(HybridRecurrentCell):
 Initializer for the bias vector.
 h2h_bias_initializer : str or Initializer, default 'zeros'
 Initializer for the bias vector.
-prefix : str, default 'gru_'
+prefix : str, default ``'gru_'``
 prefix for name of `Block`s
 (and name of weight if params is `None`).
 params : Parameter or None, default None
diff --git a/python/mxnet/rnn/rnn_cell.py b/python/mxnet/rnn/rnn_cell.py
index 3301102..9097cba 100644
--- a/python/mxnet/rnn/rnn_cell.py
+++ b/python/mxnet/rnn/rnn_cell.py
@@ -368,7 +368,7 @@ class RNNCell(BaseRNNCell):
 Number of units in output symbol.
 activation : str or Symbol, default 'tanh'
 Type of activation function. Options are 'relu' and 'tanh'.
-prefix : str, default 'rnn_'
+prefix : str, default ``'rnn_'``
 Prefix for name of layers (and name of weight if params is None).
 params : RNNParams, default None
 Container for weight sharing between cells. Created if None.
@@ -412,7 +412,7 @@ class LSTMCell(BaseRNNCell):
 --
 num_hidden : int
 Number of units in output symbol.
-prefix : str, default 'lstm_'
+prefix : str, default ``'lstm_'``
 Prefix for name of layers (and name of weight if params is None).
 params : RNNParams, default None
 Container for weight sharing between cells. Created if None.
@@ -475,7 +475,7 @@ class GRUCell(BaseRNNCell):
 --
 num_hidden : int
 Number of units in output symbol.
-prefix : str, default 'gru_'
+prefix : str, default ``'gru_'``
 Prefix for name of layers (and name of weight if params is None).
 params : RNNParams, default None
 Container for weight sharing between cells. Created if None.
@@ -554,7 +554,7 @@ class FusedRNNCell(BaseRNNCell):
 Whether to return the states that can be used as starting states next 
time.
 forget_bias : bias added to forget gate, default 1.0.
 Jozefowicz et al. 2015 recommends setting this to 1.0
-prefix : str, default '$mode_' such as 'lstm_'
+prefix : str, default ``'$mode_'`` such as ``'lstm_'``
 Prefix for names of layers
 (this prefix is also used for names of weights if `params` is None
 i.e. if `params` are being created and not reused)
@@ -832,7 +832,7 @@ class DropoutCell(BaseRNNCell):
 dropout : float
 Percentage of elements to drop out, which
 is 1 - percentage to retain.
-prefix : str, default 'dropout_'
+prefix : str, default ``'dropout_'``
 Prefix for names of layers
 (this prefix is also used for names of weights if `params` is None
 i.e. if `params` are being created and not reused)
@@ -1007,7 +1007,7 @@ class BidirectionalCell(BaseRNNCell):
 params : RNNParams, default None.
 Container for weight sharing between cells.
 A new RNNParams container is created if `params` is None.
-output_prefix

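As context for the `prefix` defaults being marked up above, a brief sketch of how they surface when constructing cells (assumes the Gluon RNN API; the printed names are illustrative):

```python
from mxnet.gluon import rnn

# with no prefix, a default such as 'lstm0_' is generated from the cell type
cell = rnn.LSTMCell(hidden_size=16)
print(cell.prefix)   # e.g. 'lstm0_'

named = rnn.GRUCell(hidden_size=16, prefix='encoder_gru_')
print(named.prefix)  # 'encoder_gru_'
```
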
[incubator-mxnet] branch master updated: Fix #13090, Add image.imread to python API doc. (#13176)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 3a6dcc7  Fix #13090, Add image.imread to python API doc. (#13176)
3a6dcc7 is described below

commit 3a6dcc7d3ba93628549502adedd1505df5364bc0
Author: Frank Liu 
AuthorDate: Thu Nov 8 21:07:15 2018 -0800

Fix #13090, Add image.imread to python API doc. (#13176)
---
 docs/api/python/image/image.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/api/python/image/image.md b/docs/api/python/image/image.md
index 11fff4f..d5adaea 100644
--- a/docs/api/python/image/image.md
+++ b/docs/api/python/image/image.md
@@ -16,6 +16,7 @@ images provided in
 .. autosummary::
 :nosignatures:
 
+image.imread
 image.imdecode
 image.scale_down
 image.resize_short
@@ -163,6 +164,7 @@ and a list of augmenters specific for `Object detection` is 
provided
 .. autoclass:: mxnet.image.ImageIter
 :members:
 
+.. automethod:: mxnet.image.imread
 .. automethod:: mxnet.image.imdecode
 .. automethod:: mxnet.image.scale_down
 .. automethod:: mxnet.image.resize_short


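For readers of the newly listed `image.imread`, a hedged usage sketch (the file path is a placeholder, not from the commit):

```python
import mxnet as mx

# decode a JPEG/PNG from disk into an NDArray in HWC layout
img = mx.image.imread('example.jpg')     # placeholder path
print(img.shape, img.dtype)
small = mx.image.resize_short(img, 256)  # also listed on the same doc page
```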

[incubator-mxnet] branch master updated (7f1d53e -> 3bbbf6d)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 7f1d53e  Improve cpp-package example project build files. (#13093)
 add 3bbbf6d  Fix Sphinx document parsing error. (#13195)

No new revisions were added by this update.

Summary of changes:
 docs/api/python/gluon/model_zoo.md | 2 ++
 docs/api/python/io/io.md   | 5 +++--
 2 files changed, 5 insertions(+), 2 deletions(-)



[incubator-mxnet] branch master updated: Improve cpp-package example project build files. (#13093)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 7f1d53e  Improve cpp-package example project build files. (#13093)
7f1d53e is described below

commit 7f1d53e6d023a65b2fe83364417a99012de83ea4
Author: Frank Liu 
AuthorDate: Thu Nov 8 16:01:27 2018 -0800

Improve cpp-package example project build files. (#13093)

1. Change output to build folder.
2. Remove files that have not been deleted by make clean.
---
 cpp-package/example/Makefile   |  6 --
 cpp-package/example/README.md  | 20 ++--
 cpp-package/example/example.mk |  2 +-
 3 files changed, 15 insertions(+), 13 deletions(-)

diff --git a/cpp-package/example/Makefile b/cpp-package/example/Makefile
index eb0676c..6b64469 100644
--- a/cpp-package/example/Makefile
+++ b/cpp-package/example/Makefile
@@ -16,6 +16,7 @@
 # under the License.
 
 prebuild :
+   @mkdir -p build
$(shell ./get_data.sh)
$(shell cp -r ../../lib ./)
 CPPEX_SRC = $(wildcard *.cpp)
@@ -38,8 +39,9 @@ debug: CPPEX_CFLAGS += -DDEBUG -g
 debug: prebuild all
 
 
+
 $(CPPEX_EXE):% : %.cpp
-   $(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, 
$^) $(CPPEX_EXTRA_LDFLAGS)
+   $(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o build/$@ $(filter %.cpp 
%.a, $^) $(CPPEX_EXTRA_LDFLAGS)
 
 clean:
-   rm -f $(CPPEX_EXE)
+   @rm -rf build
diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 5d2f3b0..64f6044 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -27,7 +27,7 @@ This directory contains following examples. In order to run 
the examples, ensure
 The example implements the C++ version of AlexNet. The network trains on 
MNIST data. The number of epochs can be specified as a command line argument. 
For example, to train with 10 epochs use the following:
 
```
-   ./alexnet 10
+   build/alexnet 10
```
 
 ### 
[googlenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/googlenet.cpp>)
@@ -35,7 +35,7 @@ The example implements the C++ version of AlexNet. The 
networks trains on MNIST
 The code implements a GoogLeNet/Inception network using the C++ API. The 
example uses MNIST data to train the network. By default, the example trains 
the model for 100 epochs. The number of epochs can also be specified in the 
command line. For example, to train the model for 10 epochs use the following:
 
 ```
-./googlenet 10
+build/googlenet 10
 ```
 
 ### 
[mlp.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp.cpp>)
@@ -44,7 +44,7 @@ The code implements a multilayer perceptron from scratch. The 
example creates it
 To run the example use the following command:
 
 ```
-./mlp
+build/mlp
 ```
 
 ### 
[mlp_cpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)
@@ -53,7 +53,7 @@ The code implements a multilayer perceptron to train the 
MNIST data. The code de
 To run the example use the following command:
 
 ```
-./mlp_cpu
+build/mlp_cpu
 ```
 
 ### 
[mlp_gpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_gpu.cpp>)
@@ -61,7 +61,7 @@ To run the example use the following command:
 The code implements a multilayer perceptron to train the MNIST data. The code 
demonstrates the use of the "SimpleBind"  C++ API and MNISTIter. The example is 
designed to work on GPU. The example does not require command line arguments. 
To run the example execute following command:
 
 ```
-./mlp_gpu
+build/mlp_gpu
 ```
 
 ### 
[mlp_csv.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_csv.cpp>)
@@ -69,7 +69,7 @@ The code implements a multilayer perceptron to train the 
MNIST data. The code de
 The code implements a multilayer perceptron to train the MNIST data. The code 
demonstrates the use of the "SimpleBind"  C++ API and CSVIter. The CSVIter can 
iterate data that is in CSV format. The example can be run on CPU or GPU. The 
example usage is as follows:
 
 ```
-mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 
--batch_size 100 --hidden_units "128,64,64 [--gpu]"
+build/mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv 
--epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
 ```
 
 ### 
[resnet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/resnet.cpp>)
@@ -77,7 +77,7 @@ mlp_csv --train mnist_training_set.csv --test 
mnist_test_set.csv --epochs 10 --b
 The code implements a resnet model using the C++ API. The model is used to 
train MNIST data. The number of epochs for training the model can be specified 
on the command line. By default, mo

[incubator-mxnet] branch master updated: Update scala intellij tutorial (#12827)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 99534c9  Update scala intellij tutorial (#12827)
99534c9 is described below

commit 99534c97cd767af13e45c6934b13912ce2f6c892
Author: Zach Kimberg 
AuthorDate: Thu Nov 8 12:16:33 2018 -0800

Update scala intellij tutorial (#12827)

* Update scala intellij tutorial

Update mxnet version
log4j fixes
Instructions from source

* Remove version numbers and various improvements
---
 docs/tutorials/scala/mxnet_scala_on_intellij.md | 64 ++---
 1 file changed, 58 insertions(+), 6 deletions(-)

diff --git a/docs/tutorials/scala/mxnet_scala_on_intellij.md 
b/docs/tutorials/scala/mxnet_scala_on_intellij.md
index 497b1cd..e28359b 100644
--- a/docs/tutorials/scala/mxnet_scala_on_intellij.md
+++ b/docs/tutorials/scala/mxnet_scala_on_intellij.md
@@ -73,7 +73,6 @@ The configuration you should update is in the pom file's 
dependency for MXNet:
 
   org.apache.mxnet
   mxnet-full_2.11-osx-x86_64-cpu
-  1.2.0
 
 ```
 
@@ -158,7 +157,7 @@ The project's `pom.xml` will be open for editing.
 
 **Step 3.** Replace the pom file's content with the following code. Changes 
include:
   - Project properties: `scala.version`, upgrading from `2.11.5` to `2.11.8`
-  - Project dependencies: adding the MXNet package from Maven and updating the 
dependency for JUnitRunner (specs2-junit_)
+  - Project dependencies: adding the MXNet package from Maven and updating the 
dependency for JUnitRunner (specs2-junit_) and logging
   - Build options: removing '-make:transitive'
 
 
@@ -204,19 +203,25 @@ The project's `pom.xml` will be open for editing.
 UTF-8
 2.11.8
 2.11
+1.7.7
+osx-x86_64-cpu
   
 
   
 
   org.apache.mxnet
   mxnet-full_2.11-osx-x86_64-cpu
-  1.2.0
 
 
   org.scala-lang
   scala-library
   ${scala.version}
 
+
+  args4j
+  args4j
+  2.0.29
+
 
 
 
@@ -237,6 +242,18 @@ The project's `pom.xml` will be open for editing.
   2.2.4
   test
 
+
+
+
+  org.slf4j
+  slf4j-api
+  ${slf4jVersion}
+
+
+  org.slf4j
+  slf4j-log4j12
+  ${slf4jVersion}
+
   
 
   
@@ -292,11 +309,24 @@ The project's `pom.xml` will be open for editing.
 
 Click "Import Changes" in this prompt.
 
-**Step 5.** Build the project:
+**Step 5.** Setup log4j configuration
+
+Create a folder `src/main/resources` and a new file in it 
`src/main/resources/log4j.properties` with the contents:
+
+```
+log4j.rootLogger = info, stdout
+
+log4j.appender.stdout = org.apache.log4j.ConsoleAppender
+log4j.appender.stdout.Target = System.out
+log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
+log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} [%t] [%c] [%p] - %m%n
+```
+
+**Step 6.** Build the project:
 - To build the project, from the menu choose Build, and then choose Build 
Project.
 
 
-**Step 6.** Run the Hello World App:
+**Step 7.** Run the Hello World App:
 
 ![hello world 
app](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/scala/intellij-project-hello-world-app.png)
 
@@ -306,7 +336,7 @@ Navigate to the App included with the project.
 
 Run the App by clicking the green arrow, and verify the Hello World output
 
-**Step 7.** Run Sample MXNet Code in the App:
+**Step 8.** Run Sample MXNet Code in the App:
 
 ![run hello 
mxnet](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/scala/intellij-project-hello-mxnet.png)
 
@@ -347,6 +377,28 @@ Library not loaded: 
/usr/local/opt/opencv/lib/libopencv_calib3d.x.x.dylib
 This can be resolved by installing OpenCV.
 
 
+### Using MXNet from source
+
+If you chose to "Build from Source" when following the [install 
instructions](https://mxnet.incubator.apache.org/install/index.html) (or the 
detailed [build from source 
instructions](https://mxnet.incubator.apache.org/install/build_from_source.html#installing-mxnet-language-bindings)),
 you can use your custom build instead of the build from maven.  Use your build 
by editing the `pom.xml` file and replacing the `org.apache.mxnet` dependency 
with the following:
+
+```
+  org.apache.mxnet
+  mxnet-core_${scala.version}-${platform}-sources
+  system
+  
/PathToMXNetSource/incubator-mxnet/scala-package/assembly/osx-x86_64-cpu/target/mxnet-full_${scala.version}-osx-x86_64-cpu-1.3.1-SNAPSHOT-sources.jar
+
+
+
+  org.apache.mxnet
+  mxnet-full_${scala.version}-${platform}
+  system
+  
/PathToMXNetSource/incubator-mxnet/scala-package/assembly/osx-x86_64-cpu/target/mxnet-full_${scala.version}-osx-x86_64-cpu-1.3.1-SNAPSHOT.jar
+
+```
+
+Note that you have to edit both of the `

[incubator-mxnet] branch master updated: [Doc] Fix repo paths in Ubuntu build doc (#13101)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new d424c0e  [Doc] Fix repo paths in Ubuntu build doc (#13101)
d424c0e is described below

commit d424c0ee664ee6d70e48c3d85802902701957d0c
Author: Holger Kohr 
AuthorDate: Thu Nov 8 21:15:37 2018 +0100

[Doc] Fix repo paths in Ubuntu build doc (#13101)

* [Doc] Fix repo paths in Ubuntu build doc

* [Doc] Use relative path in Ubuntu build doc
---
 docs/install/ubuntu_setup.md | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/install/ubuntu_setup.md b/docs/install/ubuntu_setup.md
index 804887a..8aac143 100644
--- a/docs/install/ubuntu_setup.md
+++ b/docs/install/ubuntu_setup.md
@@ -162,7 +162,7 @@ If building on CPU and using OpenBLAS:
 
 ```bash
 git clone --recursive https://github.com/apache/incubator-mxnet.git
-cd mxnet
+cd incubator-mxnet
 make -j $(nproc) USE_OPENCV=1 USE_BLAS=openblas
 ```
 
@@ -170,7 +170,7 @@ If building on CPU and using MKL and MKL-DNN (make sure MKL 
is installed accordi
 
 ```bash
 git clone --recursive https://github.com/apache/incubator-mxnet.git
-cd mxnet
+cd incubator-mxnet
 make -j $(nproc) USE_OPENCV=1 USE_BLAS=mkl USE_MKLDNN=1
 ```
 
@@ -178,7 +178,7 @@ If building on GPU and you want OpenCV and OpenBLAS (make 
sure you have installe
 
 ```bash
 git clone --recursive https://github.com/apache/incubator-mxnet.git
-cd mxnet
+cd incubator-mxnet
 make -j $(nproc) USE_OPENCV=1 USE_BLAS=openblas USE_CUDA=1 
USE_CUDA_PATH=/usr/local/cuda USE_CUDNN=1
 ```
 
@@ -189,7 +189,7 @@ Building from source creates a library called 
```libmxnet.so``` in the `lib` fol
 You may also want to add the MXNet shared library to your `LD_LIBRARY_PATH`:
 
 ```bash
-export LD_LIBRARY_PATH=~/incubator-mxnet/lib
+export LD_LIBRARY_PATH=$PWD/lib
 ```
 
 After building the MXNet library, you may install language bindings.



[incubator-mxnet] branch master updated (012288f -> 55ee9b3)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 012288f  Updates to several examples (#13068)
 add 55ee9b3  Fix Sphinx python docstring formatting error. (#13177)

No new revisions were added by this update.

Summary of changes:
 python/mxnet/gluon/nn/basic_layers.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



[incubator-mxnet] branch java-api updated (2df7a61 -> 149ea17)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 2df7a61  use ResourceScope in Model/Trainer/FeedForward.scala (#12882) 
(#13164)
 add 149ea17  [MXNET-1187] Added Tutorial for Java under 
mxnet.io/docs/tutorials (#13183)

No new revisions were added by this update.

Summary of changes:
 docs/tutorials/index.md   |   6 +
 docs/tutorials/java/mxnet_java_on_intellij.md | 210 ++
 tests/tutorials/test_sanity_tutorials.py  |   3 +-
 3 files changed, 218 insertions(+), 1 deletion(-)
 create mode 100644 docs/tutorials/java/mxnet_java_on_intellij.md



[incubator-mxnet] branch v1.3.x updated (edfcfcf -> 27dc5c8)

2018-11-08 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch v1.3.x
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from edfcfcf  [MXNET-1179] Enforce deterministic algorithms in convolution 
layers (v1.3.x) (#13152)
 add 27dc5c8  Remove test for non existing index copy operator (#13180)

No new revisions were added by this update.

Summary of changes:
 tests/python/unittest/test_operator.py | 25 +
 1 file changed, 1 insertion(+), 24 deletions(-)



[incubator-mxnet] branch master updated (5d6a7ac -> 498e03d)

2018-11-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 5d6a7ac  [MXNET-1194] Reenable nightly tutorials tests for Python2 and 
Python3 (#13099)
 add 498e03d  Update dec example (#12950)

No new revisions were added by this update.

Summary of changes:
 example/deep-embedded-clustering/README.md | 11 ++-
 example/deep-embedded-clustering/data.py   | 15 ---
 example/deep-embedded-clustering/dec.py| 25 -
 3 files changed, 30 insertions(+), 21 deletions(-)



[incubator-mxnet] branch java-api updated (62d2800 -> 2df7a61)

2018-11-07 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 62d2800  Maven Surefire bug workaround (#13097)
 add 2df7a61  use ResourceScope in Model/Trainer/FeedForward.scala (#12882) 
(#13164)

No new revisions were added by this update.

Summary of changes:
 .../main/scala/org/apache/mxnet/FeedForward.scala  | 152 +
 .../scala/org/apache/mxnet/NativeResource.scala|   8 +-
 .../scala/org/apache/mxnet/ResourceScope.scala |  35 +++--
 .../imclassification/TrainModel.scala  |  80 +--
 .../imclassification/util/Trainer.scala| 133 +-
 5 files changed, 230 insertions(+), 178 deletions(-)



[incubator-mxnet] branch master updated: Fix example for mxnet.nd.contrib.cond and fix typo in src/engine (#12954)

2018-11-04 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 974a04c  Fix example for mxnet.nd.contrib.cond and fix typo in 
src/engine (#12954)
974a04c is described below

commit 974a04cdb97fbabe079f79f1d9ebc2d14a793034
Author: JackieWu 
AuthorDate: Mon Nov 5 01:57:32 2018 +0800

Fix example for mxnet.nd.contrib.cond and fix typo in src/engine (#12954)

* fix typo in src/engine

* fix example for mx.nd.contrib.cond
---
 python/mxnet/ndarray/contrib.py | 4 ++--
 src/engine/engine.cc| 2 +-
 src/engine/threaded_engine.h| 2 +-
 3 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/python/mxnet/ndarray/contrib.py b/python/mxnet/ndarray/contrib.py
index 5dcc54a..b663e58 100644
--- a/python/mxnet/ndarray/contrib.py
+++ b/python/mxnet/ndarray/contrib.py
@@ -437,8 +437,8 @@ def cond(pred, then_func, else_func):
 
 >>> a, b = mx.nd.array([1]), mx.nd.array([2])
 >>> pred = a * b < 5
->>> then_func = lambda a, b: (a + 5) * (b + 5)
->>> else_func = lambda a, b: (a - 5) * (b - 5)
+>>> then_func = lambda: (a + 5) * (b + 5)
+>>> else_func = lambda: (a - 5) * (b - 5)
 >>> outputs = mx.nd.contrib.cond(pred, then_func, else_func)
 >>> outputs[0]
 [42.]
diff --git a/src/engine/engine.cc b/src/engine/engine.cc
index 1c72f33..a33f0b2 100644
--- a/src/engine/engine.cc
+++ b/src/engine/engine.cc
@@ -48,7 +48,7 @@ inline Engine* CreateEngine() {
   ret = CreateNaiveEngine();
   #endif
 
-  if (ret ==nullptr) {
+  if (ret == nullptr) {
 LOG(FATAL) << "Cannot find Engine " << type;
   }
   if (!default_engine) {
diff --git a/src/engine/threaded_engine.h b/src/engine/threaded_engine.h
index a2c1a2b..ccfd09d 100644
--- a/src/engine/threaded_engine.h
+++ b/src/engine/threaded_engine.h
@@ -182,7 +182,7 @@ class ThreadedVar final
  private:
   // TODO(hotpxl) change this to spinlock for faster runtime
   // TODO(hotpxl) consider rename head
-  /*! \brief inetrnal mutex of the ThreadedVar */
+  /*! \brief internal mutex of the ThreadedVar */
   std::mutex mutex_;
   /*!
* \brief number of pending reads operation in the variable.


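The corrected `cond` docstring example above runs as-is when placed in a script; a self-contained version mirroring the diff's values:

```python
import mxnet as mx

a, b = mx.nd.array([1]), mx.nd.array([2])
pred = a * b < 5
# branch functions take no arguments and close over a and b, as in the fixed example
then_func = lambda: (a + 5) * (b + 5)
else_func = lambda: (a - 5) * (b - 5)
outputs = mx.nd.contrib.cond(pred, then_func, else_func)
print(outputs[0].asnumpy())  # [42.] because 1 * 2 < 5 selects then_func
```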

[incubator-mxnet] branch master updated: ONNX export: Scalar, Reshape - Set appropriate tensor type (#13067)

2018-11-04 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new f5cef48  ONNX export: Scalar, Reshape - Set appropriate tensor type 
(#13067)
f5cef48 is described below

commit f5cef48161b5864ce49692b0e4b3bf4b452b7d72
Author: Vandana Kannan 
AuthorDate: Sun Nov 4 09:30:28 2018 -0800

ONNX export: Scalar, Reshape - Set appropriate tensor type (#13067)

np.array sets default dtype to float64 which is
not supported by ONNX. Setting these to appropriate type.
---
 python/mxnet/contrib/onnx/mx2onnx/_op_translations.py | 14 +++---
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py 
b/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
index 11e75d9..fb2e697 100644
--- a/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
+++ b/python/mxnet/contrib/onnx/mx2onnx/_op_translations.py
@@ -843,7 +843,9 @@ def scalar_op_helper(node, op_name, **kwargs):
 """Helper function for scalar arithmetic operations"""
 name, input_nodes, attrs = get_inputs(node, kwargs)
 
-scalar_value = [float(attrs.get("scalar", 1))]
+input_type = kwargs["in_type"]
+scalar_value = np.array([attrs.get("scalar", 1)],
+
dtype=onnx.mapping.TENSOR_TYPE_TO_NP_TYPE[input_type])
 
 initializer = kwargs["initializer"]
 flag = True
@@ -864,17 +866,15 @@ def scalar_op_helper(node, op_name, **kwargs):
 
 # else create a new tensor of the scalar value, add it in initializer
 if flag is True:
-np_arr = np.array(scalar_value)
-data_type = onnx.mapping.NP_TYPE_TO_TENSOR_TYPE[np_arr.dtype]
-dims = np.shape(np_arr)
+dims = np.shape(scalar_value)
 
 scalar_op_name = "scalar_op" + str(kwargs["idx"])
-tensor_node = onnx.helper.make_tensor_value_info(scalar_op_name, 
data_type, dims)
+tensor_node = onnx.helper.make_tensor_value_info(scalar_op_name, 
input_type, dims)
 
 initializer.append(
 onnx.helper.make_tensor(
 name=scalar_op_name,
-data_type=data_type,
+data_type=input_type,
 dims=dims,
 vals=scalar_value,
 raw=False,
@@ -1249,7 +1249,7 @@ def convert_reshape(node, **kwargs):
 output_shape_list = convert_string_to_list(attrs["shape"])
 
 initializer = kwargs["initializer"]
-output_shape_np = np.array(output_shape_list)
+output_shape_np = np.array(output_shape_list, dtype='int64')
 data_type = onnx.mapping.NP_TYPE_TO_TENSOR_TYPE[output_shape_np.dtype]
 dims = np.shape(output_shape_np)
 

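The essence of the change above: plain `np.array` calls default to float64/int64, so the scalar must be pinned to the graph's tensor type before it becomes an ONNX initializer. A hedged sketch of that mapping (assumes an `onnx` version that still ships `onnx.mapping`, as the diff does; the tensor name is illustrative):

```python
import numpy as np
import onnx

input_type = onnx.TensorProto.FLOAT
# without the explicit dtype this would default to int64/float64 and break a float32 graph
scalar = np.array([1], dtype=onnx.mapping.TENSOR_TYPE_TO_NP_TYPE[input_type])

tensor = onnx.helper.make_tensor(
    name='scalar_op0',        # illustrative name
    data_type=input_type,
    dims=np.shape(scalar),
    vals=scalar,
    raw=False)
print(scalar.dtype, tensor.data_type)
```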


[incubator-mxnet] branch master updated: Update module example (#12961)

2018-11-04 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 976fd00  Update module example (#12961)
976fd00 is described below

commit 976fd00d96bdc0e09ea226a8ce6f76bc4707f903
Author: Thomas Delteil 
AuthorDate: Sun Nov 4 09:27:18 2018 -0800

Update module example (#12961)

* Update Module example

* trigger CI
---
 example/{ => module}/utils/__init__.py | 0
 example/{ => module}/utils/get_data.py | 6 +++---
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/example/utils/__init__.py b/example/module/utils/__init__.py
similarity index 100%
rename from example/utils/__init__.py
rename to example/module/utils/__init__.py
diff --git a/example/utils/get_data.py b/example/module/utils/get_data.py
similarity index 94%
rename from example/utils/get_data.py
rename to example/module/utils/get_data.py
index 861d16c..2a585ea 100644
--- a/example/utils/get_data.py
+++ b/example/module/utils/get_data.py
@@ -17,6 +17,7 @@
 
 import os
 import mxnet as mx
+import zipfile
 
 def get_mnist(data_dir):
 if not os.path.isdir(data_dir):
@@ -28,7 +29,7 @@ def get_mnist(data_dir):
(not os.path.exists('t10k-labels-idx1-ubyte')):
 import urllib, zipfile
 zippath = os.path.join(os.getcwd(), "mnist.zip")
-urllib.urlretrieve("http://data.mxnet.io/mxnet/data/mnist.zip", zippath)
+mx.test_utils.download("http://data.mxnet.io/mxnet/data/mnist.zip", zippath)
 zf = zipfile.ZipFile(zippath, "r")
 zf.extractall()
 zf.close()
@@ -45,7 +46,7 @@ def get_cifar10(data_dir):
 import urllib, zipfile, glob
 dirname = os.getcwd()
 zippath = os.path.join(dirname, "cifar10.zip")
-urllib.urlretrieve("http://data.mxnet.io/mxnet/data/cifar10.zip", zippath)
+mx.test_utils.download("http://data.mxnet.io/mxnet/data/cifar10.zip", zippath)
 zf = zipfile.ZipFile(zippath, "r")
 zf.extractall()
 zf.close()
@@ -56,7 +57,6 @@ def get_cifar10(data_dir):
 os.rmdir(os.path.join(dirname, "cifar"))
 os.chdir(cwd)
 
-# data
 def get_cifar10_iterator(args, kv):
 data_shape = (3, 28, 28)
 data_dir = args.data_dir


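A hedged, self-contained sketch of the `mx.test_utils.download` call the example now uses (URL taken from the diff; the rest is illustrative):

```python
import os
import zipfile
import mxnet as mx

zippath = os.path.join(os.getcwd(), 'mnist.zip')
# replaces the Python-2-only urllib.urlretrieve call removed in this commit
mx.test_utils.download('http://data.mxnet.io/mxnet/data/mnist.zip', zippath)
with zipfile.ZipFile(zippath, 'r') as zf:
    zf.extractall()
```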

[incubator-mxnet] branch java-api updated: Maven Surefire bug workaround (#13097)

2018-11-02 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new 62d2800  Maven Surefire bug workaround (#13097)
62d2800 is described below

commit 62d2800af08664abe322e32b7fe2f39c75b6c0c7
Author: Zach Kimberg 
AuthorDate: Fri Nov 2 16:24:07 2018 -0700

Maven Surefire bug workaround (#13097)
---
 scala-package/pom.xml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index 9f7a498..daa2dff 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -215,6 +215,7 @@
 2.19
 
   true
+  false
 
   
   



[incubator-mxnet] branch master updated: Fix variable name in tutorial code snippet (#13052)

2018-11-02 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 50f43f0  Fix variable name in tutorial code snippet (#13052)
50f43f0 is described below

commit 50f43f05ab61fe9e19698bb18dc34858a240b263
Author: Joel Wong 
AuthorDate: Sat Nov 3 10:05:23 2018 +1100

Fix variable name in tutorial code snippet (#13052)

Fixes incorrect variable name in tutorial code as raised in issue 
https://github.com/apache/incubator-mxnet/issues/13051
---
 docs/tutorials/scala/char_lstm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorials/scala/char_lstm.md 
b/docs/tutorials/scala/char_lstm.md
index e5f071b..4d6a5ae 100644
--- a/docs/tutorials/scala/char_lstm.md
+++ b/docs/tutorials/scala/char_lstm.md
@@ -129,7 +129,7 @@ To prepare the data:
 ```scala
 scala> // Build  a vocabulary of what char we have in the content
 scala> def buildVocab(path: String): Map[String, Int] = {
-val content = readContent(dataPath).split("\n")
+val content = readContent(path).split("\n")
 var idx = 1 // 0 is left for zero padding
 var theVocab = Map[String, Int]()
 for (line <- content) {



[incubator-mxnet] branch java-api updated: [MXNET-1160] add Java build/run example (#12969)

2018-10-29 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new 7e776c9  [MXNET-1160] add Java build/run example (#12969)
7e776c9 is described below

commit 7e776c930f0fe7a394ec30db1c575bf0a30c8569
Author: Lanking 
AuthorDate: Mon Oct 29 11:49:28 2018 -0700

[MXNET-1160] add Java build/run example (#12969)

* add example

* clean up nit

* find the pain point

* add java tut into whitelist

* Trigger CI

* add java demo and split scala demo

* address the comments

* change the examples

* fix the wrong configuration
---
 .../scala/mxnet_java_install_and_run_examples.md   | 123 +
 scala-package/mxnet-demo/{ => java-demo}/Makefile  |   6 +-
 scala-package/mxnet-demo/{ => java-demo}/README.md |  39 ---
 .../{bin/demo.sh => java-demo/bin/java_sample.sh}  |   4 +-
 .../{bin/demo.sh => java-demo/bin/run_od.sh}   |   5 +-
 scala-package/mxnet-demo/java-demo/pom.xml |  25 +
 .../src/main/java/sample/HelloWorld.java}  |  20 ++--
 .../src/main/java/sample/ObjectDetection.java  | 101 +
 scala-package/mxnet-demo/{ => scala-demo}/Makefile |   2 +-
 .../mxnet-demo/{ => scala-demo}/README.md  |  12 +-
 .../mxnet-demo/{ => scala-demo}/bin/demo.sh|   0
 .../mxnet-demo/{ => scala-demo}/bin/run_im.sh  |   0
 scala-package/mxnet-demo/{ => scala-demo}/pom.xml  |   0
 .../src/main/scala/sample/HelloWorld.scala |   0
 .../scala/sample/ImageClassificationExample.scala  |   0
 tests/tutorials/test_sanity_tutorials.py   |   1 +
 16 files changed, 301 insertions(+), 37 deletions(-)

diff --git a/docs/tutorials/scala/mxnet_java_install_and_run_examples.md 
b/docs/tutorials/scala/mxnet_java_install_and_run_examples.md
new file mode 100644
index 000..83e1ec5
--- /dev/null
+++ b/docs/tutorials/scala/mxnet_java_install_and_run_examples.md
@@ -0,0 +1,123 @@
+# Install and run Java Examples
+
+## Prerequisites:
+Please follow Step 1 in the [Scala 
configuration](http://mxnet.incubator.apache.org/install/scala_setup.html#setup-instructions).
These steps will help you install the correct Java version and all dependencies.
+
+## Run the Java example project
+We have provided a general MXNet Java template under 
`scala-package/mxnet-demo/java-demo` which contains the necessary project files 
for you to get started. It contains a simple Hello World equivalent program, 
`JavaSample.java`, and a full-fledged `ObjectDetection.java` that shows how to 
run object detection on images using MXNet and a pre-trained SSD model.
+
+Alternatively, you could build the project from scratch by following the 
instructions below.
+
+## Import and run the Java package
+For users using a desktop/laptop, we recommend using IntelliJ IDE as it is 
tested and supported to provide the necessary documentation for the Java API.
+
+Alternatively, users can follow the second instruction to set up an empty 
Maven project for Java.
+
+### IntelliJ instruction
+If you are using a computer with Ubuntu 16.04 or a Mac, you can install IntelliJ 
to run the Java package. Please follow the instructions below:
+
+1. Create a new Java project in IntelliJ. Fire up IntelliJ and click `Create 
New Project`.
+
+2. Click `Next`, and in the `Create project from template` window, do not 
select anything and click `Next` again.
+
+3. In the next window choose your `Project name` and the `Project location` 
and click on `Finish`.
+
+4. Let's add the Java Inference API jars that we build from source. At the top 
of the window, Go to the `File -> Project Structure`. In the popup window that 
opens up, click on `Libraries -> +` and select the path to the jar files 
downloaded. Click `Apply` and then click `OK`.
+
+6. Create a new Java class under the folder `your-project-name/src`. Let's 
call this class `JavaSample.java`. Type in the following code snippet and run 
it. In this code snippet, we create an NDArray object in Java and print its 
shape.
+```java
+import org.apache.mxnet.javaapi.Context;
+import org.apache.mxnet.javaapi.NDArray;
+
+public class JavaSample {
+public static void main(String[] args) {
+  System.out.println("Hello");
+  NDArray nd = NDArray.ones(Context.cpu(), new int[] {10, 20});
+
+  System.out.println("Shape of NDarray is : "  + nd.shape());
+}
+}
+```
+
+7. If all went well, you should see an output like this :
+```
+Hello
+SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
+SLF4J: Defaulting to no-operation (NOP) logger implementation
+SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
details.
+Shape of NDarray is : (10,20)
+Process finished with exit code 0
+```
+This means you have succes

[incubator-mxnet] branch master updated: fix Sphinx errors for tutorials and install ToCs (#12945)

2018-10-28 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 1555735  fix Sphinx errors for tutorials and install ToCs (#12945)
1555735 is described below

commit 1555735f7a1a250c98ee808eb207367fce1b6406
Author: Aaron Markham 
AuthorDate: Sun Oct 28 22:25:40 2018 -0700

fix Sphinx errors for tutorials and install ToCs (#12945)

* missing line break fix for tutorials toc

* fix the install index toc errors
---
 docs/install/index.md   | 18 ++
 docs/tutorials/index.md |  1 +
 2 files changed, 19 insertions(+)

diff --git a/docs/install/index.md b/docs/install/index.md
index 7ddbaa8..e53fe18 100644
--- a/docs/install/index.md
+++ b/docs/install/index.md
@@ -1,5 +1,23 @@
 # Installing MXNet
 
+```eval_rst
+.. toctree::
+   :hidden:
+
+   amazonlinux_setup.md
+   build_from_source.md
+   c_plus_plus.md
+   centos_setup.md
+   download.md
+   osx_setup.md
+   raspbian_setup.md
+   scala_setup.md
+   tx2_setup.md
+   ubuntu_setup.md
+   validate_mxnet.md
+   windows_setup.md
+```
+
 Indicate your preferred configuration. Then, follow the customized commands to 
install MXNet.
 
 
diff --git a/docs/tutorials/index.md b/docs/tutorials/index.md
index 07f32b5..7ad16d0 100644
--- a/docs/tutorials/index.md
+++ b/docs/tutorials/index.md
@@ -3,6 +3,7 @@
 ```eval_rst
 .. toctree::
:hidden:
+   
basic/index.md
c++/index.md
control_flow/index.md



[incubator-mxnet] branch java-api updated: First pass at adding JavaDocs for new java api classes (#12963)

2018-10-26 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new 743301c  First pass at adding JavaDocs for new java api classes 
(#12963)
743301c is described below

commit 743301ccfe53ae5bee7debc4b3486f080a45291f
Author: Andrew Ayres 
AuthorDate: Fri Oct 26 16:52:19 2018 -0700

First pass at adding JavaDocs for new java api classes (#12963)

* First pass at adding JavaDocs for new java api classes

* Fix a scalastyle issue

* Updating JavaDoc based on feedback
---
 .../scala/org/apache/mxnet/javaapi/Context.scala   |  12 ++
 .../main/scala/org/apache/mxnet/javaapi/IO.scala   |   8 +
 .../scala/org/apache/mxnet/javaapi/NDArray.scala   | 199 +
 .../mxnet/infer/javaapi/ObjectDetector.scala   |  16 +-
 .../org/apache/mxnet/infer/javaapi/Predictor.scala |  17 ++
 5 files changed, 251 insertions(+), 1 deletion(-)

diff --git 
a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/Context.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/Context.scala
index 2f4f3e6..ac3517b 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/Context.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/Context.scala
@@ -18,6 +18,13 @@ package org.apache.mxnet.javaapi
 
 import collection.JavaConverters._
 
+/**
+  * Constructing a context which is used to specify the device and device type 
that will
+  * be utilized by the engine.
+  *
+  * @param deviceTypeName {'cpu', 'gpu'} String representing the device type
+  * @param deviceId The device id of the device, needed for GPU
+  */
 class Context(val context: org.apache.mxnet.Context) {
 
   val deviceTypeid: Int = context.deviceTypeid
@@ -26,6 +33,11 @@ class Context(val context: org.apache.mxnet.Context) {
   = this(new org.apache.mxnet.Context(deviceTypeName, deviceId))
 
   def withScope[T](body: => T): T = context.withScope(body)
+
+  /**
+* Return device type of current context.
+* @return device_type
+*/
   def deviceType: String = context.deviceType
 
   override def toString: String = context.toString
diff --git 
a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/IO.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/IO.scala
index 47b1c36..bf961b2 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/IO.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/IO.scala
@@ -30,5 +30,13 @@ object DataDesc{
 
   implicit def toDataDesc(dataDesc: DataDesc): org.apache.mxnet.DataDesc = 
dataDesc.dataDesc
 
+  /**
+* Get the dimension that corresponds to the batch size.
+* @param layout layout string. For example, "NCHW".
+* @return An axis indicating the batch_size dimension. When 
data-parallelism is used,
+* the data will be automatically split and concatenate along the 
batch_size dimension.
+* Axis can be -1, which means the whole array will be copied
+* for each data-parallelism device.
+*/
   def getBatchAxis(layout: String): Int = 
org.apache.mxnet.DataDesc.getBatchAxis(Some(layout))
 }
diff --git 
a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
index 96119be..d4e67f7 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
@@ -29,27 +29,64 @@ object NDArray extends NDArrayBase {
 
   def waitall(): Unit = org.apache.mxnet.NDArray.waitall()
 
+  /**
+* One hot encoding indices into matrix out.
+* @param indices An NDArray containing indices of the categorical features.
+* @param out The result holder of the encoding.
+* @return Same as out.
+*/
   def onehotEncode(indices: NDArray, out: NDArray): NDArray
   = org.apache.mxnet.NDArray.onehotEncode(indices, out)
 
+  /**
+* Create an empty uninitialized new NDArray, with specified shape.
+*
+* @param shape shape of the NDArray.
+* @param ctx The context of the NDArray.
+*
+* @return The created NDArray.
+*/
   def empty(shape: Shape, ctx: Context, dtype: DType.DType): NDArray
   = org.apache.mxnet.NDArray.empty(shape, ctx, dtype)
   def empty(ctx: Context, shape: Array[Int]): NDArray
   = org.apache.mxnet.NDArray.empty(new Shape(shape), ctx)
   def empty(ctx : Context, shape : java.util.List[java.lang.Integer]) : NDArray
   = org.apache.mxnet.NDArray.empty(new Shape(shape), ctx)
+
+  /**
+* Create a new NDArray filled with 0, with specified shape.
+*
+* @param shape shape of the NDArray.
+* @param ctx The context of the NDArray.
+*
+* @return The created NDArray.
+*/
   def

[incubator-mxnet] branch java-api updated: [MXNET-984] Java NDArray Documentation Generation (#12835)

2018-10-26 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new 5aaa729  [MXNET-984] Java NDArray Documentation Generation (#12835)
5aaa729 is described below

commit 5aaa72998e180e56d4b21c90d8791928661754c3
Author: Lanking 
AuthorDate: Fri Oct 26 11:53:34 2018 -0700

[MXNET-984] Java NDArray Documentation Generation (#12835)

* cherry pick javaDoc changes

* update NDArray changes

* refactoring change and merge all docGen in a single place

* clean the scalastyle

* take on Piyush nit

* drop the comments
---
 .../scala/org/apache/mxnet/javaapi/NDArray.scala   |   2 +-
 .../scala/org/apache/mxnet/APIDocGenerator.scala   | 151 -
 .../apache/mxnet/javaapi/JavaNDArrayMacro.scala|   6 +-
 .../org/apache/mxnet/utils/CToScalaUtils.scala |   9 +-
 4 files changed, 124 insertions(+), 44 deletions(-)

diff --git 
a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
index c77b440..96119be 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/javaapi/NDArray.scala
@@ -22,7 +22,7 @@ import org.apache.mxnet.javaapi.DType.DType
 import collection.JavaConverters._
 
 @AddJNDArrayAPIs(false)
-object NDArray {
+object NDArray extends NDArrayBase {
   implicit def fromNDArray(nd: org.apache.mxnet.NDArray): NDArray = new 
NDArray(nd)
 
   implicit def toNDArray(jnd: NDArray): org.apache.mxnet.NDArray = jnd.nd
diff --git 
a/scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala 
b/scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
index b4efa65..44d47a2 100644
--- a/scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
+++ b/scala-package/macros/src/main/scala/org/apache/mxnet/APIDocGenerator.scala
@@ -42,6 +42,8 @@ private[mxnet] object APIDocGenerator{
 hashCollector += absClassGen(FILE_PATH, false)
 hashCollector += nonTypeSafeClassGen(FILE_PATH, true)
 hashCollector += nonTypeSafeClassGen(FILE_PATH, false)
+// Generate Java API documentation
+hashCollector += javaClassGen(FILE_PATH + "javaapi/")
 val finalHash = hashCollector.mkString("\n")
   }
 
@@ -52,8 +54,45 @@ private[mxnet] object APIDocGenerator{
 org.apache.commons.codec.binary.Base64.encodeBase64URLSafeString(digest)
   }
 
-  def absClassGen(FILE_PATH : String, isSymbol : Boolean) : String = {
-// scalastyle:off
+  def fileGen(filePath : String, packageName : String, packageDef : String,
+  absFuncs : List[String]) : String = {
+val apacheLicense =
+  """/*
+|* Licensed to the Apache Software Foundation (ASF) under one or more
+|* contributor license agreements.  See the NOTICE file distributed 
with
+|* this work for additional information regarding copyright ownership.
+|* The ASF licenses this file to You under the Apache License, Version 
2.0
+|* (the "License"); you may not use this file except in compliance with
+|* the License.  You may obtain a copy of the License at
+|*
+|*http://www.apache.org/licenses/LICENSE-2.0
+|*
+|* Unless required by applicable law or agreed to in writing, software
+|* distributed under the License is distributed on an "AS IS" BASIS,
+|* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 
implied.
+|* See the License for the specific language governing permissions and
+|* limitations under the License.
+|*/
+|""".stripMargin
+val scalaStyle = "// scalastyle:off"
+val imports = "import org.apache.mxnet.annotation.Experimental"
+val absClassDef = s"abstract class $packageName"
+
+val finalStr =
+  s"""$apacheLicense
+ |$scalaStyle
+ |$packageDef
+ |$imports
+ |$absClassDef {
+ |${absFuncs.mkString("\n")}
+ |}""".stripMargin
+val pw = new PrintWriter(new File(filePath + s"$packageName.scala"))
+pw.write(finalStr)
+pw.close()
+MD5Generator(finalStr)
+  }
+
+  def absClassGen(filePath : String, isSymbol : Boolean) : String = {
 val absClassFunctions = getSymbolNDArrayMethods(isSymbol)
 // Defines Operators that should not generated
 val notGenerated = Set("Custom")
@@ -66,19 +105,27 @@ private[mxnet] object APIDocGenerator{
   s"$scalaDoc\n$defBody"
 })
 val packageName = if (isSymbol) "SymbolAPIBase" else "NDArrayAPIBase&qu

[incubator-mxnet] branch java-api updated: Bumping down minimum java support from 8 to 7 (#12965)

2018-10-24 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new f759984  Bumping down minimum java support from 8 to 7 (#12965)
f759984 is described below

commit f7599841411001778d50e7e076a3413a1c3d7a18
Author: Piyush Ghai 
AuthorDate: Wed Oct 24 14:45:27 2018 -0700

Bumping down minimum java support from 8 to 7 (#12965)
---
 .../org/apache/mxnet/javaapi/ResourceScopeTestSuite.java | 12 +---
 scala-package/pom.xml|  4 ++--
 2 files changed, 11 insertions(+), 5 deletions(-)

diff --git 
a/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
 
b/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
index f570ba9..1c246d8 100644
--- 
a/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
+++ 
b/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
@@ -73,7 +73,9 @@ public class ResourceScopeTestSuite {
 }
 
 assertEquals(list.size() , 10);
-list.forEach(n -> assertTrue(n.verifyIsDisposed()));
+for (TestNDArray item : list) {
+assertTrue(item.verifyIsDisposed());
+}
 }
 
 @Test
@@ -87,7 +89,9 @@ public class ResourceScopeTestSuite {
 }
 
 assertEquals(stringToNDArrayMap.size(), 10);
-stringToNDArrayMap.forEach((key, value) ->  
assertTrue(value.verifyIsDisposed()));
+for (Map.Entry entry : 
stringToNDArrayMap.entrySet()) {
+assertTrue(entry.getValue().verifyIsDisposed());
+}
 
 Map ndArrayToStringMap = new HashMap<>();
 
@@ -98,7 +102,9 @@ public class ResourceScopeTestSuite {
 }
 
 assertEquals(ndArrayToStringMap.size(), 10);
-ndArrayToStringMap.forEach((key, value) ->  
assertTrue(key.verifyIsDisposed()));
+for (Map.Entry entry : 
ndArrayToStringMap.entrySet()) {
+assertTrue(entry.getKey().verifyIsDisposed());
+}
 
 }
 }
diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index eb3f6f0..9f7a498 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -190,8 +190,8 @@
 maven-compiler-plugin
 3.3
 
-  1.8
-  1.8
+  1.7
+  1.7
   UTF-8
 
   



[incubator-mxnet] branch java-api updated: Added unit tests for Resource Scope in Java (#12955)

2018-10-24 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch java-api
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/java-api by this push:
 new 58d4efb  Added unit tests for Resource Scope in Java (#12955)
58d4efb is described below

commit 58d4efbe06984b0fef23139b51171cbd56c45c06
Author: Piyush Ghai 
AuthorDate: Wed Oct 24 06:41:08 2018 -0700

Added unit tests for Resource Scope in Java (#12955)
---
 .../mxnet/javaapi/ResourceScopeTestSuite.java  | 104 +
 scala-package/pom.xml  |   4 +-
 2 files changed, 106 insertions(+), 2 deletions(-)

diff --git 
a/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
 
b/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
new file mode 100644
index 000..f570ba9
--- /dev/null
+++ 
b/scala-package/core/src/test/java/org/apache/mxnet/javaapi/ResourceScopeTestSuite.java
@@ -0,0 +1,104 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+
+package org.apache.mxnet.javaapi;
+
+import org.apache.mxnet.NativeResourceRef;
+import org.apache.mxnet.ResourceScope;
+import org.junit.Test;
+
+import java.util.*;
+import java.util.concurrent.Callable;
+
+import static org.junit.Assert.assertEquals;
+import static org.junit.Assert.assertTrue;
+
+public class ResourceScopeTestSuite {
+
+/**
+ * This is a placeholder class to test out whether NDArray References get 
collected or not when using
+ * try-with-resources in Java.
+ *
+ */
+class TestNDArray  {
+NDArray selfArray;
+
+public TestNDArray(Context context, int[] shape) {
+this.selfArray = NDArray.ones(context, shape);
+}
+
+public boolean verifyIsDisposed() {
+return this.selfArray.nd().isDisposed();
+}
+
+public NativeResourceRef getNDArrayReference() {
+return this.selfArray.nd().ref();
+}
+}
+
+@Test
+public void testNDArrayAutoRelease() {
+TestNDArray test = null;
+
+try (ResourceScope scope = new ResourceScope()) {
+test = new TestNDArray(Context.cpu(), new int[]{100, 100});
+}
+
+assertTrue(test.verifyIsDisposed());
+}
+
+@Test
+public void testObjectReleaseFromList() {
+List list = new ArrayList<>();
+
+try (ResourceScope scope = new ResourceScope()) {
+for (int i = 0;i < 10; i++) {
+list.add(new TestNDArray(Context.cpu(), new int[] {100, 100}));
+}
+}
+
+assertEquals(list.size() , 10);
+list.forEach(n -> assertTrue(n.verifyIsDisposed()));
+}
+
+@Test
+public void testObjectReleaseFromMap() {
+Map stringToNDArrayMap = new HashMap<>();
+
+try (ResourceScope scope = new ResourceScope()) {
+for (int i = 0;i < 10; i++) {
+stringToNDArrayMap.put(String.valueOf(i),new 
TestNDArray(Context.cpu(), new int[] {i, i}));
+}
+}
+
+assertEquals(stringToNDArrayMap.size(), 10);
+stringToNDArrayMap.forEach((key, value) ->  
assertTrue(value.verifyIsDisposed()));
+
+Map ndArrayToStringMap = new HashMap<>();
+
+try (ResourceScope scope = new ResourceScope()) {
+for (int i = 0;i < 10; i++) {
+ndArrayToStringMap.put(new TestNDArray(Context.cpu(), new 
int[] {i, i}), String.valueOf(i));
+}
+}
+
+assertEquals(ndArrayToStringMap.size(), 10);
+ndArrayToStringMap.forEach((key, value) ->  
assertTrue(key.verifyIsDisposed()));
+
+}
+}
diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index fe78a62..eb3f6f0 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -190,8 +190,8 @@
 maven-compiler-plugin
 3.3
 
-  1.6
-  1.6
+  1.8
+  1.8
   UTF-8
 
   



[incubator-mxnet] branch master updated: Disabled flaky test: test_gluon_gpu.test_slice_batchnorm_reshape_batchnorm (#12768)

2018-10-23 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 7d0f7d6  Disabled flaky test: 
test_gluon_gpu.test_slice_batchnorm_reshape_batchnorm (#12768)
7d0f7d6 is described below

commit 7d0f7d623ffaff16935412865618593bf6146465
Author: Anton Chernov 
AuthorDate: Wed Oct 24 03:24:28 2018 +0200

Disabled flaky test: test_gluon_gpu.test_slice_batchnorm_reshape_batchnorm 
(#12768)
---
 tests/python/unittest/test_gluon.py | 1 +
 1 file changed, 1 insertion(+)

diff --git a/tests/python/unittest/test_gluon.py 
b/tests/python/unittest/test_gluon.py
index a6932d2..e8ef704 100644
--- a/tests/python/unittest/test_gluon.py
+++ b/tests/python/unittest/test_gluon.py
@@ -2007,6 +2007,7 @@ def test_reshape_batchnorm_reshape_batchnorm():
 
 
 @with_seed()
+@unittest.skip('Flaky test: 
https://github.com/apache/incubator-mxnet/issues/12767')
 def test_slice_batchnorm_reshape_batchnorm():
 class Net(gluon.HybridBlock):
 def __init__(self, shape, slice, **kwargs):



[incubator-mxnet] branch master updated: use ResourceScope in Model/Trainer/FeedForward.scala (#12882)

2018-10-23 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 6b4df85  use ResourceScope in Model/Trainer/FeedForward.scala (#12882)
6b4df85 is described below

commit 6b4df8576e373ff68b4fcd99ae6318ddb4b9ed12
Author: Naveen Swamy 
AuthorDate: Tue Oct 23 16:49:37 2018 -0700

use ResourceScope in Model/Trainer/FeedForward.scala (#12882)

* use ResourceScope in Model/Trainer/FeedForward.scala

* add a public moveToOuterScope method to move resources to an outer scope
if one exists (see the sketch below)

* fix memory leak in FeedForward.scala by making it a NativeResource and
disposing argParams and auxParams in the dispose() method
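
A minimal sketch of the moveToOuterScope pattern described above, using only the ResourceScope.getCurrentScope and moveToOuterScope calls visible in the initParams diff further down; the object and helper names (ParamScopeSketch, buildParam) are hypothetical:

import org.apache.mxnet.{NDArray, ResourceScope, Shape}

object ParamScopeSketch {
  // Temporaries created in the current scope are freed with it, while the
  // returned parameter NDArray is handed to the enclosing scope and kept alive.
  def buildParam(shape: Shape): NDArray = {
    val param = NDArray.zeros(shape)
    val curScope = ResourceScope.getCurrentScope()
    if (curScope.isDefined) curScope.get.moveToOuterScope(param)
    param
  }
}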
---
 .../main/scala/org/apache/mxnet/FeedForward.scala  | 152 +
 .../scala/org/apache/mxnet/NativeResource.scala|   8 +-
 .../scala/org/apache/mxnet/ResourceScope.scala |  35 +++--
 .../imclassification/TrainModel.scala  |  80 +--
 .../imclassification/util/Trainer.scala| 133 +-
 5 files changed, 230 insertions(+), 178 deletions(-)

diff --git 
a/scala-package/core/src/main/scala/org/apache/mxnet/FeedForward.scala 
b/scala-package/core/src/main/scala/org/apache/mxnet/FeedForward.scala
index 00a1450..2ed9d8c 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/FeedForward.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/FeedForward.scala
@@ -17,9 +17,10 @@
 
 package org.apache.mxnet
 
+import org.apache.mxnet.Base.CPtrAddress
 import org.apache.mxnet.io.NDArrayIter
 import org.apache.mxnet.optimizer.SGD
-import org.slf4j.{LoggerFactory, Logger}
+import org.slf4j.{Logger, LoggerFactory}
 
 import scala.collection.mutable.ListBuffer
 
@@ -55,7 +56,7 @@ class FeedForward private(
 argParams: Map[String, NDArray],
 auxParams: Map[String, NDArray],
 private val allowExtraParams: Boolean,
-val beginEpoch: Int) {
+val beginEpoch: Int) extends NativeResource {
 
   val logger: Logger = LoggerFactory.getLogger(classOf[FeedForward])
   private var argumentChecked = false
@@ -126,6 +127,8 @@ class FeedForward private(
   }
 
   // Initialize weight parameters and auxiliary states
+  // The NDArrays associated with _argParams and _auxParams are not disposed;
+  // instead they are passed to an outer scope if one is available.
   private def initParams(inputShapes: Map[String, Shape], overwrite: Boolean = 
false)
   : (IndexedSeq[String], IndexedSeq[String], IndexedSeq[String]) = {
 val (argShapes, _, auxShapes) = symbol.inferShape(inputShapes)
@@ -137,16 +140,26 @@ class FeedForward private(
 val paramNameShapes = (argNames zip argShapes).filter { case (name, _) =>
   paramNames.contains(name)
 }
-val argParams = paramNameShapes.map { case (name, shape) =>
-  (name, NDArray.zeros(shape))
+val argParams = paramNameShapes.map { case (name, shape) => {
+val param = NDArray.zeros(shape)
+val curScope = ResourceScope.getCurrentScope()
+if (curScope.isDefined) curScope.get.moveToOuterScope(param)
+(name, param)
+  }
 }.toMap
-val auxParams = (auxNames zip auxShapes).map { case (name, shape) =>
-  (name, NDArray.zeros(shape))
+
+val auxParams = (auxNames zip auxShapes).map { case (name, shape) => {
+val param = NDArray.zeros(shape)
+val curScope = ResourceScope.getCurrentScope()
+if (curScope.isDefined) curScope.get.moveToOuterScope(param)
+(name, param)
+  }
 }.toMap
 
 for ((k, v) <- argParams) {
   if (_argParams != null && _argParams.contains(k) && (!overwrite)) {
 argParams(k).set(_argParams(k))
+
   } else {
 initializer(k, v)
   }
@@ -277,13 +290,15 @@ class FeedForward private(
   def fit(trainData: DataIter, evalData: DataIter, evalMetric: EvalMetric, 
kvStoreType: String,
   epochEndCallback: EpochEndCallback, batchEndCallback: 
BatchEndCallback,
   logger: Logger, workLoadList: Seq[Float]): Unit = {
-// init params first to allow kv store use _argParams to decide its type
-initSymbolParams(trainData)
-// create kvstore
-val (kvStore, updateOnKVStore) = Model.createKVStore(kvStoreType, 
ctx.length, _argParams)
-fit(trainData, evalData, evalMetric, kvStore, updateOnKVStore,
-  epochEndCallback, batchEndCallback, logger, workLoadList)
-kvStore.foreach(_.dispose())
+ResourceScope.using() {
+  // init params first to allow kv store use _argParams to decide its type
+  initSymbolParams(trainData)
+  // create kvstore
+  val (kvStore, updateOnKVStore) = Model.createKVStore(kvStoreType, 
ctx.length, _argParams)
+  fit(trainData, evalData, evalMetric, kvStore, updateOnKVStore,
+epochEndCallback, batchEndCallback, logger, work

[incubator-mxnet] branch master updated: Ignore generated scala files. (#12928)

2018-10-23 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 0874677  Ignore generated scala files. (#12928)
0874677 is described below

commit 08746779d4a43580a11eeba606ffbc872030493d
Author: Frank Liu 
AuthorDate: Tue Oct 23 13:56:47 2018 -0700

Ignore generated scala files. (#12928)
---
 scala-package/.gitignore | 5 +
 1 file changed, 5 insertions(+)

diff --git a/scala-package/.gitignore b/scala-package/.gitignore
new file mode 100644
index 000..0f860e6
--- /dev/null
+++ b/scala-package/.gitignore
@@ -0,0 +1,5 @@
+.flattened-pom.xml
+core/src/main/scala/org/apache/mxnet/NDArrayAPIBase.scala
+core/src/main/scala/org/apache/mxnet/NDArrayBase.scala
+core/src/main/scala/org/apache/mxnet/SymbolAPIBase.scala
+core/src/main/scala/org/apache/mxnet/SymbolBase.scala



[incubator-mxnet] branch master updated: fix the paths issue for downloading script (#12913)

2018-10-22 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
 new 38e32bd  fix the paths issue for downloading script (#12913)
38e32bd is described below

commit 38e32bdf25beb9e624bcf9381cf441c095d09b44
Author: Lanking 
AuthorDate: Mon Oct 22 19:25:23 2018 -0700

fix the paths issue for downloading script (#12913)
---
 .../examples/scripts/infer/imageclassifier/get_resnet_18_data.sh  | 2 +-
 .../examples/scripts/infer/imageclassifier/get_resnet_data.sh | 8 
 .../examples/scripts/infer/objectdetector/get_ssd_data.sh | 2 +-
 3 files changed, 6 insertions(+), 6 deletions(-)

diff --git 
a/scala-package/examples/scripts/infer/imageclassifier/get_resnet_18_data.sh 
b/scala-package/examples/scripts/infer/imageclassifier/get_resnet_18_data.sh
index 4ba9fd5..1ce996e 100755
--- a/scala-package/examples/scripts/infer/imageclassifier/get_resnet_18_data.sh
+++ b/scala-package/examples/scripts/infer/imageclassifier/get_resnet_18_data.sh
@@ -37,5 +37,5 @@ if [ ! -f "$data_path" ]; then
   wget 
https://s3.us-east-2.amazonaws.com/scala-infer-models/resnet-18/resnet-18-symbol.json
 -P $data_path
   wget 
https://s3.us-east-2.amazonaws.com/scala-infer-models/resnet-18/resnet-18-.params
 -P $data_path
   wget 
https://s3.us-east-2.amazonaws.com/scala-infer-models/resnet-18/synset.txt -P 
$data_path
-  wget https://s3.amazonaws.com/model-server/inputs/kitten.jpg -P $image_path
+  wget 
https://s3.us-east-2.amazonaws.com/mxnet-scala/scala-example-ci/resnet152/kitten.jpg
 -P $image_path
 fi
diff --git 
a/scala-package/examples/scripts/infer/imageclassifier/get_resnet_data.sh 
b/scala-package/examples/scripts/infer/imageclassifier/get_resnet_data.sh
index b68e2f3..6fd85e4 100755
--- a/scala-package/examples/scripts/infer/imageclassifier/get_resnet_data.sh
+++ b/scala-package/examples/scripts/infer/imageclassifier/get_resnet_data.sh
@@ -34,8 +34,8 @@ if [ ! -d "$image_path" ]; then
 fi
 
 if [ ! -f "$data_path" ]; then
-  wget 
http://data.mxnet.io/models/imagenet-11k/resnet-152/resnet-152-.params -P 
$data_path
-  wget 
http://data.mxnet.io/models/imagenet-11k/resnet-152/resnet-152-symbol.json -P 
$data_path
-  wget http://data.mxnet.io/models/imagenet-11k/synset.txt -P $data_path
-  wget https://s3.amazonaws.com/model-server/inputs/kitten.jpg -P $image_path
+  wget 
https://s3.us-east-2.amazonaws.com/mxnet-scala/scala-example-ci/resnet152/resnet-152-.params
 -P $data_path
+  wget 
https://s3.us-east-2.amazonaws.com/mxnet-scala/scala-example-ci/resnet152/resnet-152-symbol.json
 -P $data_path
+  wget 
https://s3.us-east-2.amazonaws.com/mxnet-scala/scala-example-ci/resnet152/synset.txt
 -P $data_path
+  wget 
https://s3.us-east-2.amazonaws.com/mxnet-scala/scala-example-ci/resnet152/kitten.jpg
 -P $image_path
 fi
diff --git 
a/scala-package/examples/scripts/infer/objectdetector/get_ssd_data.sh 
b/scala-package/examples/scripts/infer/objectdetector/get_ssd_data.sh
index ab231d4..8787d63 100755
--- a/scala-package/examples/scripts/infer/objectdetector/get_ssd_data.sh
+++ b/scala-package/examples/scripts/infer/objectdetector/get_ssd_data.sh
@@ -37,7 +37,7 @@ fi
 if [ ! -f "$data_path" ]; then
 wget 
https://s3.amazonaws.com/model-server/models/resnet50_ssd/resnet50_ssd_model-symbol.json
 -P $data_path
 wget 
https://s3.amazonaws.com/model-server/models/resnet50_ssd/resnet50_ssd_model-.params
 -P $data_path
-wget 
https://raw.githubusercontent.com/awslabs/mxnet-model-server/master/examples/ssd/synset.txt
 -P $data_path
+wget https://s3.amazonaws.com/model-server/models/resnet50_ssd/synset.txt 
-P $data_path
 cd $image_path
 wget 
https://cloud.githubusercontent.com/assets/3307514/20012566/cbb53c76-a27d-11e6-9aaa-91939c9a1cd5.jpg
 -O 01.jpg
 wget 
https://cloud.githubusercontent.com/assets/3307514/20012567/cbb60336-a27d-11e6-93ff-cbc3f09f5c9e.jpg
 -O dog.jpg



[incubator-mxnet] branch master updated (5b86701 -> 3c81b3f)

2018-10-22 Thread nswamy
This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git.


from 5b86701  [MXNET-793] ★ Virtualized testing in CI with QEMU ★ (#12094)
 add 3c81b3f  [MXNET-1017] Updating the readme file for cpp-package and 
adding readme file for example directory. (#12773)

No new revisions were added by this update.

Summary of changes:
 cpp-package/README.md |  49 ++-
 cpp-package/example/README.md | 106 ++
 2 files changed, 143 insertions(+), 12 deletions(-)
 create mode 100644 cpp-package/example/README.md


