chowkamlee81 opened a new issue #9179: Sample example codes to work on ConvLSTM
and ConvGRU based on 2D images..
URL: https://github.com/apache/incubator-mxnet/issues/9179
I would like to work on spatio-temporal mechanisms using ConvLSTM/ConvGRU.
I would like to know of any sample example code
taylover-pei opened a new issue #9178: How to enhance the batch size?
URL: https://github.com/apache/incubator-mxnet/issues/9178
I downloaded the code and ran it on all 4 GPUs (1080), but the batch size
could only be 10. So what's the problem?
---
zhaoningning commented on issue #9156: float64 data backward error using gluon
URL:
https://github.com/apache/incubator-mxnet/issues/9156#issuecomment-353525320
@sxjscience I have already cast all data to float64, so the forward pass is OK, but
the backward pass gives an error.
I have to use float64 b
eric-haibin-lin opened a new issue #9177: Support standard optimizer with
sparse gradient
URL: https://github.com/apache/incubator-mxnet/issues/9177
Per @mg0880gm's request:
Operators such as dot and sparse_embedding generate row_sparse gradients;
one can use SGD with momentum or a
RogerChern commented on issue #8884: forward can't run parallelly using
multi-gpus when custom operator using numpy
URL:
https://github.com/apache/incubator-mxnet/issues/8884#issuecomment-353524459
FYI. A simple workaround for loss-type CustomOp is to comment out all
calculations in for
eric-haibin-lin commented on issue #8822: Can mx.nd.where(condition, x, y)
supports if both x and y are None?
URL:
https://github.com/apache/incubator-mxnet/issues/8822#issuecomment-353523262
@reminisce why a new storage type is required? What's the implication of
that on existing operato
eric-haibin-lin commented on a change in pull request #8732: rsp push and rsp
pull for comm device, used in kvstore('device')
URL: https://github.com/apache/incubator-mxnet/pull/8732#discussion_r158428472
##
File path: tests/python/gpu/test_kvstore_gpu.py
##
@@ -26,44 +26,
eric-haibin-lin opened a new issue #9176: row_sparse ndarray + 0 should not
return dense ndarray
URL: https://github.com/apache/incubator-mxnet/issues/9176
Currently `nd._plus_scalar(row_sparse, 0)` and `nd._minus_scalar(row_sparse,
0)` return dense NDArray. Instead it can return a row-spa
helloworldlxb commented on issue #9032: Why does a tanh activation layer
generates values greater than 1?
URL:
https://github.com/apache/incubator-mxnet/issues/9032#issuecomment-353520009
@reminisce
This is the definition of the network. I defined a new layer, so it may be
long. Thanks.
```
class DetectionLoss(mx.operator.NumpyOp):
    def __init__(self):
```
reminisce commented on issue #9032: Why does a tanh activation layer generates
values greater than 1?
URL:
https://github.com/apache/incubator-mxnet/issues/9032#issuecomment-353517302
@helloworldlxb Could you please provide a script?
@chowkamlee81 I believe your issue has been resolved,
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158421337
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158421815
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158419876
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158419895
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158420860
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158420006
##
File path: docs/faq/index.md
##
@@ -15,7 +15,9 @@ and full working examples, visit the [tutorials
sect
pracheer commented on a change in pull request #9152: tutorial for distributed
training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158422315
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet support
bhavinthaker commented on a change in pull request #9169: Make versions of
python dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#discussion_r158417058
##
File path: python/setup.py
##
@@ -28,7 +28,7 @@
else:
from setu
Feywell closed issue #9162: fatal error: cub/cub.cuh: No such file or directory
URL: https://github.com/apache/incubator-mxnet/issues/9162
This is an automated message from the Apache Git Service.
To respond to the message, p
Feywell commented on issue #9162: fatal error: cub/cub.cuh: No such file or
directory
URL:
https://github.com/apache/incubator-mxnet/issues/9162#issuecomment-353503178
@sxjscience Thank you!
It works for me.
sxjscience commented on issue #9162: fatal error: cub/cub.cuh: No such file or
directory
URL:
https://github.com/apache/incubator-mxnet/issues/9162#issuecomment-353501066
@Feywell
```
cd mxnet
git submodule update --init
```
You can also try the following command if that does not work:
```
cd mxnet/3rdparty
git submodule add https://github.com/dmlc/cub cub
```
---
sxjscience opened a new issue #9175: Concat does not support negative axis
URL: https://github.com/apache/incubator-mxnet/issues/9175
Need to refactor concat using the same logic as stack
https://github.com/apache/incubator-mxnet/blob/master/src/operator/tensor/matrix_op.cc#L693-L731
-
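Until the refactor lands, the usual workaround is to map the negative axis to its positive equivalent by hand before calling Concat. A minimal sketch of that logic (the helper name is my own, not an MXNet API):

```python
def normalize_axis(axis, ndim):
    """Map a possibly-negative axis (e.g. -1) to its non-negative equivalent."""
    if axis < 0:
        axis += ndim
    if not 0 <= axis < ndim:
        raise ValueError("axis out of range for ndim=%d" % ndim)
    return axis

# -1 on a 3-d tensor refers to the last axis, i.e. axis 2
print(normalize_axis(-1, 3))
```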
Feywell commented on issue #9162: fatal error: cub/cub.cuh: No such file or
directory
URL:
https://github.com/apache/incubator-mxnet/issues/9162#issuecomment-353497852
@reminisce Thank you! Can you tell me the details?
I just cloned the code yesterday. Why does it not work?
So I don't kn
lupesko commented on issue #9173: updated master branch with 1.0.0 release on
the home page
URL: https://github.com/apache/incubator-mxnet/pull/9173#issuecomment-353493968
Looks good!
Can we have another Issue/PR to load all assets from this repo and not
dmlc/web-data ?
---
gautamkmr opened a new pull request #9174: Adding test for forward backward
compatibility
URL: https://github.com/apache/incubator-mxnet/pull/9174
## Description ##
(Brief description on what this PR is about)
## Checklist ##
### Essentials ###
- [ ] Passed code style checki
thinksanky commented on issue #9173: updated master branch with 1.0.0 release
on the home page
URL: https://github.com/apache/incubator-mxnet/pull/9173#issuecomment-353492115
Also, CSS changes done to fix the version picker issue.
---
thinksanky opened a new pull request #9173: updated master branch with 1.0.0
release on the home page
URL: https://github.com/apache/incubator-mxnet/pull/9173
## Description ##
* Updated the home page center section to 1.0.0 release.
* Sanity tests done locally.
* Background image
sxjscience commented on issue #9172: Wrong gradient of gather_nd when the
indices have duplicates
URL:
https://github.com/apache/incubator-mxnet/issues/9172#issuecomment-353487847
Directly call `+=` if OpenMP is not used. If OpenMP is used, we can use the
atomic support of OpenMP: `#pragma omp atomic`.
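For illustration, a NumPy sketch of the duplicate-index problem being discussed (my own example, not the MXNet kernel): a fancy-indexed `+=` silently drops duplicate contributions, while an unbuffered scatter-add accumulates them correctly.

```python
import numpy as np

indices = np.array([0, 0, 2])   # note the duplicate index 0
updates = np.ones(3)

# Buggy pattern: buffered fancy-indexed += writes each duplicated slot only once
naive = np.zeros(3)
naive[indices] += updates       # index 0 receives 1.0 instead of 2.0

# Correct pattern: unbuffered scatter-add accumulates duplicates
correct = np.zeros(3)
np.add.at(correct, indices, updates)

print(naive)     # [1. 0. 1.]
print(correct)   # [2. 0. 1.]
```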
reminisce commented on issue #9172: Wrong gradient of gather_nd when the
indices have duplicates
URL:
https://github.com/apache/incubator-mxnet/issues/9172#issuecomment-353487649
What is the CPU version of atomicAdd?
szha commented on issue #9171: MXNet: Using FusedRNNCell with its
"bidirectional" flag turned True, can lead to hanging of training run.
URL:
https://github.com/apache/incubator-mxnet/issues/9171#issuecomment-353487198
Could you provide runnable code snippet that reproduces the hanging pro
szha commented on issue #9171: MXNet: Using FusedRNNCell with its
"bidirectional" flag turned True, can lead to hanging of training run.
URL:
https://github.com/apache/incubator-mxnet/issues/9171#issuecomment-353487078
What's the patch version for cudnn? Would you confirm if the hanging st
sxjscience commented on issue #9172: Wrong gradient of gather_nd when the
indices have duplicates
URL:
https://github.com/apache/incubator-mxnet/issues/9172#issuecomment-353486874
We could either use atomicAdd or call the backward of take. atomicAdd seems
to be simpler.
kalpitdixit commented on issue #9171: MXNet: Using FusedRNNCell with its
"bidirectional" flag turned True, can lead to hanging of training run.
URL:
https://github.com/apache/incubator-mxnet/issues/9171#issuecomment-353485727
## Want (but leads to hanging)
cell = FusedRNNCell( bidir
sxjscience opened a new issue #9172: Wrong gradient of gather_nd when the
indices have duplicates
URL: https://github.com/apache/incubator-mxnet/issues/9172
This issue is borrowed from https://discuss.gluon.ai/t/topic/3389.
I found that the cause is a bug in the gradient computation of `ga
kalpitdixit commented on issue #9171: MXNet: Using FusedRNNCell with its
"bidirectional" flag turned True, can lead to hanging of training run.
URL:
https://github.com/apache/incubator-mxnet/issues/9171#issuecomment-353484524
I am using:
MXNet==1.0.0
CUDA==9.0
cuDNN==7.0
As
kalpitdixit opened a new issue #9171: MXNet: Using FusedRNNCell with its
"bidirectional" flag turned True, can lead to hanging of training run.
URL: https://github.com/apache/incubator-mxnet/issues/9171
## Description
MXNet
Using FusedRNNCell with its "bidirectional" flag turned True,
nswamy commented on a change in pull request #9169: Make versions of python
dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#discussion_r158401307
##
File path: python/setup.py
##
@@ -28,7 +28,7 @@
else:
from setuptools
mbaijal opened a new pull request #9170: [Cherry-picked PR 8876 from v1.0.0]
Merge License Updates from v1.0.0 to master
URL: https://github.com/apache/incubator-mxnet/pull/9170
## Description ##
Copy of [PR 8876 from
v1.0.0](https://github.com/apache/incubator-mxnet/pull/8876)
Remo
eric-haibin-lin closed issue #8660: Incorrect autograd results for elemwise_add
URL: https://github.com/apache/incubator-mxnet/issues/8660
eric-haibin-lin commented on issue #8660: Incorrect autograd results for
elemwise_add
URL:
https://github.com/apache/incubator-mxnet/issues/8660#issuecomment-353482142
Possibly fixed by recent PRs. Closing it for now.
sxjscience commented on issue #8660: Incorrect autograd results for
elemwise_add
URL:
https://github.com/apache/incubator-mxnet/issues/8660#issuecomment-353481476
@eric-haibin-lin I've tried the scripts using the latest version again and
there is no error now.
sxjscience closed issue #8687: Autograd y = x and y = 1*x gives different
gradient
URL: https://github.com/apache/incubator-mxnet/issues/8687
sxjscience commented on issue #8687: Autograd y = x and y = 1*x gives different
gradient
URL:
https://github.com/apache/incubator-mxnet/issues/8687#issuecomment-353480416
Fixed now
sandeep-krishnamurthy commented on issue #9169: Make versions of python
dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#issuecomment-353479493
@szha - Yes. @jesterhazy can add more info here.
-
sandeep-krishnamurthy commented on a change in pull request #9169: Make
versions of python dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#discussion_r158397924
##
File path: python/setup.py
##
@@ -28,7 +28,7 @@
else:
eric-haibin-lin commented on a change in pull request #9152: tutorial for
distributed training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158397066
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet
eric-haibin-lin commented on a change in pull request #9152: tutorial for
distributed training
URL: https://github.com/apache/incubator-mxnet/pull/9152#discussion_r158397519
##
File path: docs/faq/distributed_training.md
##
@@ -0,0 +1,286 @@
+# Distributed training
+MXNet
This is an automated email from the ASF dual-hosted git repository.
zhasheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/master by this push:
new c6cdf51 Merging version changes from
szha closed pull request #9168: Merging version changes from 1.0.0 to master
URL: https://github.com/apache/incubator-mxnet/pull/9168
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:
As this is a fo
eric-haibin-lin commented on a change in pull request #9151: removes python
path insert of tests folder for examples
URL: https://github.com/apache/incubator-mxnet/pull/9151#discussion_r158396723
##
File path: python/mxnet/test_utils.py
##
@@ -1441,6 +1441,74 @@ def read_d
szha commented on issue #9169: Make versions of python dependencies like numpy,
deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#issuecomment-353477956
Thanks for bringing this up. I think it's a good practice to specify tested
versions. Has the latest numpy version
jesterhazy commented on a change in pull request #9169: Make versions of python
dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169#discussion_r158396216
##
File path: python/setup.py
##
@@ -28,7 +28,7 @@
else:
from setupt
sandeep-krishnamurthy opened a new pull request #9169: Make versions of python
dependencies like numpy, deterministic.
URL: https://github.com/apache/incubator-mxnet/pull/9169
## Description ##
Dependencies listed in python setup.py did not have version numbers. It was
always pullin
mbaijal opened a new pull request #9168: Merging version changes from 1.0.0 to
master
URL: https://github.com/apache/incubator-mxnet/pull/9168
## Description ##
Updated NEWS.md, README.md and tags in a couple of files.
## Checklist ##
### Essentials ###
- [ ] Passed code st
javelinjs commented on issue #9129: add tests for distribution generators
URL: https://github.com/apache/incubator-mxnet/pull/9129#issuecomment-353462615
changes merged to https://github.com/apache/incubator-mxnet/pull/9119
larroy commented on issue #8874: mxnet installation from source: C++ linkage
error on HPC
URL:
https://github.com/apache/incubator-mxnet/issues/8874#issuecomment-353462435
MXNet should be portable. It could be the version of OpenBLAS that you are
using; how did you install it?
Could y
thinksanky opened a new pull request #40: fixed all references to the
compressed background image
URL: https://github.com/apache/incubator-mxnet-site/pull/40
## Description ##
Background image size was compressed from 933K to 106K to improve the
loading performance.
Note that since t
GSanchis commented on issue #8669: module.Module and CSVIter
URL:
https://github.com/apache/incubator-mxnet/issues/8669#issuecomment-353454088
I have been diving quite deeply into the code (I even took a look at the c++
code pointed to by the error), but I haven't been able to find anythin
sxjscience opened a new issue #9167: Gluon raises error if the user does not
call nd.waitall()
URL: https://github.com/apache/incubator-mxnet/issues/9167
The following program will raise an error if I directly run it as `python
simple_program.py`
```python
import mxnet as mx
from
```
eric-haibin-lin commented on issue #8669: module.Module and CSVIter
URL:
https://github.com/apache/incubator-mxnet/issues/8669#issuecomment-353442898
This error indicates that the input to your `sparse_embedding` operator
contains data greater than or equal to `input_dim`. Any chance the data f
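A quick way to verify this on the input indices (my own sketch with hypothetical names, shown with NumPy):

```python
import numpy as np

input_dim = 1000                 # hypothetical vocabulary size of the embedding
idx = np.array([3, 17, 999])     # hypothetical batch of lookup indices

# sparse_embedding requires 0 <= idx < input_dim for every entry
out_of_range = (idx < 0) | (idx >= input_dim)
print(out_of_range.any())        # True would mean at least one invalid index
</antml>```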
tsutton opened a new issue #9166: Segfault on ndarray with negative dimension
i.e. mxnet.nx.zeros((-1,))
URL: https://github.com/apache/incubator-mxnet/issues/9166
## Description
When trying to create an ndarray with a negative size in some dimension, I
get a segmentation fault or bad_a
sxjscience commented on issue #9156: float64 data backward error using gluon
URL:
https://github.com/apache/incubator-mxnet/issues/9156#issuecomment-353435453
@zhaoningning You can try to explicitly set the dtype of all the ndarray
weights/biases to float64. Also, would float64 be a must?
reminisce commented on issue #9162: fatal error: cub/cub.cuh: No such file or
directory
URL:
https://github.com/apache/incubator-mxnet/issues/9162#issuecomment-353432913
`cub` has been moved to the `3rdparty` folder. Update your submodule and try
again.
reminisce commented on issue #9160: How to convert data in range -m to +n to
data in range 0 to1? Is there are any python API
URL:
https://github.com/apache/incubator-mxnet/issues/9160#issuecomment-353432704
Just use element-wise ops. Suppose `x` is the tensor you want to convert,
and the
chaoyuaw commented on issue #9165: add embedding learning example
URL: https://github.com/apache/incubator-mxnet/pull/9165#issuecomment-353432566
Just sent a PR to web-data for the image
(https://github.com/dmlc/web-data/pull/40).
Will update the path and remove the image here once that
sxjscience commented on issue #9129: add tests for distribution generators
URL: https://github.com/apache/incubator-mxnet/pull/9129#issuecomment-353424186
@marcoabreu I've removed the unnecessary prints. Need @javelinjs to merge
this in.
piiswrong commented on issue #9164: No acceleration on 1.0.0 example with
USE_NCCL=1
URL:
https://github.com/apache/incubator-mxnet/issues/9164#issuecomment-353421859
nccl is only faster in some cases, usually when batch size is small
--
chaoyuaw opened a new pull request #9165: add embedding learning example
URL: https://github.com/apache/incubator-mxnet/pull/9165
## Description ##
Add gluon example for embedding learning.
## Checklist ##
### Essentials ###
- [v] Passed code style checking (`make lint`)
-
anjishnu commented on issue #9111: added SELU and ELU activation functions
URL: https://github.com/apache/incubator-mxnet/pull/9111#issuecomment-353379492
I'll just update the PR to be hybridblocks and include Swish.
This is an automated email from the ASF dual-hosted git repository.
qkou pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/master by this push:
new ddec3cc R RNN API fixes and Optimizer cl
thirdwing closed pull request #9022: R RNN API fixes and Optimizer clip
gradient on NDArray
URL: https://github.com/apache/incubator-mxnet/pull/9022
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:
meissnereric commented on issue #8846: Batching improvements for GEMM/TRSM
operators and full MKL usage docs.
URL: https://github.com/apache/incubator-mxnet/pull/8846#issuecomment-353378031
Hey, this was rebased against master just before pushing. I'm not sure why
the pr-merge isn't buildi
kobenaxie commented on issue #9111: added SELU and ELU activation functions
URL: https://github.com/apache/incubator-mxnet/pull/9111#issuecomment-353360202
Swish like this ? @chinakook @anjishnu
```
import mxnet.gluon as gluon
from mxnet import nd

class Swish(gluon.HybridBlock):
    def hybrid_forward(self, F, x):
        # Swish activation: x * sigmoid(x)
        return x * F.sigmoid(x)
```
beeva-enriqueotero opened a new issue #9164: No acceleration on 1.0.0 example
with USE_NCCL=1
URL: https://github.com/apache/incubator-mxnet/issues/9164
I'm performing the image classification fine-tuning example on 4 V100 on a
p3.8x:
```
python fine-tune.py --pretrained-model imagen
```
GSanchis commented on issue #8669: module.Module and CSVIter
URL:
https://github.com/apache/incubator-mxnet/issues/8669#issuecomment-353340992
So... progressing on this, I did get this to pad, and the GPU problems were
solved by specifying the context in the mx.nd.array's for l,c,v. Code a
larroy commented on issue #9122: [WIP] refactor graph exec
URL: https://github.com/apache/incubator-mxnet/pull/9122#issuecomment-353329302
I can measure. It is exactly the same algorithm, just easier to read since it
uses well-known STL algorithms.
conansherry opened a new issue #9163: how to reshape ouput during forward like
caffe?
URL: https://github.com/apache/incubator-mxnet/issues/9163
like faster-rcnn proposal_target_layer.py:
    def forward(self, bottom, top):
        ...
        # sampled rois
        top
Feywell opened a new issue #9162: fatal error: cub/cub.cuh: No such file or
directory
URL: https://github.com/apache/incubator-mxnet/issues/9162
## Description
I encountered an error when installing from source, as follows:
`In file included from src/operator/nn/convolution.cu:33:0:
src/o
reminisce commented on issue #8822: Can mx.nd.where(condition, x, y) supports
if both x and y are None?
URL:
https://github.com/apache/incubator-mxnet/issues/8822#issuecomment-353288865
More and more people are asking for this feature. I think we should start
considering adding this to MX