Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

2020-02-28 Thread JackieWu
I think we should keep the ONNX APIs, since they can already export many basic models, even though the export coverage is not perfect. Users will train their models in MXNet 2.0, export them to ONNX, and then use the ONNX models in their deployment frameworks (http://onnx.ai/supported-tools).

ONNX support is also useful for attracting users to train their models with MXNet 2.0.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-592878029

Re: [apache/incubator-mxnet] [RFC] Custom Operator Part 2 (#17006)

2019-12-28 Thread JackieWu
@larroy Users may need matrix operators and DNN ops (e.g. ReLU, Conv) when writing a custom op. Although they can implement these with third-party libraries, it is more convenient to use the built-in functions in MXNet.

-- 
https://github.com/apache/incubator-mxnet/issues/17006#issuecomment-569467244

Re: [apache/incubator-mxnet] [RFC] Custom Operator Part 2 (#17006)

2019-12-07 Thread JackieWu
Hi @samskalicky , thank you for the contribution!
I have several suggestions.

- Custom GPU operators
  1. Provide a CUDA stream in `OpResource`.
  2. Share the same function on CPU and GPU: users can discriminate the
  context via `MXTensor::dltensor::ctx`.
- Call framework-specific math helpers
  This is important for a custom operator: users may call gemm, or even a
  convolution op, inside a custom op.

Thanks.

-- 
https://github.com/apache/incubator-mxnet/issues/17006#issuecomment-562898682

Re: [apache/incubator-mxnet] [RFC] Introducing NumPy-compatible coding experience into MXNet (#14253)

2019-08-28 Thread JackieWu
Reopened #14253.

-- 
https://github.com/apache/incubator-mxnet/issues/14253#event-2592397494

Re: [apache/incubator-mxnet] [RFC] Introducing NumPy-compatible coding experience into MXNet (#14253)

2019-08-28 Thread JackieWu
Hi @reminisce , I tried to pass a numpy-compatible array into a legacy operator, and it raised this error.

```python
>>> import mxnet.numpy as np
>>> import mxnet as mx
>>> a = np.array([1,2])
>>> b = np.array([3,4])
>>> mx.nd.broadcast_add(a,b)
Traceback (most recent call last):
  File "", line 1, in 
  File "", line 56, in broadcast_add
  File "/home/wkcn/proj/mxnet/python/mxnet/ndarray/register.py", line 99, in _verify_all_legacy_ndarrays
    .format(op_name, func_name))
TypeError: Operator `broadcast_add` registered in backend is known as 
`broadcast_add` in Python. This is a legacy operator which can only accept 
legacy ndarrays, while received an MXNet numpy ndarray. Please call 
`as_nd_ndarray()` upon the numpy ndarray to convert it to a legacy ndarray, and 
then feed the converted array to this operator.
```

I hope that the legacy operator is the subset of 

-- 
https://github.com/apache/incubator-mxnet/issues/14253#issuecomment-525985724

Re: [apache/incubator-mxnet] [RFC] Introducing NumPy-compatible coding experience into MXNet (#14253)

2019-08-28 Thread JackieWu
Closed #14253.

-- 
https://github.com/apache/incubator-mxnet/issues/14253#event-2592397357