leleamol commented on issue #15166: asnumpy() fails on float16 gradient

URL: https://github.com/apache/incubator-mxnet/issues/15166#issuecomment-500076301

> @leleamol This is my bad... I was attempting to post an issue that got at the root cause, but I now realize I posted something with a different cause, which, as you have pointed out, is not a bug.
>
> Here is an MRE and stack trace for the actual issue I am facing. I can post it as a separate issue and close this one if desired.
>
> ## MRE
>
> ```python
> import mxnet as mx
> from mxnet.gluon.rnn import LSTM
>
> fake_data = mx.nd.random.uniform(shape=(1, 32, 32), dtype="float16").as_in_context(mx.gpu(0))
> fake_label = mx.nd.random.uniform(shape=(1, 32), dtype="float16").as_in_context(mx.gpu(0))
>
> lstm_layer = LSTM(32, dtype='float16')
> lstm_layer.initialize(ctx=mx.gpu(0))
>
> ctc_loss = mx.gluon.loss.CTCLoss()
>
> with mx.autograd.record():
>     x = lstm_layer(fake_data)
>     loss = ctc_loss(x, fake_label)
>
> loss.backward()
> l = mx.nd.mean(loss).asnumpy()
> ```
>
> ## Stack trace
>
> ```
> ---------------------------------------------------------------------------
> MXNetError                                Traceback (most recent call last)
> <ipython-input-69-6bfc070aed48> in <module>()
>      13
>      14 loss.backward()
> ---> 15 l = mx.nd.mean(loss).asnumpy()
>
> ~/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/ndarray/ndarray.py in asnumpy(self)
>    1994             self.handle,
>    1995             data.ctypes.data_as(ctypes.c_void_p),
> -> 1996             ctypes.c_size_t(data.size)))
>    1997         return data
>    1998
>
> ~/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/base.py in check_call(ret)
>     251     """
>     252     if ret != 0:
> --> 253         raise MXNetError(py_str(_LIB.MXGetLastError()))
>     254
>     255
>
> MXNetError: [23:39:12] include/mxnet/././tensor_blob.h:236: Check failed: mshadow::DataType<DType>::kFlag == type_flag_: TBlob.get_with_shape: data type do not match specified type. Expected: 2 v.s. given 0
> Stack trace:
> [bt] (0) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x4ac1eb) [0x7ff57de371eb]
> [bt] (1) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x30c8972) [0x7ff580a53972]
> [bt] (2) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x31dc115) [0x7ff580b67115]
> [bt] (3) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(mxnet::imperative::PushFCompute(std::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::Resource, std::allocator<mxnet::Resource> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<unsigned int, std::allocator<unsigned int> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&)::{lambda(mxnet::RunContext)#1}::operator()(mxnet::RunContext) const+0x307) [0x7ff57ffd9f47]
> [bt] (4) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x259adf4) [0x7ff57ff25df4]
> [bt] (5) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x25a8789) [0x7ff57ff33789]
> [bt] (6) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x25abbf0) [0x7ff57ff36bf0]
> [bt] (7) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x25abe86) [0x7ff57ff36e86]
> [bt] (8) /home/ec2-user/anaconda3/envs/mxnet_p36/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x25a6f94) [0x7ff57ff31f94]
> ```

Thanks for the update, @charlieyou. I would recommend opening a separate issue and closing this one.
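For context on the error message: the integers in `Expected: 2 v.s. given 0` appear to be mshadow type flags, where (per `mshadow/base.h`) 0 is float32 and 2 is float16. Read that way, some operator in the backward pass produced a float32 blob where a float16 one was expected. A small sketch that decodes the flags (the mapping is an assumption taken from mshadow's enum; verify against your installed version):

```python
# Decode MXNet/mshadow type_flag values as seen in TBlob dtype errors.
# Mapping assumed from mshadow/base.h: kFloat32=0, kFloat64=1, kFloat16=2,
# kUint8=3, kInt32=4, kInt8=5, kInt64=6.
MSHADOW_TYPE_FLAGS = {
    0: "float32",
    1: "float64",
    2: "float16",
    3: "uint8",
    4: "int32",
    5: "int8",
    6: "int64",
}

def decode_tblob_mismatch(expected: int, given: int) -> str:
    """Translate an 'Expected: X v.s. given Y' pair into dtype names."""
    return (f"expected {MSHADOW_TYPE_FLAGS.get(expected, 'unknown')}, "
            f"got {MSHADOW_TYPE_FLAGS.get(given, 'unknown')}")

# The error from the stack trace above:
print(decode_tblob_mismatch(2, 0))  # expected float16, got float32
```

This is consistent with CTCLoss lacking a float16 kernel, so a plausible workaround (untested here) is to cast the LSTM output to float32 before computing the loss.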