lsdustc opened a new issue #19628:
URL: https://github.com/apache/incubator-mxnet/issues/19628
## Description
`CTCLoss` fails on GPU with a data-type mismatch: `mxnet::op::CTCLossOpForward<gpu>` requests the data tensor as `float` via `TBlob::get<gpu, 3, float>`, but the tensor actually holds `half` (float16) data. Because execution is asynchronous, the error surfaces at `mx.nd.waitall()` as `MXNetError: Check failed: mshadow::DataType<DType>::kFlag == type_flag_ ... Expected: half v.s. given float` (mxnet-2.0.0, Python 3.9).
### Error Message
(Paste the complete error message. Please also include the stack trace by setting the environment variable `DMLC_LOG_STACK_TRACE_DEPTH=100` before running your script.)
```
[14:22:42] /home/super/software/incubator-mxnet/src/storage/storage.cc:199: Using Pooled (Naive) StorageManager for GPU
Traceback (most recent call last):
  File "/data/workspace/mxnet_project/ime/test.py", line 53, in <module>
    mx.nd.waitall()
  File "/home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/ndarray/ndarray.py", line 240, in waitall
    check_call(_LIB.MXNDArrayWaitAll())
  File "/home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/base.py", line 246, in check_call
    raise get_last_ffi_error()
mxnet.base.MXNetError: Traceback (most recent call last):
  [bt] (13) /lib/x86_64-linux-gnu/libc.so.6(clone+0x3f) [0x7fd72e61d4cf]
  [bt] (12) /lib/x86_64-linux-gnu/libpthread.so.0(+0x7fa3) [0x7fd72e87bfa3]
  [bt] (11) /lib/x86_64-linux-gnu/libstdc++.so.6(+0xbbb2f) [0x7fd6b1f1db2f]
  [bt] (10) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(std::thread::_State_impl<std::thread::_Invoker<std::tuple<std::function<void (std::shared_ptr<dmlc::ManualEvent>)>, std::shared_ptr<dmlc::ManualEvent> > > >::_M_run()+0x33) [0x7fd6fe279b63]
  [bt] (9) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(std::_Function_handler<void (std::shared_ptr<dmlc::ManualEvent>), mxnet::engine::ThreadedEnginePerDevice::PushToExecute(mxnet::engine::OprBlock*, bool)::{lambda()#4}::operator()() const::{lambda(std::shared_ptr<dmlc::ManualEvent>)#1}>::_M_invoke(std::_Any_data const&, std::shared_ptr<dmlc::ManualEvent>&&)+0x37) [0x7fd6fe27e3e7]
  [bt] (8) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(void mxnet::engine::ThreadedEnginePerDevice::GPUWorker<(dmlc::ConcurrentQueueType)0>(mxnet::Context, bool, mxnet::engine::ThreadedEnginePerDevice::ThreadWorkerBlock<(dmlc::ConcurrentQueueType)0>*, std::shared_ptr<dmlc::ManualEvent> const&)+0x17e) [0x7fd6fe27e12e]
  [bt] (7) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(mxnet::engine::ThreadedEngine::ExecuteOprBlock(mxnet::RunContext, mxnet::engine::OprBlock*)+0x111) [0x7fd6fe27a9b1]
  [bt] (6) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(+0x1e095b0) [0x7fd6fe26f5b0]
  [bt] (5) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(std::_Function_handler<void (mxnet::RunContext), mxnet::imperative::PushFCompute(std::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::Resource, std::allocator<mxnet::Resource> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<unsigned int, std::allocator<unsigned int> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&)::{lambda(mxnet::RunContext)#1}>::_M_invoke(std::_Any_data const&, mxnet::RunContext&&)+0x17) [0x7fd6fe2f0867]
  [bt] (4) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(mxnet::imperative::PushFCompute(std::function<void (nnvm::NodeAttrs const&, mxnet::OpContext const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&)> const&, nnvm::Op const*, nnvm::NodeAttrs const&, mxnet::Context const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::engine::Var*, std::allocator<mxnet::engine::Var*> > const&, std::vector<mxnet::Resource, std::allocator<mxnet::Resource> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<unsigned int, std::allocator<unsigned int> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&)::{lambda(mxnet::RunContext)#1}::operator()(mxnet::RunContext) const+0x9f2) [0x7fd6fe2eff02]
  [bt] (3) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(void mxnet::op::CTCLossOpForward<mshadow::gpu>(nnvm::NodeAttrs const&, mxnet::OpContext const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&, std::vector<mxnet::OpReqType, std::allocator<mxnet::OpReqType> > const&, std::vector<mxnet::TBlob, std::allocator<mxnet::TBlob> > const&)+0x121c) [0x7fd7042c8072]
  [bt] (2) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(mshadow::Tensor<mshadow::gpu, 3, float> mxnet::TBlob::get<mshadow::gpu, 3, float>(mshadow::Stream<mshadow::gpu>*) const+0xf7) [0x7fd7038cc645]
  [bt] (1) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(float* mxnet::TBlob::dptr<float>() const+0x11d) [0x7fd6fe21e5ed]
  [bt] (0) /home/super/python-latest/lib/python3.9/site-packages/mxnet-2.0.0-py3.9.egg/mxnet/libmxnet.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x6b) [0x7fd6fe05d72b]
  File "/home/super/software/incubator-mxnet/include/mxnet/././tensor_blob.h", line 256
MXNetError: Check failed: mshadow::DataType<DType>::kFlag == type_flag_: TBlob.get_with_shape: data type do not match specified type.Expected: half v.s. given float
```
## To Reproduce
(If you developed your own code, please provide a short script that reproduces the error. For existing examples, please provide a link.)
### Steps to reproduce
(Paste the commands you ran that produced the error.)
1. Run the test script (`/data/workspace/mxnet_project/ime/test.py`, not included here), which reaches `CTCLossOpForward<gpu>` with float16 data; a hedged sketch of a likely trigger is given below.
2. The error is raised asynchronously from `mx.nd.waitall()` at line 53 of that script.
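
Since the original `test.py` is not attached, the following is only a sketch of what presumably triggers the failure: computing `CTCLoss` on GPU with float16 activations. Shapes, values, and the exact call are illustrative assumptions, not the reporter's code.

```python
# Hypothetical reproduction sketch -- NOT the reporter's test.py.
# Assumption: the dtype mismatch is triggered by passing float16 data to CTCLoss on GPU.
import os
os.environ["DMLC_LOG_STACK_TRACE_DEPTH"] = "100"  # fuller stack trace, as the template suggests

import mxnet as mx

ctx = mx.gpu(0)

# CTCLoss expects data of shape (sequence_length, batch_size, alphabet_size)
# and labels of shape (batch_size, label_sequence_length).
data = mx.nd.random.uniform(shape=(20, 4, 30), ctx=ctx).astype("float16")  # float16 input
label = mx.nd.array([[1, 2, 3, 0], [4, 5, 0, 0], [6, 7, 8, 9], [2, 2, 0, 0]], ctx=ctx)

loss = mx.nd.CTCLoss(data, label)  # legacy ndarray operator
mx.nd.waitall()  # the MXNetError from the trace above surfaces here
```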
## What have you tried to solve it?
1.
2.
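
One thing worth trying (an editorial suggestion inferred from the trace, not something the reporter confirmed): the GPU kernel requests the data as `float`, so keeping the CTC inputs in float32 and casting only afterwards may avoid the failing check.

```python
# Possible workaround sketch (assumption: the GPU CTC kernel only reads float32 data).
# `data` and `label` are as in the reproduction sketch above.
loss = mx.nd.CTCLoss(data.astype("float32"), label)  # compute the CTC loss in float32
loss = loss.astype("float16")                        # cast back if float16 is needed downstream
mx.nd.waitall()
```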
## Environment
***We recommend using our script for collecting the diagnostic information with the following command***
`curl --retry 10 -s https://raw.githubusercontent.com/apache/incubator-mxnet/master/tools/diagnose.py | python3`
<details>
<summary>Environment Information</summary>
```
# Paste the diagnose.py command output here
```
</details>