pribadihcr opened a new issue #19550:
URL: https://github.com/apache/incubator-mxnet/issues/19550


   ## Description
   I get an error in the C++ inference example when TensorRT is enabled.
   
   ### Error Message
   I got the following error:
   terminate called after throwing an instance of 'dmlc::Error'
     what():  [12:35:05] ../../include/mxnet-cpp/symbol.hpp:260: Check failed: MXSymbolInferShapeEx(GetHandle(), keys.size(), keys.data(), arg_ind_ptr.data(), arg_shape_data.data(), &in_shape_size, &in_shape_ndim, &in_shape_data, &out_shape_size, &out_shape_ndim, &out_shape_data, &aux_shape_size, &aux_shape_ndim, &aux_shape_data, &complete) == 0 (-1 vs. 0) :
   Stack trace:
     [bt] (0) ./imagenet_inference(dmlc::LogMessageFatal::~LogMessageFatal()+0x75) [0x564f81675465]
     [bt] (1) ./imagenet_inference(+0x20fea) [0x564f8167efea]
     [bt] (2) ./imagenet_inference(+0x2bcd7) [0x564f81689cd7]
     [bt] (3) ./imagenet_inference(+0x1536e) [0x564f8167336e]
     [bt] (4) ./imagenet_inference(+0xc92b) [0x564f8166a92b]
     [bt] (5) /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xe7) [0x7f67780d3b97]
     [bt] (6) ./imagenet_inference(+0xcf2a) [0x564f8166af2a]
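
   For reference, the check that fails sits inside `Symbol::InferShape` (`include/mxnet-cpp/symbol.hpp`), which wraps the `MXSymbolInferShapeEx` C call. Below is a minimal sketch of that call on the loaded symbol; the `data` input name and the 16x3x224x224 shape are assumptions chosen to mirror the repro command, not values read from the example source.

   ```cpp
   #include <map>
   #include <string>
   #include <vector>

   #include "mxnet-cpp/MxNetCpp.h"

   int main() {
     // Load the graph passed to the example via --symbol_file.
     mxnet::cpp::Symbol net =
         mxnet::cpp::Symbol::Load("./model/Inception-BN-symbol.json");

     // Assumed input: a single "data" input in NCHW layout with the batch
     // size from the repro command; 224x224 is a guess for Inception-BN.
     std::map<std::string, std::vector<mx_uint>> arg_shapes;
     arg_shapes["data"] = {16, 3, 224, 224};

     std::vector<std::vector<mx_uint>> in_shapes, aux_shapes, out_shapes;
     // Symbol::InferShape is the wrapper whose MXSymbolInferShapeEx call
     // is the check failing in the trace above.
     net.InferShape(arg_shapes, &in_shapes, &aux_shapes, &out_shapes);
     return 0;
   }
   ```

   This sketch on its own probably won't reproduce the error, since the failure only shows up with `--enableTRT`, i.e. presumably after the symbol has been handed to the TensorRT backend; it is only meant to point at the API call behind the failing check.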
   
   ### Steps to reproduce
   cpp-package/example/inference$ ./imagenet_inference --symbol_file "./model/Inception-BN-symbol.json" --params_file "./model/Inception-BN-0126.params" --batch_size 16 --num_inference_batches 500 --benchmark --enableTRT
   
   
   ## Environment
   MXNet 1.8.0.rc2
   CUDA 10.2
   

