benz725 commented on issue #19731:
URL: 
https://github.com/apache/incubator-mxnet/issues/19731#issuecomment-773353505


   When I followed the instructions below and ran the final ninja step, the build failed with:
   3rdparty/onnx-tensorrt/NvOnnxParser.h:26:10: fatal error: NvInfer.h: No such file or directory
   
   However, I had already added $TENSORRT_ROOT/include and $TENSORRT_ROOT/lib to my $LD_LIBRARY_PATH. Why does this happen?
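   For reference, this is roughly the environment setup I had in place before running ninja (a minimal sketch; the TensorRT install location below is only a placeholder for wherever my copy is actually unpacked):
   
       # placeholder install location; adjust to the actual TensorRT directory
       export TENSORRT_ROOT=/opt/tensorrt
       # the include and lib directories I added, expecting the build to find NvInfer.h there
       export LD_LIBRARY_PATH=$TENSORRT_ROOT/include:$TENSORRT_ROOT/lib:$LD_LIBRARY_PATH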
   Also, when I run ninja, it often fails with: "ninja: build stopped: subcommand failed."
   I searched the internet and found replies saying that the ulimit value is too small, so I ran ulimit -n 10240 to raise it. But when I run ninja again, it still fails with "ninja: build stopped: subcommand failed." Why does this happen?
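   Concretely, this is what I tried (a rough sketch; the 10240 value came from the replies I found online, and the build directory name is just my local setup):
   
       ulimit -n 10240      # raise the open-file limit for the current shell
       cd build && ninja    # re-run the build; it stops with the same error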
   
   > The build with tensorrt is currently not very user friendly. Nvidia plans 
to contribute a PR to improve it. For now, follow
   > 
https://github.com/apache/incubator-mxnet/blob/ae8c9748743ca98979964bd34643aca343f93c7c/ci/docker/runtime_functions.sh#L541-L594
   
   @leezu @Kh4L 
   
   

