IvyGongoogle commented on issue #14512: inference speed using the C++ API is slower with newer mxnet versions than with older ones
URL: 
https://github.com/apache/incubator-mxnet/issues/14512#issuecomment-476528740
 
 
   @wkcn thanks for your reply.
   ```
   mxnet v0.8:
    initialization time (i.e. the `MXPredCreate` call): 69906ms (first run)/1416ms/1419ms/1471ms
    model inference time: 218ms/209.3ms/207.6ms/207.4ms
   mxnet v1.0.0:
    initialization time (i.e. the `MXPredCreate` call): 170572ms (first run)/2779ms/2794ms/2756ms
    model inference time: 218.5ms/218.9ms/215.5ms/224.8ms
   ```
   As the numbers show, the initialization time of the newer mxnet is roughly double that of v0.8 (and more than double on the first run), and my model inference time is also slightly slower.
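
   For reference, timings like the ones above can be collected with a simple `std::chrono` harness around each call. The sketch below is a minimal, self-contained version of that measurement pattern; `load_and_create_predictor()` is a hypothetical stand-in for the real `MXPredCreate` invocation (which additionally takes the symbol JSON, parameter bytes, device info, and input shapes), so only the timing logic here reflects the actual methodology.

   ```cpp
   #include <chrono>
   #include <iostream>
   #include <thread>

   // Hypothetical stand-in for MXPredCreate: sleeps to simulate the
   // cost of loading a model. Replace with the real call in practice.
   static void load_and_create_predictor() {
       std::this_thread::sleep_for(std::chrono::milliseconds(50));
   }

   // Time a single invocation of fn, returning elapsed milliseconds.
   template <typename Fn>
   double time_ms(Fn&& fn) {
       auto start = std::chrono::steady_clock::now();
       fn();
       auto end = std::chrono::steady_clock::now();
       return std::chrono::duration<double, std::milli>(end - start).count();
   }

   int main() {
       // Run several times: the first run includes cold-start costs
       // (disk cache, lazy library initialization); later runs are warm.
       for (int i = 0; i < 4; ++i) {
           double ms = time_ms(load_and_create_predictor);
           std::cout << "run " << i << ": " << ms << " ms\n";
       }
       return 0;
   }
   ```

   Using `std::chrono::steady_clock` (rather than `system_clock`) avoids skew from clock adjustments, and repeating the measurement separates the one-off first-run cost from the steady-state cost, which is the distinction the numbers above rely on.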
   
