[GitHub] mwbyeon commented on issue #9308: batch size in inference

2018-01-05 Thread GitBox
mwbyeon commented on issue #9308: batch size in inference URL: https://github.com/apache/incubator-mxnet/issues/9308#issuecomment-355717931 @BenLag2906 Here is some example code for illustration (not tested). ```cpp // Allocate memory for a batch of images const int image_size = w

[GitHub] mwbyeon commented on issue #9308: batch size in inference

2018-01-04 Thread GitBox
mwbyeon commented on issue #9308: batch size in inference URL: https://github.com/apache/incubator-mxnet/issues/9308#issuecomment-355328275 This is an example. 1. Set the shape of the input data to include the batch size: `(batch_size, channel, height, width)` https://github.com/apache/incub
