Hi,

I'm looking for the fastest way to read a folder of same-size images into a
single NDArray. Surprisingly, a `for` loop of `concat`s is about 3x faster than
a list comprehension. Any idea why? And any suggestions for a fast way to do this?

Idea 1: for loop of concats (**100 ms**)

    import mxnet as mx
    from mxnet import nd
    from mxnet import image as mxim

    # batch_path, piclist and ctx (e.g. mx.gpu()) are defined beforehand
    ims = (mxim.imread(batch_path + '/' + piclist[0])
           .expand_dims(0)  # add a leading batch dimension for the concat
           .as_in_context(ctx))

    for picname in piclist[1:]:
        pic = mxim.imread(batch_path + '/' + picname).expand_dims(0)
        ims = nd.concat(ims, pic.as_in_context(ctx), dim=0)

    nd.waitall()

Idea 2: list comprehension (**320 ms**)

    ims = nd.concat(
        *[mxim.imread(batch_path + '/' + pic).expand_dims(0) for pic in piclist],
        dim=0).as_in_context(ctx)

    nd.waitall()
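
Idea 3 (untimed sketch): preallocate the batch once and copy each decoded image into its slice, so no intermediate array is built per step. This assumes every image has the same known height and width; `H` and `W` below are placeholders for those dimensions, and `imread` returns uint8 HWC arrays.

    # preallocate the full batch on the target context, then fill it row by row
    ims = nd.empty((len(piclist), H, W, 3), ctx=ctx, dtype='uint8')
    for i, picname in enumerate(piclist):
        # write each decoded image straight into its row of the preallocated batch
        ims[i] = mxim.imread(batch_path + '/' + picname).as_in_context(ctx)

    nd.waitall()

No idea yet whether the per-row copies beat a single big concat, so this would need the same timing treatment as the other two.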




