Avik: this is actually the situation I'm in. My tight inner loops
involve elementwise operations on many small arrays, so the runtime cost
of using vectorized code is enormous. Using the macros provided by
Devectorize.jl only somewhat alleviates this problem, so I ended up
hand-writing devectorized loops.
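A sketch of what hand-devectorizing looks like (the kernel below is a made-up elementwise expression for illustration; `devectorized!` is a hypothetical name, not from Devectorize.jl):

```julia
# Vectorized form: convenient, but for many small arrays the cost of
# allocating the result (and, pre-fusion, the intermediates) dominates.
vectorized(a, b) = a .* b .+ b

# Hand-devectorized form: one explicit loop writing into a
# preallocated output, so the inner loop itself allocates nothing.
function devectorized!(out, a, b)
    @inbounds for i in eachindex(out, a, b)
        out[i] = a[i] * b[i] + b[i]
    end
    return out
end

a = rand(4); b = rand(4)
out = similar(a)
devectorized!(out, a, b)   # same values as vectorized(a, b), no allocation per call
```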
That overhead seems constant, beyond a single temporary, so I don't suppose
there is much to worry about unless you are using many small arrays.
julia> 800-sizeof(rand(2))
784
julia> 944-sizeof(rand(20))
784
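The arithmetic above can be restated to make the point explicit: `sizeof` reports only the data payload of a `Vector`, so the fixed 784-byte difference is per-array bookkeeping plus allocator overhead (the 800/944 totals come from the measurement above; the exact overhead varies by machine and Julia version):

```julia
# sizeof counts only the element data, not the array header.
@assert sizeof(rand(2))  == 2  * sizeof(Float64)   # 16 bytes of data
@assert sizeof(rand(20)) == 20 * sizeof(Float64)   # 160 bytes of data
# 800 - 16 == 944 - 160 == 784: the overhead does not grow with length.
```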
On Tuesday, 29 April 2014 10:40:10 UTC+1, Carlos Becker wrote:
Besides Julia internals, I suppose there is memory overhead in terms of the
structure holding the array itself (when temporaries are created).
An array isn't just the size in bytes of the data it holds; it also carries
information about its size/type/etc. Though I doubt that would add up
to 8
I just saw another part of your message; I am also wondering why memory
consumption is so high.
On Tuesday, 29 April 2014 11:31:09 UTC+2, Carlos Becker wrote:
This is likely because Julia is creating temporaries, which is probably why
you get increasing memory usage when increasing array size.
This is a long topic that will have to be solved (hopefully soon); I had a
previous question related to something similar
here: https://groups.google.com
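The temporaries issue described above can be sketched as follows (the expression is assumed for illustration; any chained vectorized operation behaves the same way):

```julia
a = rand(1_000); b = rand(1_000)

# Each vectorized step materializes a full-size intermediate:
# `a + b` allocates one vector, and `.*` allocates the result.
c_alloc = (a + b) .* a

# Writing into a preallocated output avoids the intermediates. In
# modern Julia, fused dot-broadcasting (`c .= (a .+ b) .* a`) does this
# automatically; at the time, one wrote the explicit loop by hand:
c = similar(a)
for i in eachindex(a, b, c)
    @inbounds c[i] = (a[i] + b[i]) * a[i]
end
```

The loop performs the same elementwise arithmetic as the vectorized expression, just without allocating the intermediate `a + b` vector.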