Hi all, 

While doing line-oriented file reads, I'm seeing what looks like a lot of 
memory being allocated. In the case below, the file is a little over 
250 MB, but @time seems to indicate over 3 GB is being allocated, even though 
I'm working with one line at a time.

Should I be worried about the impact on performance, e.g. due to garbage 
collection?
And is there a way to cut down on the memory usage?

thanks,
Keith

;wc testfile.csv

  3391853   3391853 267068728 testfile.csv


function test_readlines()
    # simplified version of code from zindex.jl
    fi = open("testfile.csv", "r")
    x = ""; y = ""; z = ""
    for line in readlines(fi)   # readlines returns an array of all lines
        rec = split(chomp(line), ',')
        x = rec[9]; y = rec[10]; z = rec[11]
    end
    println("read file.  last row: $x $y $z")
    close(fi)
    return x, y, z
end

@time x,y,z=test_readlines()
@time test_readlines()

read file.  last row: -121.590713 38.695626 35.104988
elapsed time: 12.547989688 seconds (3289077104 bytes allocated)
read file.  last row: -121.590713 38.695626 35.104988
elapsed time: 12.539404518 seconds (3288969180 bytes allocated)
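
For reference, here's the streaming variant I'm planning to try next. I'm 
assuming eachline iterates over the stream one line at a time instead of 
collecting every line into an array the way readlines does, but I haven't 
measured yet whether that actually cuts the allocation:

function test_eachline()
    fi = open("testfile.csv", "r")
    x = ""; y = ""; z = ""
    for line in eachline(fi)   # assumed: yields one line at a time
        rec = split(chomp(line), ',')
        x = rec[9]; y = rec[10]; z = rec[11]
    end
    println("read file.  last row: $x $y $z")
    close(fi)
    return x, y, z
end

@time test_eachline()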
