I've got a question about lazy sequences and file reading.
Is line-seq a good fit for processing the lines of a huge file?
Take this case: I want to process each line of a file with one or
more functions, and every line must be processed. line-seq returns a lazy
sequence, which means all the lines already read stay in memory, doesn't it?
So if the file being processed is many gigabytes in size, my heap will
explode, right? Or am I missing something?
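Here's the pattern I'm considering. If I understand laziness correctly, as long as nothing retains the head of the sequence, lines that have already been consumed can be garbage-collected, so this should run in roughly constant memory even on a huge file (the path and `process-line` below are just placeholders for my real code):

```clojure
(require '[clojure.java.io :as io])

(defn process-line [line]
  ;; stand-in for the real per-line work
  (count line))

(defn process-file [path]
  (with-open [rdr (io/reader path)]
    ;; doseq consumes eagerly and does not retain the head,
    ;; so already-processed lines become garbage
    (doseq [line (line-seq rdr)]
      (process-line line))))

;; Same shape with reduce, when an accumulated result is needed:
(defn count-lines [path]
  (with-open [rdr (io/reader path)]
    (reduce (fn [n _] (inc n)) 0 (line-seq rdr))))
```

The key point, as far as I can tell, is that the sequence is fully consumed inside the `with-open` scope and no reference to its head survives it.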

On Saturday, October 27, 2012 01:45:47 UTC+2, daveray wrote:
>
> Hi, 
>
> At work I've had a few conversations about treating files, especially 
> large ones, as seqs of lines. In particular, the apparent conflict 
> between using clojure.core/with-open to ensure a file is closed 
> appropriately, and clojure.core/line-seq as a generic sequence of 
> lines which may be consumed by code that has no idea it's coming from 
> a file. I've been told [1] that C# solves this problem because the 
> IEnumerator interface is disposable so it's possible to clean up the 
> underlying file, even if it's been wrapped in several layers of 
> enumerators... and that since Clojure doesn't do this, it's flawed :) 
>
> Thoughts? I'm aware of the "custom seq that closes the file when the 
> end is reached" hack, but that doesn't seem very satisfying. How do 
> others process large files in Clojure? Just make sure that the 
> sequence is totally consumed within with-open? Just don't worry about 
> closing files? 
>
> Cheers, 
>
> Dave 
>
>
> [1] My C# experience is limited to a few days of writing example code 
> for the C# bindings of a product's API. :) 
>
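Regarding the with-open conflict you describe: as I understand it, the trap is letting the lazy seq escape the `with-open` scope, because the reader is closed before the caller realizes the lines. A sketch of the trap and the eager fix (file names here are illustrative):

```clojure
(require '[clojure.java.io :as io])

;; The trap: the reader is closed when with-open exits, so realizing
;; the returned seq later reads from a closed stream and throws.
(defn broken-lines [path]
  (with-open [rdr (io/reader path)]
    (line-seq rdr)))

;; One fix: force full realization before the scope closes. Safe,
;; but for a multi-gigabyte file this pulls everything into memory,
;; so it only suits files that fit on the heap.
(defn safe-lines [path]
  (with-open [rdr (io/reader path)]
    (doall (line-seq rdr))))
```

So "just make sure the sequence is totally consumed within with-open" seems to be the practical answer, whether by `doall` or by doing the per-line work inside the scope.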
