> I am using lazy-seqs to join two very large CSV files. I am very certain
> that I am not holding on to any of the heads, and if I did, the JVM would
> be out of memory far sooner than what I am seeing currently. The size of
> the file is something like 73 GB and the RAM allocated to the JVM is
> about 8 GB. It seems like a very gradual leak. Has anybody else
> encountered similar problems? In case some of you feel that my code might
> be the culprit, the following gist has the source.
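For context (this is not the poster's code, just a minimal sketch of the failure mode being ruled out): "holding on to the head" means keeping any live reference to the first cell of a lazy seq while traversing it, which pins every realized element in memory. The function names below are illustrative only.

```clojure
;; Head retention: `xs` is referenced again after the reduce, so the
;; whole realized sequence stays reachable and the heap grows with it.
(defn sum-holding-head []
  (let [xs (map inc (range 1000000))]
    [(reduce + xs) (first xs)]))   ; second use of `xs` pins the head

;; No head retention: the seq is consumed exactly once, so each cell
;; becomes garbage as soon as the reduce moves past it.
(defn sum-no-head []
  (reduce + (map inc (range 1000000))))
```

With a large enough input, the first form exhausts the heap while the second runs in constant space; a gradual leak despite code shaped like the second form is what makes the poster's report surprising.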
OT, but I hereby nominate `lazy-join-sorted-map-seqs-with-only-second-map-seq-allowed-to-have-duplicate-fields` as the most awesome fn name of the decade ;-)

Regards,
BG

--
Baishampayan Ghose
b.ghose at gmail.com