On Oct 31, 4:58 pm, Mark Triggs <[EMAIL PROTECTED]> wrote:
> On Oct 31, 1:57 pm, Mark Triggs <[EMAIL PROTECTED]> wrote:
>
> > When I ran my code, it very quickly ran out of memory and fell over.
> > After thinking about it for a while, I realised it must be because my
> > 'do-something' function call is hanging on to the head of the list, so
> > as its elements are realised and cached, it gradually eats up all my
> > memory.
>
> Answering my own question, using the function itself as the recur
> target does exactly what I want:
>
> ;; Process the list one chunk at a time
> (defn do-something [biglist]
>   (when biglist
>     (doall (take 1000 biglist))
>     (recur (drop 1000 biglist))))
>
> I guess I should have tried it instead of assuming it wouldn't work.
> Please excuse my talking to myself :o)
>
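As a hypothetical illustration of the head retention Mark describes (a guess
at the failure mode, not his actual code), an inner loop that walks the
sequence can leave the enclosing parameter pointing at the head for the
whole call, if the compiler does not clear that local after its last use:

    ;; Hypothetical head-retaining version: `remaining` advances through
    ;; the seq, but the parameter `biglist` can still reference the head
    ;; until do-something returns, keeping every realised (and cached)
    ;; element reachable and gradually filling the heap.
    (defn do-something [biglist]
      (loop [remaining biglist]
        (when remaining
          (doall (take 1000 remaining))
          (recur (drop 1000 remaining)))))

Using recur on the function itself, as in the version quoted above, rebinds
biglist on each pass, so no outer frame keeps the head alive.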
Sorry for the delay. You are right: if you want to do this manually,
you have to take care not to retain the head of the list.
OTOH, you might want to reconsider doing it manually and instead
leverage the seq functions that already handle this:
(map process-a-chunk (partition 1000 biglist))
Rich
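Following up on Rich's suggestion, a minimal sketch of how it might be used
when process-a-chunk is run only for its side effects (the dorun wrapper and
the enclosing function are editorial additions, not part of the original
message):

    ;; Force the lazy map for its side effects; dorun walks the sequence
    ;; without retaining the realised results.
    (defn do-something [biglist]
      (dorun (map process-a-chunk (partition 1000 biglist))))

Note that (partition 1000 biglist) drops a trailing chunk of fewer than
1000 elements, so a remainder would need separate handling if it matters.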