I'm really sorry if none of you liked or agreed with the fridge analogy, but
that's what my brain could come up with at the time, and I have to say it's
not a very scientific argument. I only meant to say that if you are piping
data into memory and that data is larger than the memory, then the problem
is with the data, not with the code, and I think this paragraph actually
confirms some of it:

> For anyone who cares about the real issue: it seems that tarfile.py caches
> every member it processes in an internal list.  The list isn't actually
> used if accessing the file as an iterator, so by reinitializing it to [],
> the memory consumption problem is avoided.  This breaks other methods of
> the module, which are used to extract particular desired members, but in
> my case, that's okay.

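To make that concrete, here is a minimal sketch of the workaround the quoted
paragraph describes. The archive name is made up, and clearing TarFile's
internal `members` list is an unsupported hack with exactly the trade-off
mentioned above (it breaks getmembers()/getnames() for entries already read):

    import tarfile

    # Iterate over a large archive while clearing tarfile's internal cache
    # so memory use stays roughly constant instead of growing per member.
    with tarfile.open("huge-archive.tar") as tar:  # hypothetical file name
        for member in tar:
            # process the entry here (extract it, read its data, etc.)
            print(member.name, member.size)
            # drop the cached TarInfo objects so they can be garbage-collected
            tar.members = []
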
But I have to admit I was completely wrong, and a new patch to the tarfile
module will soon see the light of day to fix this problem painlessly for the
rest of your lives.