Re: MemoryError and Pickle

2016-11-21 Thread Steve D'Aprano
On Tue, 22 Nov 2016 10:27 am, Fillmore wrote:

> Hi there, Python newbie here.
>
> I am working with large files. For this reason I figured that I would
> capture the large input into a list and serialize it with pickle for
> later (faster) usage.
> Everything has worked beautifully until today

Re: MemoryError and Pickle

2016-11-21 Thread Steve D'Aprano
On Tue, 22 Nov 2016 11:40 am, Peter Otten wrote:

> Fillmore wrote:
>
>> Hi there, Python newbie here.
>>
>> I am working with large files. For this reason I figured that I would
>> capture the large input into a list and serialize it with pickle for
>> later (faster) usage.
>
> But is it really

Re: MemoryError and Pickle

2016-11-21 Thread Chris Kaynor
On Mon, Nov 21, 2016 at 3:43 PM, John Gordon wrote:

> In Fillmore writes:
>
>> Question for experts: is there a way to refactor this so that data may
>> be filled/written/released as the scripts go and avoid the problem?
>> code below.
>
> That depends on how the data will be read. Here is
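One common way to make the answer not depend on holding everything at once (a minimal sketch, not taken from the thread; the function and file names below are made up) is to pickle records one at a time into the same file and read them back lazily:

import pickle

def dump_records(records, path):
    # Append each record to the pickle file as it is produced,
    # so the full list never has to exist in memory.
    with open(path, "wb") as f:
        for record in records:
            pickle.dump(record, f)

def load_records(path):
    # Lazily yield the records back in the order they were dumped;
    # pickle.load() raises EOFError when the file is exhausted.
    with open(path, "rb") as f:
        while True:
            try:
                yield pickle.load(f)
            except EOFError:
                break

# Hypothetical usage: process one record at a time instead of one giant list.
# for rec in load_records("data.pkl"):
#     process(rec)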

Re: MemoryError and Pickle

2016-11-21 Thread Peter Otten
Fillmore wrote:

> Hi there, Python newbie here.
>
> I am working with large files. For this reason I figured that I would
> capture the large input into a list and serialize it with pickle for
> later (faster) usage.

But is it really faster? If the pickle is, let's say, twice as large as the or
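Peter's question is easy to check empirically. A rough sketch (the file names are hypothetical, and the tab-separated parsing is an assumption about the data) that times loading the pickle against simply re-reading and splitting the original file:

import pickle
import time

def time_pickle_load(path="data.pkl"):
    # Time unpickling the whole saved list.
    start = time.perf_counter()
    with open(path, "rb") as f:
        data = pickle.load(f)
    return time.perf_counter() - start, len(data)

def time_reparse(path="data.txt"):
    # Time re-reading and re-splitting the original text file instead.
    start = time.perf_counter()
    with open(path, encoding="utf-8") as f:
        data = [line.rstrip("\n").split("\t") for line in f]
    return time.perf_counter() - start, len(data)

if __name__ == "__main__":
    print("pickle :", time_pickle_load())
    print("reparse:", time_reparse())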

Re: MemoryError and Pickle

2016-11-21 Thread John Gordon
In Fillmore writes:

> Question for experts: is there a way to refactor this so that data may
> be filled/written/released as the scripts go and avoid the problem?
> code below.

That depends on how the data will be read. Here is one way to do it:

fileObject = open(filename, "w")
for
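The code in the preview is cut off after the open() call. A guess at how such a write-as-you-go loop might continue (the generator, handler, and file name below are placeholders, not from the post), together with the matching line-at-a-time reader:

def data_source():
    # Placeholder: yield records one at a time instead of building a list.
    for i in range(1_000_000):
        yield "record %d" % i

filename = "records.txt"

fileObject = open(filename, "w")
for item in data_source():
    fileObject.write(item + "\n")   # written immediately; nothing accumulates in memory
fileObject.close()

# Later, consume the file the same way -- one line at a time:
with open(filename) as fileObject:
    for line in fileObject:
        pass  # process line.rstrip("\n") here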

MemoryError and Pickle

2016-11-21 Thread Fillmore
Hi there, Python newbie here. I am working with large files. For this reason I figured that I would capture the large input into a list and serialize it with pickle for later (faster) usage. Everything has worked beautifully until today when the large data (1GB) file caused a MemoryError :(
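For context, a reconstruction of the kind of code that hits this error (the original code isn't shown, so the file name and line handling are assumptions): the whole input is accumulated in a list and then pickled in one go, so the data and the pickle buffer have to fit in memory at the same time:

import pickle

data = []
with open("large_input.txt", encoding="utf-8") as f:   # hypothetical file name
    for line in f:
        data.append(line.rstrip("\n"))                  # the whole file ends up in memory

with open("large_input.pkl", "wb") as out:
    pickle.dump(data, out)                              # MemoryError can surface here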