Re: reading file objects in chunks
On Mon, 12 Nov 2007 17:47:29 +0100, Martin Marcher wrote:

> I'd really like something nicer than
>
>     chunksize = 26
>     f = file("datafile.dat", buffering=chunksize)
>
>     chunk = f.read(chunksize)
>     while len(chunk) == chunksize:
>         compute_data(chunk)
>         chunk = f.read(chunksize)
>
> I just don't feel comfortable with it for some reason I can't explain...

    chunksize = 26
    f = open('datafile.dat', 'rb')
    for chunk in iter(lambda: f.read(chunksize), ''):
        compute_data(chunk)
    f.close()

Ciao,
        Marc 'BlackJack' Rintsch
--
http://mail.python.org/mailman/listinfo/python-list
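Marc's idiom uses the two-argument form of iter(callable, sentinel): the callable is invoked repeatedly, and iteration stops when it returns the sentinel. A minimal sketch of the same pattern in modern Python 3, where a binary file's read() returns bytes at EOF, so the sentinel has to be b'' rather than ''; the BytesIO stand-in for the data file and the read_in_chunks name are just for illustration:

```python
import io

def read_in_chunks(f, chunksize):
    # iter(callable, sentinel): call f.read(chunksize) repeatedly
    # until it returns the sentinel (b'' at end of file).
    return iter(lambda: f.read(chunksize), b'')

# Simulate a data file: 80 bytes is three 26-byte records plus 2 bytes.
data = bytes(range(80))
f = io.BytesIO(data)
chunks = list(read_in_chunks(f, 26))
print([len(c) for c in chunks])  # [26, 26, 26, 2]
```

Note that a short final chunk is still yielded, so callers that require exactly chunksize bytes per record should check the length themselves.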
reading file objects in chunks
Hi,

I'm looking for something that will give me an iterator over a
file-(like)-object. I have large files containing only a single line,
made up of fixed-length fields: the record length is 26 bytes, dataA is
10 bytes, dataB is 16 bytes. I've written my parsing code but can't
find anything that will let me read those files efficiently (I guess
I'm just thinking too complicated). I'd like to have something like:

    f = file("datafile.dat", buffering=26)
    for chunk in f.read_in_chunks():
        compute_data(chunk)

f.iter() looked promising at first, but somehow it doesn't do "the
right thing"(tm), and itertools doesn't quite seem to be what I want
either. Maybe I just need coffee, but right now I'm in the dark. I'd
really like something nicer than

    chunksize = 26
    f = file("datafile.dat", buffering=chunksize)

    chunk = f.read(chunksize)
    while len(chunk) == chunksize:
        compute_data(chunk)
        chunk = f.read(chunksize)

I just don't feel comfortable with it for some reason I can't explain...

thanks
martin
--
http://noneisyours.marcher.name
http://feeds.feedburner.com/NoneIsYours
--
http://mail.python.org/mailman/listinfo/python-list
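For the record layout Martin describes (26-byte records: 10 bytes of dataA followed by 16 bytes of dataB), the struct module can split each chunk into its fields. A Python 3 sketch under the assumption that the file length is an exact multiple of the record size; the names records, data_a and data_b are made up here, and BytesIO stands in for the real data file:

```python
import io
import struct

# '10s16s' = two raw byte fields of 10 and 16 bytes; .size is 26.
RECORD = struct.Struct('10s16s')

def records(f):
    # Read one fixed-size record per iteration until EOF (b'' sentinel).
    # Assumes the file holds only whole records; a short trailing chunk
    # would make RECORD.unpack() raise struct.error.
    for chunk in iter(lambda: f.read(RECORD.size), b''):
        data_a, data_b = RECORD.unpack(chunk)
        yield data_a, data_b

f = io.BytesIO(b'A' * 10 + b'B' * 16 + b'C' * 10 + b'D' * 16)
parsed = list(records(f))
print(parsed[0])  # (b'AAAAAAAAAA', b'BBBBBBBBBBBBBBBB')
```

Keeping the unpacking next to the read loop means compute_data() can take the already-split fields instead of raw 26-byte chunks.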