@William:
Thanks,

My line size varies from 40 to 550 characters. Please note that the text
file I have to process is in the gigabytes (approx. 50 GB). This is the
code I used to process it line by line without loading it into memory:

    for line in open('uniqname.txt'):
        <processing>
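A minimal sketch of that pattern, assuming the per-line work is something like counting characters (the real `<processing>` step and the file name come from the thread; the in-memory sample below just stands in for the 50 GB file):

```python
import io

def process_lines(stream):
    """Process a text stream line by line; nothing but the current
    line is held in memory, and Python buffers the underlying reads."""
    total = 0
    for line in stream:
        total += len(line.rstrip('\n'))  # placeholder for <processing>
    return total

# In the real script the stream would be open('uniqname.txt');
# a small StringIO stands in here so the sketch is self-contained.
sample = io.StringIO("abc\ndefgh\n")
print(process_lines(sample))  # 8
```

Iterating over the file object directly (rather than calling readlines()) is what keeps memory flat regardless of file size.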

On Mon, Nov 4, 2013 at 7:16 PM, William Ray Wing <w...@mac.com> wrote:

> On Nov 4, 2013, at 8:30 AM, Amal Thomas <amalthomas...@gmail.com> wrote:
> How long are the lines in your file?  In particular, are they many
> hundreds or thousands of characters long, or are they only a few hundred
> characters, say 200 or fewer?
>
> Unless they are so long as to exceed the normal buffer size of your OS's
> read-ahead buffer, I strongly suspect that the big time sink in your
> attempt to read line-by-line was some inadvertent inefficiency that you
> introduced.  Normally, when reading from a text file, Python buffers the
> reads (or uses the host OS buffering).  Those reads pull in huge chunks of
> text WAY ahead of where the actual python processing is going on, and are
> VERY efficient.
>
> -Bill

-- 

*AMAL THOMAS*
_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor
