> I need to process a really huge text file (4GB) and this is what I
> need to do. It takes forever to complete. I read somewhere that
> "list comprehensions" can speed things up. Can you point out how to
> do it in this case?
> Thanks a lot!
>
>
> f = open('file.txt','r')
> for line in f:
>         db[line.split(' ')[0]] = line.split(' ')[-1]
>         db.sync()

What is db here? It looks like a dictionary, but a plain dict doesn't
have a sync() method.
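
If it is something like a shelve or bsddb database (which would explain
the sync() call), the list comprehension is probably not where the time
goes: splitting each line once and syncing once at the end should help
much more than a comprehension. A rough sketch, assuming db is a shelve
and the filename from your post:

import shelve

db = shelve.open('db.shelve')      # assumption: db is a shelve-like object
f = open('file.txt', 'r')
for line in f:
    fields = line.split(' ')       # split once instead of twice per line
    db[fields[0]] = fields[-1]
db.sync()                          # sync once at the end, not on every line
db.close()
f.close()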
If the file is 4GB, are you sure you want to read the whole thing into
memory? If so, and you want to store it in a list, you can use a list
comprehension like this:

db = [ line.split(' ')[-1] for line in open('file.txt','r') ]

or

db = [ (line.split(' ')[0], line.split(' ')[-1])
       for line in open('file.txt','r') ]
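
If what you actually need is the first-field -> last-field mapping your
loop builds, you can feed a generator expression to dict() instead of
building a list of tuples (a sketch; it assumes the first field is a
unique key):

db = dict(
    (fields[0], fields[-1])
    for fields in (line.split(' ') for line in open('file.txt', 'r'))
)

This also splits each line only once.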

depending on what exactly you want to store. But reading 4GB into
memory will be slow in any case. You can use the timeit module to find
out which method is fastest.
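
For example, a minimal timeit sketch (assuming Python 2.6+ for
timeit.timeit and a smaller sample file, here called sample.txt, so each
run finishes quickly):

import timeit

stmt = "[line.split(' ')[-1] for line in open('sample.txt', 'r')]"
print(timeit.timeit(stmt, number=10))   # total seconds for 10 runs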

HTH,
Daniel