In article <7f01f7b7-a561-483a-8e6d-861a8c05f...@p6g2000pre.googlegroups.com>,
forrest yang  <gforrest.y...@gmail.com> wrote:
>
>I'm trying to load a big file into a dict; it's about 9,000,000 lines,
>something like
>1 2 3 4
>2 2 3 4
>3 4 5 6
>
>code
>d = {}
>for line in open(file):
>    arr = line.strip().split('\t')
>    d[arr[0]] = arr
>
>but the dict gets really slow as I load more data into memory. By the
>way, the Mac I use has 16 GB of memory.
>Is this caused by poor performance when the dict has to grow, or by
>something else?

Try gc.disable() before the loop and gc.enable() afterward.
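
Something like this (a minimal sketch, not tested against your data;
load_table and the path argument are just placeholder names):

    import gc

    def load_table(path):
        d = {}
        gc.disable()        # suspend cyclic GC while building the dict
        try:
            with open(path) as f:
                for line in f:
                    arr = line.strip().split('\t')
                    d[arr[0]] = arr
        finally:
            gc.enable()     # always re-enable collection, even on error
        return d

The cyclic collector is triggered by object allocation counts, so a
loop that creates millions of objects makes it run again and again,
and each pass has to examine everything loaded so far even though
there are no reference cycles to reclaim.  Disabling it for the
duration of the load sidesteps that overhead.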
-- 
Aahz (a...@pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair