Hello,

I have to import, clean, and merge data from different sources (Excel sheets, 
Access, text files, etc.) into a new database (PostgreSQL 8.1).
Some of those Excel sheets, which have been converted to CSV files, are very 
big (about 500,000 lines).
For this new project I decided to use SQLAlchemy to import the data (in the 
past I had only used it for the web interface), and it's really a big time 
saver.
However, after ~15,000 iterations my script consumes about 60% of the memory 
and things become very slow ...
Some parts of the script are expected to be slow, because for each line in 
file A I have to look up the corresponding line in file B (but file B is not 
that big).

I don't know whether this problem is related to SQLAlchemy or not ... but I 
didn't have it before.

Here is the script: http://rafb.net/p/RvsH3133.html (it's quite big, but the 
details aren't important).
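
In case it helps without digging through the paste, the core of the loop is 
essentially this (a simplified sketch: the table, column and file names are 
made up, and the SQLAlchemy calls are just illustrative):

    import csv
    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.orm import declarative_base, sessionmaker

    Base = declarative_base()

    # hypothetical mapping, just to show the shape of the real one
    class Record(Base):
        __tablename__ = 'records'
        id = Column(Integer, primary_key=True)
        code = Column(String)
        label = Column(String)

    engine = create_engine('postgresql://user:password@localhost/newdb')
    Base.metadata.create_all(engine)
    session = sessionmaker(bind=engine)()

    # file B is small, so it is read once into a dict keyed on the join column
    lookup = {}
    with open('file_b.csv') as fb:
        for row in csv.reader(fb, delimiter=';'):
            lookup[row[0]] = row[1]

    # file A is the big one (~500,000 lines); one mapped object per line
    with open('file_a.csv') as fa:
        for i, row in enumerate(csv.reader(fa, delimiter=';')):
            label = lookup.get(row[0])   # "look up the corresponding line in file B"
            session.add(Record(code=row[0], label=label))
            if i % 1000 == 0:
                session.flush()          # push the pending inserts to the database in batches

    session.commit()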

Does somebody have an idea what could cause this high memory consumption?

Thanks,
Julien

-- 
Julien Cigar <[EMAIL PROTECTED]>
