On Oct 15, 2007, at 5:40 AM, pbienst wrote:

>
> Hi,
>
> I wrote a small program to parse a log file, generate some objects
> from it, and then store these objects in an SQL database using the
> ORM.
>
> It takes 20 seconds to parse a 1000 line file. I was wondering if
> that's intrinsic in using the high levels of abstraction, or if there
> is something that can be done about it, either in my code or in
> SQLAlchemy itself.
>

the speed issue is straightforward here: you have autoflush=True set,
so your session is flushing on every query (from the output below it
appears to flush around 1400 times).  The expensive call is
locate_dirty(), which scans through all the instances present in the
session to find changes on attributes.   Since autoflush=True is a
brand new feature in 0.4, we haven't yet come across issues like this,
because historically people have flushed all changes manually.  So for
now I'd turn autoflush off and flush changes manually when database
state is needed.  While I noticed that turning off autoflush entirely
seems to prevent the proper generation of primary keys here, it makes
it all the way to the commit() and the point of issuing SQL in about 7
seconds.
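Roughly, the workaround looks like the sketch below, written against the
0.4 API discussed in this thread.  LogEntry and parse_line() are
hypothetical stand-ins for your own mapped class and parsing code, and
the engine URL is just an example:

    # minimal sketch of the autoflush=False workaround (0.4-style API)
    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    engine = create_engine("sqlite:///log.db")

    # autoflush=False: queries no longer trigger a flush (and the
    # expensive dirty-instance scan) on every call; we flush once,
    # explicitly, at the end.
    Session = sessionmaker(bind=engine, autoflush=False,
                           transactional=True)
    session = Session()

    for line in open("mylog.txt"):
        # parse_line() is assumed to return a new LogEntry instance
        session.save(parse_line(line))

    session.flush()    # issue all pending INSERTs in a single pass
    session.commit()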



