On Dec 10, 2009, at 05:50 , Feng wrote:

> Hi all, when I query a big table, it leads to a memory error.

(...)

> for record in al_records:  # there are 18 million records in the table variation
>     pass

This loads every record into memory before processing even the first one.
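The usual remedy is to fetch rows in batches rather than materializing the whole result set. In SQLAlchemy's ORM that is what `Query.yield_per(n)` is for; the underlying idea can be sketched with the stdlib `sqlite3` module (the table name and row count here are hypothetical stand-ins for the original 18-million-row table):

```python
import sqlite3

# Hypothetical miniature of the poster's "variation" table,
# used only to illustrate batched fetching.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE variation (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany(
    "INSERT INTO variation (value) VALUES (?)",
    [("row-%d" % i,) for i in range(10_000)],
)

cursor = conn.execute("SELECT id, value FROM variation")

# fetchall() would build the entire result list up front;
# fetchmany() keeps at most `batch_size` rows in memory at a time.
batch_size = 1000
processed = 0
while True:
    rows = cursor.fetchmany(batch_size)
    if not rows:
        break
    for row in rows:
        processed += 1  # process each record here

print(processed)
```

With SQLAlchemy the equivalent is roughly `for record in session.query(Variation).yield_per(1000): ...`, though whether rows are actually streamed from the server (rather than buffered by the driver) depends on the dialect, which is the point of the discussion below.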

See the discussion at 
http://groups.google.com/group/sqlalchemy/browse_thread/thread/a32916af437cc366

With the new OurSQL dialect and #1619, this should hopefully improve.

--
Alex Brasetvik
