Hi. I have a table with many records (about 1 million); the mapping is straightforward: one DB record corresponds to one Python object of the mapped class. I have to do something with every record. Code like this doesn't work:
    for inst in session.query(Bar):
        foo(inst)

because all instances are loaded into memory before 'foo' is ever called, which gives an out-of-memory error or very poor performance. Using OFFSET and LIMIT doesn't work well either, because e.g. OFFSET 500000 is very slow (memory usage stays low, but executing the SELECT takes a long time; I'm using PostgreSQL).

The SQL Expression layer (sql.select) supports yielding records lazily, but the only way I've found to automatically transform a fetched tuple into an object of the mapped class is the Query.instances(cursor) method, which again loads all instances at once (there is no way to materialize only part of the result set).

The only solution I have found is this:

    def generateAll(domainCls, table, session=sa.Session, minId=None, maxId=None):
        if minId is None:
            minId = session.query(domainCls).min(table.c.id)
        if maxId is None:
            maxId = session.query(domainCls).max(table.c.id)
        for id in xrange(minId, maxId + 1):
            inst = session.query(domainCls).get(id)
            if inst is not None:
                yield inst

Am I missing something, or is there no better way? I understand that in the general case loading all instances at once may be required, but I think that in cases like mine there should be some way to load instances lazily.

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "sqlalchemy" group.
To post to this group, send email to sqlalchemy@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/sqlalchemy?hl=en
-~----------~----~----~----~------~----~------~--~---
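For what it's worth, one common way around the slow OFFSET is keyset pagination: fetch batches ordered by the primary key and restart each SELECT from the last id seen, so the database never has to skip rows and gaps in the id sequence cost nothing. Here is a minimal sketch of the idea using the stdlib sqlite3 module (the table, columns, and iter_keyset name are made up for illustration; the same WHERE id > ? ORDER BY id LIMIT pattern applies to PostgreSQL):

```python
import sqlite3

def iter_keyset(conn, batch_size=2):
    """Yield rows in primary-key order, one batch per SELECT.

    Each query resumes from the last id actually seen instead of
    using OFFSET, so the cost per batch stays roughly constant.
    """
    last_id = 0  # assumes positive integer ids
    while True:
        rows = conn.execute(
            "SELECT id, name FROM bar WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        for row in rows:
            yield row
        last_id = rows[-1][0]

# Demo with an in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bar (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO bar (name) VALUES (?)",
                 [("rec%d" % i,) for i in range(5)])
names = [name for _id, name in iter_keyset(conn)]
print(names)  # -> ['rec0', 'rec1', 'rec2', 'rec3', 'rec4']
```

At the ORM level the equivalent would be something like session.query(Bar).filter(Bar.id > last_id).order_by(Bar.id).limit(batch_size) in a loop, clearing the session between batches so already-processed instances can be garbage collected. This avoids both the OFFSET scan and the one-query-per-row cost of the get()-based generator above.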