Hi, I have a script which processes a large amount of data and saves some of the results to the database. This takes a long time, so what I would do in something like Java is lock the tables before processing, do the work, and then unlock them after processing. Concurrency isn't needed; there are no other users apart from me. I lock the tables beforehand in order to get a speed increase. I am trying to figure out how to do this using SQLAlchemy.
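For context, the lock → work → unlock pattern described above could be sketched like this as a context manager. This is only an illustration, not code from my script: the `locked_tables` helper is a made-up name, and it relies on the MySQL behaviour that each LOCK TABLES statement releases any locks the session already holds, which is why all tables go into a single statement.

```python
from contextlib import contextmanager

@contextmanager
def locked_tables(conn, tables):
    # MySQL releases any table locks the session already holds each time
    # LOCK TABLES runs, so every table must be named in ONE statement.
    conn.execute("LOCK TABLES " + ", ".join("%s WRITE" % t for t in tables))
    try:
        yield conn
    finally:
        # Always release the locks, even if the work in between raises.
        conn.execute("UNLOCK TABLES")
```

Usage would then look something like `with locked_tables(conn, ["intervals", "entities"]): ...` around the bulk inserts.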
class iterresults(object):
    def __init__(self, query, conn=None, locks=[]):
        self.query = query
        self.n = 1000
        self.conn = conn
        self.locks = locks

    def __iter__(self):
        start = 0
        while True:
            index = 0
            for tblname in self.locks:
                self.conn.execute("lock tables " + tblname + " write")
            r = self.query[start:start + self.n]
            for i in r:
                index = index + 1
                yield i
            if index < self.n:
                break
            else:
                start += self.n
        if self.locks:
            self.conn.execute("unlock tables")

session = SDY.Session()
conn = SDY.serendipity_engine.connect()
analysis = SDY.Analysis()
ans = {}
session.add(analysis)

locks = ["intervals", "entities"]
q = session.query(SDY.Sentence).order_by(SDY.Sentence.sentence_id)
for s in iterresults(q, conn, locks):
    results = go(s)
    conn.execute(entities_table.insert(), results)

I get the following exception:

sqlalchemy.exc.OperationalError: (OperationalError) (1100, "Table 'entities' was not locked with LOCK TABLES") u'INSERT INTO entities (interval_id, type) VALUES (%s, %s, %s, %s, %s)' [29046L, 'FOO']

What am I doing wrong? Is there something about the way conn.execute and insert operate? And can I find a way around this?

Nathan