I have to run a script on 2MM objects to update the database.  Not really a 
schema migration; more like changing the internal data representation of 
the fields.

There's some post-processing involved, plus a few bottlenecks, so processing 
everything one at a time will take a few days.

I'd like to split this out into 5-10 'task runners' that are each 
responsible for a section of the database (i.e., every 5th record).  That 
should cut the runtime considerably.
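
Roughly what I have in mind is the sketch below.  It's only a sketch, and it 
assumes things my real code may not match: an integer primary key `id`, a 
hypothetical MyObject mapped class, a hypothetical process_row() function for 
the post-processing, and a made-up connection URL.

    # Minimal sketch -- MyObject, process_row(), and the DB URL are placeholders.
    from multiprocessing import Process

    from sqlalchemy import create_engine
    from sqlalchemy.orm import Session

    from myapp.models import MyObject          # hypothetical mapped class with integer pk `id`
    from myapp.processing import process_row   # hypothetical per-object post-processing

    NUM_WORKERS = 5
    BATCH_SIZE = 1000

    def run_worker(worker_index):
        # Each worker builds its own engine; engines/connections don't survive fork().
        engine = create_engine("postgresql://localhost/mydb")

        # This worker's share of the table: every NUM_WORKERS-th record.
        with Session(engine) as session:
            ids = [
                pk for (pk,) in session.query(MyObject.id)
                .filter(MyObject.id % NUM_WORKERS == worker_index)
            ]

        # Process in batches so each transaction stays short.
        for start in range(0, len(ids), BATCH_SIZE):
            chunk = ids[start:start + BATCH_SIZE]
            with Session(engine) as session:
                for obj in session.query(MyObject).filter(MyObject.id.in_(chunk)):
                    process_row(obj)   # rewrite the object's internal field representation
                session.commit()

    if __name__ == "__main__":
        workers = [Process(target=run_worker, args=(i,)) for i in range(NUM_WORKERS)]
        for p in workers:
            p.start()
        for p in workers:
            p.join()

The modulo filter keeps the partitioning stateless, so a runner can be 
restarted without coordinating with the others, and the per-batch commits 
keep any single transaction from touching hundreds of thousands of rows.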

I thought I had seen a recipe for this somewhere, but I checked and couldn't 
find anything, which makes me question whether this is a good idea at all. 
Anyone have thoughts/pointers?

