I have to run a script on 2MM objects to update the database. Not really a
schema migration, more like changing the internal data representation in
the fields.
There's a bit of post-processing and bottlenecks involved, so doing
everything one-at-a-time will take a few days.
I'd like to split the work up and run it in parallel.
On 10/14/15 12:55 PM, jason kirtland wrote:
If you can partition the rows numerically, this is trivially easy to
implement using redis as the orchestrator.
For example, if you have integer PKs, you might have a loop like:
offset = 0
while offset < tablesize:
    for row in query[offset:offset + batchsize]:
        migrate(row)
    offset += batchsize
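The loop above covers a single worker; a minimal sketch of the redis-as-orchestrator idea for several workers might look like the following. The `claim_batches` helper, `LocalCounter` stand-in, and the `migrate:offset` key name are my own inventions for illustration, not anything from the thread; a real deployment would pass a `redis.Redis()` connection instead of `LocalCounter`.

```python
# Sketch only: each worker claims a disjoint [start, end) row range by
# atomically incrementing a shared counter. redis.Redis.incrby(name,
# amount) returns the value *after* the increment, so every call hands
# back a unique range with no further coordination between workers.
# LocalCounter mimics that one method so the pattern runs standalone.

def claim_batches(counter, key, tablesize, batchsize):
    """Yield (start, end) row ranges until the whole table is claimed."""
    while True:
        end = counter.incrby(key, batchsize)  # atomic on real Redis
        start = end - batchsize
        if start >= tablesize:
            return                            # nothing left to claim
        yield start, min(end, tablesize)

class LocalCounter:
    """Single-process stand-in for redis.Redis, incrby() only."""
    def __init__(self):
        self.values = {}
    def incrby(self, key, amount=1):
        self.values[key] = self.values.get(key, 0) + amount
        return self.values[key]

# One worker draining a 10-row table in batches of 4:
counter = LocalCounter()
for start, end in claim_batches(counter, "migrate:offset", 10, 4):
    print(start, end)   # real worker: migrate() each row in the range
```

Because INCRBY is atomic on the server, any number of workers can run this loop against the same key and never claim overlapping ranges; a worker simply exits once the counter passes the table size.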