Hello,

I have a very large data set spread over multiple indexes, and I want to 
grab each record and transform it into another index. Reading the docs 
points me towards scan & scroll followed by bulk indexing. What concerns 
me is failure during this copy: it seems there is no way to 'resume' the 
job if it fails in the middle. Based on some initial tests, this copy 
will take a long time to run, and I wonder if I have overlooked some option 
or am not thinking of something. The only idea I have is persisting the 
scroll IDs and keeping them open for an extended period of time, 
but the downside is that this would have a strong negative impact on ES memory.
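To make the question concrete, here is a rough, ES-agnostic sketch of the 
checkpointing I have in mind (all names are mine): a generic page-by-page 
copy loop that persists a cursor to disk after every bulk write, so a 
crashed run can pick up where it left off. In practice `fetch_page` would 
wrap a scan & scroll request and `write_batch` would wrap the bulk API; 
the open question is whether the cursor can be anything other than a 
scroll ID that ES has to keep alive.

```python
import json
import os
import tempfile

def resumable_copy(fetch_page, write_batch, checkpoint_path):
    """Copy records page by page, persisting a cursor after each batch
    so an interrupted run can resume instead of starting over.

    fetch_page(cursor) -> (next_cursor, docs)  # e.g. one scan & scroll page
    write_batch(docs)                          # e.g. one bulk index request
    """
    cursor, copied = None, 0
    # Resume from a previous run if a checkpoint file exists.
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            state = json.load(f)
        cursor, copied = state["cursor"], state["copied"]
    while True:
        cursor, docs = fetch_page(cursor)
        if not docs:
            break
        write_batch(docs)
        copied += len(docs)
        # Checkpoint only after the batch is safely written.
        with open(checkpoint_path, "w") as f:
            json.dump({"cursor": cursor, "copied": copied}, f)
    # Clean up the checkpoint once the copy completes.
    if os.path.exists(checkpoint_path):
        os.remove(checkpoint_path)
    return copied

# Tiny self-test with an in-memory "index" standing in for ES.
source = list(range(10))
dest = []

def fetch_page(cursor):
    start = cursor or 0
    return start + 3, source[start:start + 3]

ckpt = os.path.join(tempfile.gettempdir(), "copy_checkpoint.json")
copied = resumable_copy(fetch_page, dest.extend, ckpt)
```

If the cursor could instead be the last value of a sorted field (say, a 
timestamp), resuming would just mean reissuing the scan with a range filter 
above that value, and nothing would need to stay open on the ES side. Is 
that a sane alternative to persisting scroll IDs, or am I missing a gotcha?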

Thanks,
Barry

-- 
You received this message because you are subscribed to the Google Groups 
"elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/elasticsearch/4ce039ea-e44b-4ab6-acb0-ed7e4dfea35e%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.