Disable speculative retry only for specific stages?

2016-01-22 Thread Adam McElwee
I've used speculative execution a couple of times in the past with good results, but I have one stage in my job with a non-idempotent operation in a `foreachPartition` block. I don't see a way to disable speculative retry for specific stages; does anyone know of any tricks to help out here?
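Since Spark exposes no per-stage speculation switch, one common workaround is to make the write itself idempotent so a speculative duplicate of the task is harmless. Below is a minimal sketch of that idea: the `upsert` stub is hypothetical and stands in for whatever external store the job writes to; keys are derived deterministically from the partition and row position, so a duplicate task attempt overwrites rather than double-applies.

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

object IdempotentForeachPartition {
  // Hypothetical stand-in for an external store with upsert semantics;
  // replace with your real sink client.
  def upsert(key: String, value: Int): Unit = println(s"upsert $key -> $value")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("idempotent-foreachPartition")
      .setMaster("local[*]")
      .set("spark.speculation", "true") // speculation stays on globally

    val sc = new SparkContext(conf)
    val rdd = sc.parallelize(1 to 100, numSlices = 8)

    rdd.foreachPartition { iter =>
      val part = TaskContext.get().partitionId()
      iter.zipWithIndex.foreach { case (value, i) =>
        // Deterministic key per (partition, row): a speculative duplicate
        // of this task emits the same keys, so the sink sees an overwrite
        // instead of a second insert.
        upsert(s"part-$part-row-$i", value)
      }
    }

    sc.stop()
  }
}
```

This only helps if the sink supports upsert or overwrite semantics; for append-only sinks the usual alternative is to write to a task-attempt-scoped location and commit atomically on success.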

Re: How to connect HadoopHA from spark

2015-10-01 Thread Adam McElwee
Do you have all of the required HDFS HA config options in your override? I think these are the minimum required for HA:

  dfs.nameservices
  dfs.ha.namenodes.{nameservice ID}
  dfs.namenode.rpc-address.{nameservice ID}.{name node ID}
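A minimal sketch of setting these options programmatically on the driver's Hadoop configuration follows; the nameservice ID (`mycluster`), NameNode IDs (`nn1`, `nn2`), and hostnames are placeholders to be copied from the cluster's actual hdfs-site.xml. In practice `dfs.client.failover.proxy.provider.{nameservice ID}` is typically also needed so the client knows how to fail over.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object HdfsHaAccess {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("hdfs-ha-example").setMaster("local[*]"))

    val hc = sc.hadoopConfiguration
    // Placeholder values; use the entries from your cluster's hdfs-site.xml.
    hc.set("dfs.nameservices", "mycluster")
    hc.set("dfs.ha.namenodes.mycluster", "nn1,nn2")
    hc.set("dfs.namenode.rpc-address.mycluster.nn1", "nn1.example.com:8020")
    hc.set("dfs.namenode.rpc-address.mycluster.nn2", "nn2.example.com:8020")
    // Usually also required so the HDFS client can fail over between NameNodes:
    hc.set("dfs.client.failover.proxy.provider.mycluster",
      "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

    // Paths now reference the logical nameservice, not a single NameNode host.
    sc.textFile("hdfs://mycluster/data/input").take(5).foreach(println)

    sc.stop()
  }
}
```

The same keys can instead be supplied via an hdfs-site.xml on the classpath or with `--conf spark.hadoop.dfs.nameservices=...`-style overrides at submit time.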