Hey William,
Thanks for pointing this inconsistency out.
Best Regards,
Myrle
On Fri, Aug 9, 2019 at 12:22 AM William Shen wrote:
> Not sure where this thread is heading toward, but I find the role
> definition listed on
> http://www.apache.org/foundation/how-it-works.html#roles clarifying
Switched to immutable.Set and it works. This is weird, as the code in
ScalaReflection.scala seems to support scala.collection.Set.
cc: dev list, in case this is a bug
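For context (my illustration, not from the thread): scala.collection.Set is the common supertrait of both mutable and immutable sets, and the workaround described above is to switch to the immutable type explicitly. A minimal plain-Scala sketch of that conversion, with no Spark involved:

```scala
// scala.collection.Set is the common supertrait of both mutable and
// immutable sets, so a value of that type may be backed by a mutable set.
val general: scala.collection.Set[Int] = scala.collection.mutable.Set(1, 2, 3)

// The workaround from the thread: materialize an immutable Set explicitly
// (Predef's Set is scala.collection.immutable.Set).
val imm: Set[Int] = general.toSet

println(imm) // an immutable Set containing 1, 2, 3
```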
On Thu, Aug 8, 2019 at 8:41 PM Mohit Jaggi wrote:
> Is this not supported? I found this diff
>
Hi Sean,
To finish the job, I did need to set spark.stage.maxConsecutiveAttempts to a
large number (e.g., 100), following a suggestion from Jiang Xingbo.
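For anyone wanting to try the same setting, a sketch (not from the thread) of passing it via spark-submit; `your-app.jar` is a placeholder:

```shell
# Raise the per-stage retry limit (the default is small) so repeated fetch
# failures from killed executors don't abort the whole job.
spark-submit \
  --conf spark.stage.maxConsecutiveAttempts=100 \
  your-app.jar
```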
I haven't seen any recent movement/PRs on this issue, but I'll see if we can
repro with a more recent version of Spark.
Best regards,
Tyson
Interesting, but I'd put this on the JIRA, and also test vs. master
first. It's entirely possible this is something else that was
subsequently fixed, and maybe even backported for 2.4.4.
(I can't quite reproduce it; it just makes the second job fail, which is
also puzzling.)
On Fri, Aug 9, 2019 at
Hi,
We are able to reproduce this bug in Spark 2.4 using the following program:
import scala.sys.process._
import org.apache.spark.TaskContext
val res = spark.range(0, 1 * 1, 1).map { x => (x % 1000, x) }.repartition(20)
res.distinct.count
// kill an executor in the stage
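For anyone unfamiliar with the scala.sys.process import at the top of the repro: it lets code shell out to an external command, which is presumably how the elided kill step is done from inside a task (the actual kill command is not shown in the original). A harmless standalone illustration of the API:

```scala
import scala.sys.process._

// sys.process runs a Seq as an external command and !! captures its stdout.
// A repro would use this from inside a task to kill its executor's JVM;
// here we run a harmless echo instead.
val output = Seq("echo", "simulated kill").!!.trim

println(output) // prints "simulated kill"
```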