Re: job reports as KILLED in standalone mode

2013-10-18 Thread Ameet Kini
Gotcha, so it's expected behavior. Thanks, Aaron. Ameet On Fri, Oct 18, 2013 at 12:10 PM, Aaron Davidson wrote: > Whenever an Executor ends, it enters into one of three states: KILLED, FAILED, LOST (see: 1) …

Re: job reports as KILLED in standalone mode

2013-10-18 Thread Aaron Davidson
Whenever an Executor ends, it enters into one of three states: KILLED, FAILED, LOST (see: 1). None of these sound like "exited cleanly," which I a…
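
(For illustration only: the sketch below is not Spark's actual ExecutorState source, which the "1" link above points at; it just mirrors the three end states Aaron names and his point that none of them means a clean exit.)

    // Illustrative sketch only, not Spark's ExecutorState source.
    sealed trait ExecutorEndState
    case object Killed extends ExecutorEndState // told to stop (includes normal teardown)
    case object Failed extends ExecutorEndState // exited abnormally on its own
    case object Lost   extends ExecutorEndState // contact with the worker was lost

    object ExecutorEndState {
      // Hypothetical helper making the point explicit: no end state means
      // "exited cleanly", so a successful job's executors still show up
      // as KILLED once the driver shuts them down.
      def describe(s: ExecutorEndState): String = s match {
        case Killed => "terminated on request"
        case Failed => "exited with an error"
        case Lost   => "lost contact with worker"
      }
    }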

Re: job reports as KILLED in standalone mode

2013-10-18 Thread Ameet Kini
Jey, I don't see a "close()" method on SparkContext (http://spark.incubator.apache.org/docs/latest/api/core/index.html#org.apache.spark.SparkContext). I tried the "stop()" method but still see the job reported as KILLED. BTW, I don't recall getting this behavior in 0.7.3; my standalone programs use…

Re: job reports as KILLED in standalone mode

2013-10-17 Thread Jey Kottalam
You can try calling the "close()" method on your SparkContext, which should allow for a cleaner shutdown. On Thu, Oct 17, 2013 at 2:38 PM, Ameet Kini wrote: > I'm using the Scala 2.10 branch of Spark in standalone mode, and am seeing the job report itself as KILLED in the UI with the below m…

job reports as KILLED in standalone mode

2013-10-17 Thread Ameet Kini
I'm using the Scala 2.10 branch of Spark in standalone mode, and am seeing the job report itself as KILLED in the UI with the message below in each of the executors' logs, even though the job processes correctly and returns the correct result. The job is triggered by a .count on an RDD, and the count…
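
(For reference, a minimal sketch of the kind of driver described in this thread, tying together the advice upthread: the master URL, app name, and program shape below are assumptions, not the poster's actual code. As the thread concludes, even with an explicit stop(), the standalone UI records the executors as KILLED once they are torn down, and that is expected behavior.)

    import org.apache.spark.SparkContext

    // Hypothetical minimal driver; master URL and app name are placeholders.
    object CountJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("spark://master-host:7077", "CountJob")
        try {
          // The action that triggers the job, as in the original report.
          val n = sc.parallelize(1 to 100000).count()
          println("count = " + n)
        } finally {
          // SparkContext has stop(), not close(); the executors will still
          // be recorded as KILLED in the standalone UI after this.
          sc.stop()
        }
      }
    }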