Re: spark.local.dir leads to Job cancelled because SparkContext was shut down

2015-03-04 Thread Akhil Das
As long as I set spark.local.dir to multiple disks, the job fails with the errors below (if I set spark.local.dir to only one directory, the job succeeds): Exception in thread "main" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down

spark.local.dir leads to Job cancelled because SparkContext was shut down

2015-03-03 Thread lisendong
As long as I set spark.local.dir to multiple disks, the job fails with the errors below (if I set spark.local.dir to only one directory, the job succeeds): Exception in thread "main" org.apache.spark.SparkException: Job cancelled because SparkContext was shut down
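For context, spark.local.dir accepts a comma-separated list of directories that Spark uses for shuffle output and spilled data. A minimal sketch of the two configurations described above might look like this in spark-defaults.conf (the mount points are hypothetical; on YARN, yarn.nodemanager.local-dirs takes precedence over this setting):

```
# spark-defaults.conf (hypothetical mount points)

# Single directory -- the configuration that reportedly succeeds:
# spark.local.dir  /data1/spark/tmp

# Multiple disks, comma-separated -- the configuration that reportedly fails:
spark.local.dir  /data1/spark/tmp,/data2/spark/tmp,/data3/spark/tmp
```

Every listed directory must exist and be writable by the Spark process on every worker node, which is a common reason a multi-directory setting fails where a single directory works.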

Job cancelled because SparkContext was shut down - failures!

2014-10-24 Thread Sadhan Sood
Hi, Trying to run a query on spark-sql, but it keeps failing with this error on the CLI (we are running spark-sql on a YARN cluster): org.apache.spark.SparkException: Job cancelled because SparkContext was shut down at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop

Re: Job cancelled because SparkContext was shut down

2014-09-26 Thread jamborta
It seems YARN kills some of the executors because they request more memory than expected. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Job-cancelled-because-SparkContext-was-shut-down-tp15189p15216.html Sent from the Apache Spark User List mailing list archive
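When YARN kills executors for exceeding their container allocation, the usual remedy in Spark 1.x (the era of this thread) is to raise the off-heap overhead that YARN budgets on top of the executor heap. A hedged sketch, with illustrative values only:

```
# spark-defaults.conf (illustrative values, not a recommendation)

# Executor JVM heap size:
spark.executor.memory                2g

# Extra off-heap memory per executor, in MB (Spark 1.x property name).
# YARN kills the container when heap plus overhead is exceeded, which
# surfaces as the SparkContext shutting down and jobs being cancelled.
spark.yarn.executor.memoryOverhead   768
```

If the overhead is too small for the workload, the NodeManager logs will show containers being killed for running beyond their memory limits, matching the symptom described above.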

Job cancelled because SparkContext was shut down

2014-09-25 Thread jamborta
executor to shut down [E 140926 01:00:13 base:56] Request failed 14/09/26 01:00:13 INFO YarnClientSchedulerBackend: Stopped [E 140926 01:00:13 base:57] {'error_msg': type 'exceptions.Exception', org.apache.spark.SparkException: Job cancelled because SparkContext was shut down, traceback object