Re: spark.local.dir leads to Job cancelled because SparkContext was shut down

2015-03-04 Thread Akhil Das
When you say multiple directories, make sure those directories actually exist and that Spark has permission to write to them. You can look at the worker logs to see the exact reason for the failure.

Thanks
Best Regards
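The check Akhil suggests can be scripted. Below is a minimal sketch that walks a comma-separated spark.local.dir value and reports whether each entry exists and is writable by the current user; the two /tmp paths are hypothetical placeholders, not paths from the original thread.

```shell
#!/bin/sh
# Hypothetical spark.local.dir value; substitute your real disks.
LOCAL_DIRS="/tmp/spark-local-1,/tmp/spark-local-2"

# Create the example dirs so this sketch runs anywhere; on a real
# cluster the dirs would already exist on each worker.
mkdir -p /tmp/spark-local-1 /tmp/spark-local-2

ok=1
IFS=','
for d in $LOCAL_DIRS; do
    if [ -d "$d" ] && [ -w "$d" ]; then
        echo "OK:   $d"
    else
        echo "FAIL: $d (missing or not writable)"
        ok=0
    fi
done
[ "$ok" -eq 1 ] && echo "all local dirs usable"
```

Run it as the same user the Spark worker runs as, on every worker node: a directory that is writable for your login user can still be unwritable for the worker process.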

spark.local.dir leads to Job cancelled because SparkContext was shut down

2015-03-03 Thread lisendong
As long as I set spark.local.dir to multiple disks, the job fails with the error below. (If I set spark.local.dir to only one dir, the job succeeds...)

Exception in thread main org.apache.spark.SparkException: Job cancelled because SparkContext was shut down at
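For reference, multiple disks are given to spark.local.dir as a single comma-separated list, e.g. in conf/spark-defaults.conf (the mount points below are hypothetical examples, not the poster's actual paths):

```
# conf/spark-defaults.conf
# Each entry should be on a separate physical disk, and every listed
# directory must exist and be writable by the Spark worker user on
# every node, or the worker can fail and take the SparkContext down.
spark.local.dir  /mnt/disk1/spark-tmp,/mnt/disk2/spark-tmp
```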