Hi,

I am running a GraphX application with heavy memory/CPU overhead. I think the memory 
is sufficient, and I have set the RDDs' StorageLevel to MEMORY_AND_DISK.
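
For reference, the persistence setting I mean is roughly the following (a minimal Scala sketch, assuming `sc` is the SparkContext; the input path and the GraphLoader-based construction are placeholders, not my actual job):

import org.apache.spark.graphx.{Graph, GraphLoader}
import org.apache.spark.storage.StorageLevel

// Keep both edge and vertex RDDs at MEMORY_AND_DISK so partitions that do not
// fit in memory spill to local disk instead of being dropped and recomputed.
val graph: Graph[Int, Int] = GraphLoader.edgeListFile(
  sc,
  "hdfs:///path/to/edges.txt",  // placeholder input path
  edgeStorageLevel = StorageLevel.MEMORY_AND_DISK,
  vertexStorageLevel = StorageLevel.MEMORY_AND_DISK)

// Equivalently, for a graph that is already built:
// graph.persist(StorageLevel.MEMORY_AND_DISK)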

But I found that some tasks failed with the following errors:

java.io.FileNotFoundException: /data/spark/local/spark-local-20150205151711-9700/09/rdd_3_275 (No such file or directory)

ExecutorLostFailure (executor 11 lost)


So eventually that stage failed with:

org.apache.spark.shuffle.FetchFailedException: java.io.FileNotFoundException: /data/spark/local/spark-local-20150205151711-587a/16/shuffle_11_219_0.index


Does anyone have any pointers? Where can I get more details about this issue?


Best,
Yifan LI




