Hi Team,

I am getting the exception below in my Spark jobs. Please let me know how to
fix this issue.

Below is my cluster configuration:

I am using SparkJobServer to trigger the jobs. This is my configuration in
SparkJobServer:

   - num-cpu-cores = 4
   - memory-per-node = 4G

I have 4 workers in my cluster.
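For reference, here is how those two settings sit in my job server config. This is only a sketch based on the standard spark-jobserver HOCON layout (the surrounding `spark.context-settings` keys follow the stock local.conf template, not anything beyond what I listed above):

```hocon
# spark-jobserver context defaults (HOCON sketch, per the stock template)
spark {
  context-settings {
    num-cpu-cores = 4     # cores allocated per Spark context
    memory-per-node = 4G  # executor memory per worker node
  }
}
```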


"result": {
    "errorClass": "org.apache.spark.SparkException",
    "cause":
"/tmp/spark-31a538f3-9451-4a2d-9123-00feb56c9e91/executor-73be6ffd-cd03-452a-bc99-a44290953d4f/spark-d0630f1f-e3df-4714-af30-4c839f6e3e8a/9400069401471754061530_lock
(No such file or directory)",
    "stack": ["java.io.RandomAccessFile.open0(Native Method)",
"java.io.RandomAccessFile.open(RandomAccessFile.java:316)",
"java.io.RandomAccessFile.<init>(RandomAccessFile.java:243)",
"org.apache.spark.util.Utils$.fetchFile(Utils.scala:373)",
"org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)",
"org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)",
"scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)",
"scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)",
"scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)",
"scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)",
"scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)",
"scala.collection.mutable.HashMap.foreach(HashMap.scala:98)",
"scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)",
"org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)",
"org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)",
"java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)",
"java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)",
"java.lang.Thread.run(Thread.java:745)"],
    "causingClass": "java.io.FileNotFoundException",
    "message": "Job aborted due to stage failure: Task 0 in stage 1286.0
failed 4 times, most recent failure: Lost task 0.3 in stage 1286.0 (TID
39149, svcjo-prd911.cisco.com): java.io.FileNotFoundException:
/tmp/spark-31a538f3-9451-4a2d-9123-00feb56c9e91/executor-73be6ffd-cd03-452a-bc99-a44290953d4f/spark-d0630f1f-e3df-4714-af30-4c839f6e3e8a/9400069401471754061530_lock
(No such file or directory)\n\tat java.io.RandomAccessFile.open0(Native
Method)\n\tat
java.io.RandomAccessFile.open(RandomAccessFile.java:316)\n\tat
java.io.RandomAccessFile.<init>(RandomAccessFile.java:243)\n\tat
org.apache.spark.util.Utils$.fetchFile(Utils.scala:373)\n\tat
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)\n\tat
org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)\n\tat
scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)\n\tat
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)\n\tat
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)\n\tat
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)\n\tat
scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)\n\tat
scala.collection.mutable.HashMap.foreach(HashMap.scala:98)\n\tat
scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)\n\tat
org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)\n\tat
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)\n\tat
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)\n\tat
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)\n\tat
java.lang.Thread.run(Thread.java:745)\n\nDriver stacktrace:"
  },

Regards,
Rajesh
