In my Spark Streaming job, one batch failed with the error below. I haven't
seen anything like this before.

It happened during the shuffle phase of a single batch (and has not
reoccurred since). Just curious what can cause this error. I am running
Spark 1.5.1.

Regards,
Dibyendu


Job aborted due to stage failure: Task 2801 in stage 9421.0 failed 4
times, most recent failure: Lost task 2801.3 in stage 9421.0:
java.lang.IllegalArgumentException: requirement failed: File segment
length cannot be negative (got -68321)
        at scala.Predef$.require(Predef.scala:233)
        at org.apache.spark.storage.FileSegment.<init>(FileSegment.scala:28)
        at org.apache.spark.storage.DiskBlockObjectWriter.fileSegment(DiskBlockObjectWriter.scala:216)
        at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:684)
        at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:80)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
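
For reference, I looked at where this comes from: the require() in
FileSegment is what fires, and DiskBlockObjectWriter.fileSegment() computes
the length it checks. A minimal sketch, reconstructed from my reading of
the 1.5 sources (simplified, not the verbatim Spark code):

import java.io.File

class FileSegment(val file: File, val offset: Long, val length: Long) {
  // This is the require() at FileSegment.scala:28 in the trace above.
  require(length >= 0, s"File segment length cannot be negative (got $length)")
}

class DiskBlockObjectWriter(val file: File) {
  var initialPosition: Long = 0L  // file position when the writer opened
  var finalPosition: Long = 0L    // file position recorded on commit/close

  // Length is finalPosition - initialPosition, so a negative value means
  // the committed end position ended up *behind* the start position, i.e.
  // the shuffle file somehow shrank between open and commit (one plausible
  // cause would be another attempt of the same task touching the same file,
  // but that is my speculation).
  def fileSegment(): FileSegment =
    new FileSegment(file, initialPosition, finalPosition - initialPosition)
}

So -68321 would mean the writer's commit position was 68321 bytes before
where it started, which is what I am trying to understand.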
