Tien-Dung LE created SPARK-5499:
-----------------------------------

             Summary: iterative computing with 1000 iterations causes stage failure
                 Key: SPARK-5499
                 URL: https://issues.apache.org/jira/browse/SPARK-5499
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.2.0
            Reporter: Tien-Dung LE


I got an error "org.apache.spark.SparkException: Job aborted due to stage 
failure: Task serialization failed: java.lang.StackOverflowError" when 
executing an action on an RDD built by chaining 1000 transformations. The 
overflow happens while the task is being serialized, presumably because each 
transformation adds one level to the RDD lineage and the serializer recurses 
through all 1000 levels.

Here is a code snippet to reproduce the error:

  import org.apache.spark.rdd.RDD

  // Build a lineage of 1000 chained map transformations.
  var pair: RDD[(Long, Long)] = sc.parallelize(Array((1L, 2L)))
  var newPair: RDD[(Long, Long)] = null

  for (i <- 1 to 1000) {
    newPair = pair.map(_.swap)
    pair = newPair
  }

  // The action triggers task serialization over the entire lineage.
  println("Count = " + pair.count())




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
