Hi,

Would you mind sharing the piece of code that caused this exception? As per
the Javadoc, NoSuchElementException is thrown when you call nextElement() on
an Enumeration (or next() on an Iterator, as in your trace) that has no more
elements.
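To illustrate, here is a minimal Java sketch (class name is just for the example) showing how exhausting an Enumeration and calling nextElement() again produces this exception:

```java
import java.util.Collections;
import java.util.Enumeration;
import java.util.NoSuchElementException;

public class NoSuchElementDemo {
    public static void main(String[] args) {
        // An Enumeration over a single element
        Enumeration<String> e =
                Collections.enumeration(Collections.singletonList("only"));

        e.nextElement(); // consumes the only element

        try {
            e.nextElement(); // nothing left -> NoSuchElementException
        } catch (NoSuchElementException ex) {
            System.out.println("caught: " + ex);
        }
    }
}
```

In your trace the same condition arises inside Spark's NextIterator when next() is called past the end of the stream, so seeing the calling code would help pin down where the iterator is being over-consumed.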



Thanks,
Best regards.


On Tue, Apr 22, 2014 at 8:50 AM, gogototo <wangbi...@gmail.com> wrote:

> 14/04/22 10:43:45 WARN scheduler.TaskSetManager: Loss was due to java.util.NoSuchElementException
> java.util.NoSuchElementException: End of stream
>         at org.apache.spark.util.NextIterator.next(NextIterator.scala:83)
>         at org.apache.spark.InterruptibleIterator.next(InterruptibleIterator.scala:29)
>         at org.apache.spark.graphx.impl.RoutingTable$$anonfun$1.apply(RoutingTable.scala:52)
>         at org.apache.spark.graphx.impl.RoutingTable$$anonfun$1.apply(RoutingTable.scala:51)
>         at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:450)
>         at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:450)
>         at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:34)
>         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:241)
>         at org.apache.spark.rdd.RDD.iterator(RDD.scala:232)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:161)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
>         at org.apache.spark.scheduler.Task.run(Task.scala:53)
>         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:213)
>         at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:49)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:744)
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-solve-this-problem-tp4579.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
