I saw some exceptions like this in the driver log. Can you shed some light on it? Is it 
related to this behavior?

 

14/07/11 20:40:09 ERROR LiveListenerBus: Listener JobProgressListener threw an exception
java.util.NoSuchElementException: key not found: 64019
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at scala.collection.AbstractMap.default(Map.scala:58)
        at scala.collection.mutable.HashMap.apply(HashMap.scala:64)
        at org.apache.spark.ui.jobs.JobProgressListener.onStageCompleted(JobProgressListener.scala:78)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$2.apply(SparkListenerBus.scala:48)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$2.apply(SparkListenerBus.scala:48)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:79)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:79)
        at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:48)
        at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
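For what it's worth, the trace shows `mutable.HashMap.apply` throwing inside `JobProgressListener.onStageCompleted`, which means a stage-completed event referred to a stage id (64019) the listener was no longer tracking. Below is a minimal sketch of that failure mode; the map and method names are illustrative stand-ins, not the actual Spark fields:

```scala
import scala.collection.mutable

// Miniature of the pattern in the trace: per-stage state kept in a
// mutable map keyed by stage id, looked up with apply() on completion.
object ListenerSketch {
  private val stageIdToPool = mutable.HashMap[Int, String]()

  def onStageSubmitted(stageId: Int): Unit =
    stageIdToPool(stageId) = "default"

  // apply() throws NoSuchElementException for an absent key --
  // the same failure as "key not found: 64019" above.
  def onStageCompletedUnsafe(stageId: Int): String =
    stageIdToPool(stageId)

  // A defensive alternative: get() returns an Option instead of throwing.
  def onStageCompletedSafe(stageId: Int): Option[String] =
    stageIdToPool.get(stageId)
}
```

If the listener misses or drops the corresponding stage-submitted event (the listener bus queue is bounded), the completed stage's id is simply absent from the map, and the `apply`-style lookup blows up exactly like this. That could also leave the stage stuck in the "Active" list, since the bookkeeping that would move it to "Completed" never ran.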

 

________________________________

From: Haopu Wang 
Sent: Thursday, July 10, 2014 7:38 PM
To: user@spark.apache.org
Subject: RE: All of the tasks have been completed but the Stage is still shown 
as "Active"?

 

I didn't keep the driver's log. That's a lesson learned.

I will try to run it again to see if it happens again.

 

________________________________

From: Tathagata Das [mailto:tathagata.das1...@gmail.com] 
Sent: July 10, 2014 17:29
To: user@spark.apache.org
Subject: Re: All of the tasks have been completed but the Stage is still shown 
as "Active"?

 

Do you see any errors in the logs of the driver?

 

On Thu, Jul 10, 2014 at 1:21 AM, Haopu Wang <hw...@qilinsoft.com> wrote:

I've been running an app for hours in a standalone cluster. Judging from the data
injector and the "Streaming" tab of the web UI, it's running well.

However, I see quite a lot of Active stages in the web UI, even though some of them
have all of their tasks completed.

I attach a screenshot for your reference.

Do you ever see this kind of behavior?

 
