Re: Missing parents for stage (Spark Streaming)

2014-11-22 Thread Sean Owen
This message appears in normal operation. I do not think it refers to
anything in your code.
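
For reference, these DAGScheduler messages are emitted at INFO level during normal stage submission, so if they are noisy they can be suppressed with a log4j override. A minimal sketch (assumes a log4j 1.x `log4j.properties` in your Spark conf directory; adjust the path and level to taste):

```properties
# Raise only the DAGScheduler logger above INFO so the
# "waiting/failed/Missing parents/Submitting Stage" lines are suppressed,
# while leaving the rest of Spark's logging untouched.
log4j.logger.org.apache.spark.scheduler.DAGScheduler=WARN
```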
On Nov 21, 2014 11:57 PM, YaoPau jonrgr...@gmail.com wrote:

 When I submit a Spark Streaming job, I see these INFO logs printing
 frequently:

 14/11/21 18:53:17 INFO DAGScheduler: waiting: Set(Stage 216)
 14/11/21 18:53:17 INFO DAGScheduler: failed: Set()
 14/11/21 18:53:17 INFO DAGScheduler: Missing parents for Stage 216: List()
 14/11/21 18:53:17 INFO DAGScheduler: Submitting Stage 216 (MappedRDD[1733]
 at map at MappedDStream.scala:35), which is now runnable

 I have a feeling this means there is some error with a Map I created as a
 broadcast variable, but I'm not sure. How can I figure out what this is
 referring to?




 --
 View this message in context:
 http://apache-spark-user-list.1001560.n3.nabble.com/Missing-parents-for-stage-Spark-Streaming-tp19530.html
 Sent from the Apache Spark User List mailing list archive at Nabble.com.

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




Missing parents for stage (Spark Streaming)

2014-11-21 Thread YaoPau
When I submit a Spark Streaming job, I see these INFO logs printing
frequently:

14/11/21 18:53:17 INFO DAGScheduler: waiting: Set(Stage 216)
14/11/21 18:53:17 INFO DAGScheduler: failed: Set()
14/11/21 18:53:17 INFO DAGScheduler: Missing parents for Stage 216: List()
14/11/21 18:53:17 INFO DAGScheduler: Submitting Stage 216 (MappedRDD[1733]
at map at MappedDStream.scala:35), which is now runnable

I have a feeling this means there is some error with a Map I created as a
broadcast variable, but I'm not sure. How can I figure out what this is
referring to?



