Hi,

As far as I know, with the current Spark Streaming Kafka connector there is no 
way for the application user to detect that kind of failure directly. It is 
handled either by the Kafka consumer together with the ZooKeeper coordinator, 
or by the ReceiverTracker in Spark Streaming, so you should not need to handle 
this issue from the user's perspective.
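
That said, if you want some visibility on the application side, Spark Streaming 
does expose a StreamingListener with receiver lifecycle callbacks that surface 
what the ReceiverTracker reports. A minimal sketch (the app name, batch 
interval, and println logging are just placeholders for illustration):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.scheduler._

val conf = new SparkConf().setAppName("ReceiverMonitor")
val ssc = new StreamingContext(conf, Seconds(2))

// Register a listener so receiver errors/stops reported by the
// ReceiverTracker are visible to the application.
ssc.addStreamingListener(new StreamingListener {
  override def onReceiverError(e: StreamingListenerReceiverError): Unit =
    println(s"Receiver error: ${e.receiverInfo}")
  override def onReceiverStopped(s: StreamingListenerReceiverStopped): Unit =
    println(s"Receiver stopped: ${s.receiverInfo}")
})

// ... create the Kafka input stream and transformations here ...

ssc.start()
ssc.awaitTermination()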

If there are no new messages for the consumer, it will simply wait.
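
In that case the batches just arrive empty, which is what the "empty RDD issue" 
in the subject refers to. If the goal is to skip side effects on those empty 
batches, a common pattern is to guard inside foreachRDD. A minimal sketch, 
assuming a DStream[String] of Kafka messages (take(1) is used instead of 
count() so the emptiness check stays cheap; the helper name is hypothetical):

import org.apache.spark.streaming.dstream.DStream

// Hypothetical helper: skip empty micro-batches before doing real work,
// e.g. writing to an external store.
def processNonEmpty(messages: DStream[String]): Unit = {
  messages.foreachRDD { rdd =>
    // take(1) avoids a full scan just to test for emptiness.
    if (rdd.take(1).nonEmpty) {
      rdd.foreachPartition { records =>
        records.foreach(println) // replace with the real sink
      }
    }
    // else: nothing arrived in this batch interval; just wait for the next one.
  }
}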

Thanks
Jerry

-----Original Message-----
From: Hafiz Mujadid [mailto:hafizmujadi...@gmail.com] 
Sent: Thursday, December 4, 2014 2:47 PM
To: u...@spark.incubator.apache.org
Subject: Spark Streaming empty RDD issue

Hi Experts,
I am using Spark Streaming to integrate Kafka for real-time data processing.
I am facing some issues with Spark Streaming, so I want to know how we can 
detect:
1) Our connection has been lost
2) Our receiver is down
3) Spark Streaming has no new messages to consume.

How can we deal with these issues?

I would be glad to hear from you and would appreciate any help.






---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
