Hi Akhil,

I suppose this will give me the transformed message and not the original one.

I need the data corresponding to msgStream, not wordCountPair.

As per my understanding, we need to keep a copy of the incoming stream (not sure 
how), so that we can refer to it in the catch block.
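One way to keep the original around (a plain-Scala sketch, outside Spark; `WordCount`, `parseWordCount`, and the comma-separated message format are illustrative assumptions, not from this thread) is to carry each original message alongside the `Try` of its transformation, so a failure still has the raw input next to it:

```scala
import scala.util.{Try, Failure}

// Hypothetical transformed record type and parse step.
case class WordCount(word: String, count: Int)

def parseWordCount(msg: String): WordCount = {
  val Array(w, c) = msg.split(",")
  WordCount(w.trim, c.trim.toInt)
}

// Pair each original message with the Try of its transformation,
// so a failed transformation still carries the raw message.
def processAll(messages: Seq[String]): Seq[String] = {
  val pairs = messages.map(msg => (msg, Try(parseWordCount(msg))))
  pairs.collect {
    case (orig, Failure(_)) => orig // the raw messages that failed
  }
}
```

In the streaming version, the same pairing could be done inside the transform step, so that (original, result) pairs flow into foreachRDD together.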

Regards,
Sam

From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, September 16, 2015 12:24 PM
To: Samya MAITI <samya.ma...@amadeus.com>
Cc: user@spark.apache.org
Subject: Re: Getting parent RDD

How many RDDs does that stream have? If it's a single RDD, then you 
could do a .foreach and log the message, something like:


val ssc = ...
val msgStream = ...   // SparkKafkaDirectAPI
val wordCountPair = TransformStream.transform(msgStream)
wordCountPair.foreach( msg =>
  try {
    // Some action that causes exception
  } catch {
    case ex1: Exception =>
      // How to get hold of the msgStream, so that I can log the
      // actual message that caused the exception.
      Log.error("Whoops! This message :=> " + msg)
  }
)


Thanks
Best Regards

On Tue, Sep 15, 2015 at 9:13 PM, Samya <samya.ma...@amadeus.com> wrote:
Hi Team

I have the situation below.

val ssc = ....
val msgStream = .....   //SparkKafkaDirectAPI
val wordCountPair = TransformStream.transform(msgStream)
wordCountPair.foreachRDD(rdd =>
  try {
    // Some action that causes exception
  } catch {
    case ex1: Exception =>
      // How to get hold of the msgStream, so that I can log the
      // actual message that caused the exception?
  }
)
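A related variant (again a plain-Scala sketch; the collection stands in for the RDD and the count-parsing step is an illustrative assumption) moves the try/catch inside the per-record loop, so the record that caused the exception is still in scope when the handler fires:

```scala
import scala.collection.mutable.ListBuffer

// Per-record error handling: each iteration still holds the raw
// record, so it can be logged (here, collected) on failure.
def handleRecords(records: Seq[String]): List[String] = {
  val failed = ListBuffer[String]()
  records.foreach { record =>
    try {
      // Some action that may throw, here parsing a count field.
      record.split(",")(1).trim.toInt
    } catch {
      case e: Exception =>
        // The raw record is available right here.
        failed += record
    }
  }
  failed.toList
}
```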


Regards,
Sam



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Getting-parent-RDD-tp24701.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
