Re: Kafka Spark Streaming integration : Relationship between DStreams and Tasks

2019-05-12 Thread Sheel Pancholi
Hello, can anyone help me understand this? We work with the Receiver-based approach and are trying to move to the Direct approach. There is no problem as such in moving from the former to the latter; I am just trying to understand the inner details bottom up. Please help. Regards, Sheel On Mon 13 May,
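For context on the move described above, a minimal sketch of the Direct approach using the spark-streaming-kafka-0-10 integration (broker address, group id, and topic name are placeholders, not from the thread):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

val conf = new SparkConf().setAppName("direct-example")
val ssc  = new StreamingContext(conf, Seconds(5))

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",          // placeholder
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "example-group",          // placeholder
  "auto.offset.reset"  -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

// Direct approach: no receiver; each batch reads offset ranges
// straight from Kafka, one RDD partition per Kafka partition.
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("my-topic"), kafkaParams)
)

stream.map(record => record.value).print()
ssc.start()
ssc.awaitTermination()
```

Unlike the Receiver approach, no cores are tied up by long-running receivers and offsets can be tracked by the application itself.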

Re: Streaming job, catch exceptions

2019-05-12 Thread bsikander
>> Code would be very helpful
I will try to put together something to post here.
>> 1. Writing in Java
I am using Scala.
>> Wrapping the entire app in a try/catch
Once the SparkContext object is created, a Future is started where actions and transformations are defined and the streaming context is
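A minimal sketch of the setup bsikander describes (SparkContext created, then a Future in which the streaming graph is defined and started); the app name and batch interval are assumptions, not from the thread:

```scala
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import scala.util.{Failure, Success}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("streaming-example") // assumed name
val ssc  = new StreamingContext(conf, Seconds(5))          // assumed interval

val job: Future[Unit] = Future {
  // transformations and actions are defined here, then:
  ssc.start()
  ssc.awaitTermination() // failures in the running job surface here
}

job.onComplete {
  case Failure(e) =>
    // the streaming job died; log the cause and shut down
    ssc.stop(stopSparkContext = true, stopGracefully = false)
  case Success(_) => ()
}
```

With this shape, a try/catch around the DStream definitions alone will not see runtime failures; they arrive through awaitTermination (here, via the Future's Failure case).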

Re: Streaming job, catch exceptions

2019-05-12 Thread Jason Nerothin
Code would be very helpful, but it *seems like* you are: 1. Writing in Java, 2. Wrapping the *entire app* in a try/catch, 3. Executing in local mode. The code that is throwing the exceptions is not executed locally in the driver process; Spark is executing the failing code on the cluster. On Sun,
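To illustrate Jason's point about driver vs. cluster execution, a hedged sketch (`riskyParse` is a hypothetical user function, `stream`/`ssc` are assumed to exist as in the thread):

```scala
// This try/catch only guards the *definition* of the graph on the
// driver. The lambda inside map runs later, on executors, per batch,
// so exceptions thrown by riskyParse are NOT caught here.
try {
  stream.map(record => riskyParse(record)).print()
} catch {
  case e: Exception => // reached only if building the graph itself fails
}

// Executor-side failures are reported back to the driver when the job
// runs, and propagate as a SparkException out of awaitTermination, so
// that is where a catch actually sees them:
try {
  ssc.start()
  ssc.awaitTermination()
} catch {
  case e: Exception => // streaming job failed; inspect e.getCause, clean up
}
```

This is why wrapping "the entire app" in a try/catch can appear to do nothing: the failing code never runs inside that block's dynamic scope.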

Re: Streaming job, catch exceptions

2019-05-12 Thread bsikander
Hi, Anyone? This should be a straightforward one :)

Kafka Spark Streaming integration : Relationship between DStreams and Tasks

2019-05-12 Thread Sheel Pancholi
Hello Everyone, I am trying to understand the internals of Spark Streaming (not Structured Streaming), specifically the way tasks see the DStream. I am going over the Spark source code in Scala, here . I understand the call stack: CoarseGrainedExecutorBackend
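On the DStream/task relationship the question asks about, a small hedged sketch: a DStream is a template from which one RDD is materialized per batch interval, and the scheduler then creates one task per partition of that RDD (the launch ultimately reaching executors via CoarseGrainedExecutorBackend). `stream` is assumed to be any DStream:

```scala
// Tasks never see "the DStream"; they see the RDD generated for one
// batch. foreachRDD exposes exactly that per-batch RDD:
stream.foreachRDD { (rdd, batchTime) =>
  // one stage over this RDD will run getNumPartitions tasks
  println(s"batch $batchTime: ${rdd.getNumPartitions} partitions")
}
```

So the DStream lives only on the driver; what is shipped to executors is the serialized closure plus a partition of the batch's RDD.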