Re: spark.streaming.concurrentJobs

2015-09-25 Thread Atul Kulkarni
Can someone please help, either by explaining or by pointing to documentation, what the relationship is between the number of executors needed and how to let the concurrent jobs created by the above parameter run in parallel? On Thu, Sep 24, 2015 at 11:56 PM, Atul Kulkarni <atulskulka...@gmail.com> wrote:

spark.streaming.concurrentJobs

2015-09-25 Thread Atul Kulkarni
I am curious if there is a requirement that #Executors be >= a particular number (a calculation based on how many repartitions happen after the union of DStreams, etc. - I don't know, I am grasping at straws here). I would appreciate some help in this regard. Thanks in advance. -- Regards, Atul Kulkarni
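The questions above can be made concrete with a short sketch. This is a hedged illustration, not the poster's actual code: the app name is hypothetical, and the key facts it leans on are that spark.streaming.concurrentJobs (default 1) allows output jobs from different batches to run concurrently, and that each receiver-based input stream permanently occupies one core, so concurrent jobs only help if there are spare cores beyond the receivers. It is a configuration sketch and needs a Spark runtime to execute.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch only; "ConcurrentJobsSketch" is a hypothetical app name.
val conf = new SparkConf()
  .setAppName("ConcurrentJobsSketch")
  // Allow up to 4 streaming output jobs to run at the same time
  // (the default is 1, i.e. jobs from successive batches run serially).
  .set("spark.streaming.concurrentJobs", "4")

val ssc = new StreamingContext(conf, Seconds(10))

// With receiver-based DStreams, each receiver pins one executor core for
// the lifetime of the application. Total cores must therefore exceed the
// number of receivers, and the *remaining* cores are what the concurrent
// jobs actually run their tasks on - so #executors x cores-per-executor
// bounds how much parallelism concurrentJobs can really deliver.
```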

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
Folks, any help on this? Regards, Atul. On Fri, Sep 11, 2015 at 8:39 AM, Atul Kulkarni <atulskulka...@gmail.com> wrote: > Hi Raghavendra, > > Thanks for your answers. I am passing 10 executors and I am not sure if > that is the problem. It is still hung. > > Regards, &

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
Knowing what is happening inside would be helpful in understanding the working, so that I don't make such a mistake again. Regards, Atul. On Fri, Sep 11, 2015 at 11:32 AM, Atul Kulkarni <atulskulka...@gmail.com> wrote: > Folks, > > Any help on this? > > Regards, > Atul. > > >

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
command line option > --num-executors. You need more than 2 executors to make spark-streaming > work. > > For more details on the command line option, please go through > http://spark.apache.org/docs/latest/running-on-yarn.html. > > > On Fri, Sep 11, 2015 at 10:52 AM, Atul Kulkarni
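For readers who set options programmatically rather than on the spark-submit command line, the same executor count can be requested through SparkConf: on YARN, the --num-executors flag maps to the spark.executor.instances property. A minimal sketch (app name is hypothetical; this is a config sketch that needs a YARN cluster to actually run):

```scala
import org.apache.spark.SparkConf

// Programmatic equivalent of `spark-submit --num-executors 10` on YARN:
// the flag is backed by the spark.executor.instances property.
val conf = new SparkConf()
  .setMaster("yarn")
  .setAppName("KafkaProducerSketch") // hypothetical app name
  .set("spark.executor.instances", "10")
```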

Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
Could it be that the Streaming context is not able to read *.gz files? I am not sure what more details I can provide to help explain my problem. -- Regards, Atul Kulkarni
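On the *.gz question: textFileStream does decompress .gz files through Hadoop's compression codecs, but it only picks up files that appear (are moved atomically) in the monitored directory *after* the stream starts - pre-existing files are ignored, which can look exactly like the context "cannot read" them. A hedged sketch, assuming a hypothetical HDFS path; it needs a Spark runtime to execute:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("GzStreamSketch").setMaster("local[*]")
val ssc = new StreamingContext(conf, Seconds(30))

// Monitors the directory for *new* files each batch interval; .gz files
// are decompressed via Hadoop codecs (note: gzip is not splittable, so
// each file becomes a single partition). Path is hypothetical.
val lines = ssc.textFileStream("hdfs:///data/incoming")
lines.count().print()

ssc.start()
ssc.awaitTermination()
```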

Re: Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
With a single thread, only the receiver thread can run, and that is why your job is stuck. > Specify it as local[*], to make the thread pool equal to the number of cores... > > Raghav > On Sep 11, 2015 6:06 AM, "Atul Kulkarni" <atulskulka...@gmail.com> wrote: > >> Hi Folks, >> >> Below is the code I have for the Spark ba
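The master-URL point above can be sketched as follows. This is an illustration of the advice in the reply, not the thread's actual code (app name hypothetical): with master "local" the scheduler has a single thread, the receiver occupies it, and no processing task can ever be scheduled, so the job hangs; "local[*]" sizes the thread pool to the machine's core count, leaving threads free for processing. A config sketch needing a Spark runtime:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  // "local" or "local[1]" gives one thread total: the receiver takes it
  // and batches never get processed. "local[*]" = one thread per core.
  .setMaster("local[*]")
  .setAppName("LocalMasterSketch") // hypothetical app name
val ssc = new StreamingContext(conf, Seconds(5))
```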