Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
What is the value of the spark master conf? By default it is local, which means only one thread can run, and that is why your job is stuck. Specify it as local[*] to make the thread pool equal to the number of cores... Raghav. On Sep 11, 2015 6:06 AM, "Atul Kulkarni" <atulskulka...@gmail.com> wrote: ...
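
A minimal sketch of the fix being suggested, in Scala (applicationName comes from the thread; the 10-second batch interval is an assumption, not from the original post):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // A bare "local" master gives the application a single thread, which can
    // leave a Spark Streaming job with nothing free to process batches;
    // local[*] sizes the thread pool to the number of available cores.
    val sparkConf = new SparkConf()
      .setAppName(applicationName)
      .setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(10)) // batch interval assumed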

Re: Spark based Kafka Producer

2015-09-11 Thread Raghavendra Pandey
..."Atul Kulkarni" <atulskulka...@gmail.com> wrote: Hi Folks, Below is the code I have for a Spark-based Kafka producer to take advantage of multiple executors reading files in parallel on my cluster, but I am stuck: the program is not making any progress. ...

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
...@gmail.com> wrote: What is the value of the spark master conf? By default it is local, which means only one thread can run, and that is why your job is stuck. Specify it as local[*] to make the thread pool equal to the number of cores...

Re: Spark based Kafka Producer

2015-09-11 Thread Atul Kulkarni
What is the value of the spark master conf? By default it is local, which means only one thread can run, and that is why your job is stuck. Specify it as local[*] to make the thread pool equal to the number of cores... Raghav. On Sep 11, 2015 ...

Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
Hi Folks, Below is the code I have for a Spark-based Kafka producer to take advantage of multiple executors reading files in parallel on my cluster, but I am stuck: the program is not making any progress. Below is my scrubbed code: val sparkConf = new SparkConf().setAppName(applicationName) val ssc ...
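
The original code was scrubbed and the preview cuts off after val ssc, so the following is only a sketch of the pattern under discussion, not the poster's actual code: a file-watching stream whose partitions are published to Kafka from the executors. inputDir, brokers, and topic are hypothetical placeholders, and the org.apache.kafka.clients producer API (Kafka 0.8.2+) is assumed:

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sparkConf = new SparkConf().setAppName(applicationName).setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(10))

    // Watch a directory; each batch picks up newly arrived files.
    val lines = ssc.textFileStream(inputDir) // inputDir is a placeholder

    lines.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        // KafkaProducer is not serializable, so it must be constructed here on
        // the executor, once per partition, rather than once on the driver.
        val props = new Properties()
        props.put("bootstrap.servers", brokers) // brokers is a placeholder
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)
        partition.foreach(line => producer.send(new ProducerRecord[String, String](topic, line)))
        producer.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()

Because each partition gets its own producer, the executors publish their shares of the input in parallel, which is the parallelism the original post was after.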

Re: Spark based Kafka Producer

2015-09-10 Thread Atul Kulkarni
...only one thread can run, and that is why your job is stuck. Specify it as local[*] to make the thread pool equal to the number of cores... Raghav. On Sep 11, 2015 6:06 AM, "Atul Kulkarni" <atulskulka...@gmail.com> wrote: Hi Folks, Below is the code I have for a Spark-based ...

Re: Spark based Kafka Producer

2015-09-10 Thread Raghavendra Pandey
...wrote: Hi Folks, Below is the code I have for a Spark-based Kafka producer to take advantage of multiple executors reading files in parallel on my cluster, but I am stuck: the program is not making any progress. Below is my scrubbed code: val sparkConf = new SparkConf()...