What is the value of the spark master conf? By default it is local, which
means only one thread can run, and that is why your job is stuck.
Specify local[*] to make the thread pool equal to the number of cores...

Raghav
On Sep 11, 2015 6:06 AM, "Atul Kulkarni" <atulskulka...@gmail.com> wrote:
> Hi Folks,
>
> Below is the code I have for a Spark-based Kafka producer, meant to take
> advantage of multiple executors reading files in parallel on my cluster,
> but I am stuck: the program is not making any progress.
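For what it's worth, a minimal sketch of the suggestion above, assuming the
master is set programmatically in the driver rather than via spark-submit
(the application name and batch interval are placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sparkConf = new SparkConf()
      .setAppName("file-to-kafka")   // placeholder app name
      .setMaster("local[*]")         // one worker thread per available core
    // equivalently, pass --master "local[*]" to spark-submit

    val ssc = new StreamingContext(sparkConf, Seconds(30))  // placeholder interval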
Hi Folks,

Below is the code I have for a Spark-based Kafka producer, meant to take
advantage of multiple executors reading files in parallel on my cluster,
but I am stuck: the program is not making any progress.
Below is my scrubbed code:
val sparkConf = new SparkConf().setAppName(applicationName)
val ssc
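Since the scrubbed code is cut off above, the following is only a hypothetical
sketch of the pattern being described, assuming the producer is created inside
foreachPartition on the executors (file path, topic, broker list, and batch
interval are all placeholders):

    import java.util.Properties
    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object FileToKafkaSketch {
      def main(args: Array[String]): Unit = {
        val sparkConf = new SparkConf().setAppName("FileToKafkaSketch")
        val ssc = new StreamingContext(sparkConf, Seconds(30))

        // Watch a directory; each batch's RDD partitions are processed by
        // different executors in parallel.
        val lines = ssc.textFileStream("hdfs:///path/to/input/")

        lines.foreachRDD { rdd =>
          rdd.foreachPartition { partition =>
            // Build the producer on the executor side; KafkaProducer is not serializable.
            val props = new Properties()
            props.put("bootstrap.servers", "broker1:9092")
            props.put("key.serializer",
              "org.apache.kafka.common.serialization.StringSerializer")
            props.put("value.serializer",
              "org.apache.kafka.common.serialization.StringSerializer")
            val producer = new KafkaProducer[String, String](props)

            partition.foreach { line =>
              producer.send(new ProducerRecord[String, String]("my-topic", line))
            }
            producer.close()
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }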