Re: Number of executors in spark-1.6 and spark-1.5

2016-04-10 Thread Vikash Pareek
Hi Talebzadeh,

Thanks for your quick response.

>>In 1.6, how many executors do you see on each node?
I have 1 executor per node, with SPARK_WORKER_INSTANCES=1.

>>In standalone mode, how are you increasing the number of worker instances?
>>Are you starting another slave on each node?
No, I am not starting another slave on each node; I just changed
spark-env.sh on each slave node, i.e. set SPARK_WORKER_INSTANCES=2.
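
For reference, the change looks roughly like this (a minimal sketch of
conf/spark-env.sh; the core and memory values are illustrative, not taken
from this thread). When running more than one worker per node, the node's
cores and memory are normally divided explicitly between the workers:

    # conf/spark-env.sh on each slave node
    SPARK_WORKER_INSTANCES=2   # run two worker daemons on this node
    SPARK_WORKER_CORES=4       # illustrative: cores allotted to each worker
    SPARK_WORKER_MEMORY=8g     # illustrative: memory allotted to each worker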





Best Regards,


Vikash Pareek
Software Developer, InfoObjects Inc.
m: +918800206898 a: E5, Jhalana Institutional Area, Jaipur
s: vikaspareek1991 e: vikash.par...@infoobjects.com





Re: Number of executors in spark-1.6 and spark-1.5

2016-04-10 Thread Mich Talebzadeh
Hi,

In 1.6, how many executors do you see on each node?
In standalone mode, how are you increasing the number of worker instances?
Are you starting another slave on each node?

HTH
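
One quick way to check the executor count from the driver (e.g. in
spark-shell) is sketched below; note that getExecutorMemoryStatus lists
the driver itself as one of the entries:

    // Scala, in spark-shell: each key is the host:port of a registered
    // block manager; every executor (and the driver) contributes one entry
    sc.getExecutorMemoryStatus.keys.foreach(println)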




Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com





Number of executors in spark-1.6 and spark-1.5

2016-04-10 Thread Vikash Pareek
Hi,

I have upgraded a 5-node Spark cluster from spark-1.5 to spark-1.6 (to use
the mapWithState function).
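
For context, mapWithState is the stateful streaming operator introduced in
1.6. A minimal word-count-style sketch (the socket source, checkpoint path,
and batch interval are illustrative):

    // Scala: running word count with mapWithState (Spark 1.6 Streaming)
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, State, StateSpec, StreamingContext}

    val conf = new SparkConf().setAppName("MapWithStateSketch")
    val ssc = new StreamingContext(conf, Seconds(5))
    ssc.checkpoint("/tmp/checkpoint")  // mapWithState requires checkpointing

    val words = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" "))
      .map((_, 1))

    // Fold each new count into the running total kept in State[Int]
    val mappingFunc = (word: String, one: Option[Int], state: State[Int]) => {
      val sum = one.getOrElse(0) + state.getOption.getOrElse(0)
      state.update(sum)
      (word, sum)
    }
    words.mapWithState(StateSpec.function(mappingFunc)).print()

    ssc.start()
    ssc.awaitTermination()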
Since the upgrade I have been seeing strange behaviour: jobs do not use
executors on multiple nodes at the same time, so there is no parallel
processing across nodes when each node has a single worker and a single
executor.
I am running jobs in spark standalone mode.

I have observed the following points related to this issue:
1. If I run the same job with spark-1.5, it uses multiple executors across
different nodes at the same time.
2. In spark-1.6, if I increase the number of cores (spark.cores.max), the
job runs in parallel threads, but within the same executor.
3. In spark-1.6, if I increase the number of worker instances on each node,
the job runs as many parallel threads as there are workers, but still
within the same executor.

Can anyone suggest why spark-1.6 cannot use multiple executors across
different nodes at the same time for parallel processing?
Your suggestion will be highly appreciated.
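
For concreteness, a minimal sketch of the standalone settings discussed
above (the values are illustrative, not from this thread). Under the
standalone scheduler an application gets at most one executor per worker
by default, and that executor takes all of the worker's available cores;
capping spark.executor.cores is one way to let an application run more
than one executor on a worker, while spark.cores.max limits the
application's total:

    // Scala: illustrative executor-layout settings for standalone mode
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("spark://master:7077")  // illustrative master URL
      .setAppName("ExecutorLayoutSketch")
      .set("spark.cores.max", "10")      // total cores for this application
      .set("spark.executor.cores", "2")  // cores per executor; when unset,
                                         // one executor takes all of a
                                         // worker's cores in standalone mode
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)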




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Number-of-executors-in-spark-1-6-and-spark-1-5-tp26733.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org