Re: Set the node the spark driver will be started

2016-06-30 Thread Felix Massem
Hey Bryan, yes, this definitely sounds like the issue I have :-) Thx a lot and best regards, Felix

Re: Set the node the spark driver will be started

2016-06-29 Thread Bryan Cutler
Hi Felix, I think the problem you are describing has been fixed in later versions; check out this JIRA: https://issues.apache.org/jira/browse/SPARK-13803
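Since that fix only ships in releases newer than the 1.5.2 mentioned elsewhere in this thread, a quick way to confirm which Spark version a node is actually running is to ask spark-submit itself (a minimal sketch, assuming spark-submit is on the PATH):

    # Prints the Spark version this installation was built with.
    spark-submit --version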

Re: Set the node the spark driver will be started

2016-06-29 Thread Mich Talebzadeh
Fine. In standalone mode Spark uses its own scheduling, as opposed to YARN or anything else. As a matter of interest, can you start spark-submit from any node in the cluster? Do the nodes all have the same or similar CPU and RAM? HTH, Mich Talebzadeh

Re: Set the node the spark driver will be started

2016-06-29 Thread Felix Massem
In addition, we are not using YARN; we are using standalone mode, and the driver is started with deploy-mode cluster. Thx, Felix
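For illustration, a standalone cluster-mode submission of the kind described here might look like the sketch below; the master host, class name, jar path, and memory value are placeholders, not details from the thread. In this mode the standalone master, not the submitting machine, chooses the worker that hosts the driver.

    # Hypothetical standalone cluster-mode submit: the master picks the worker
    # that will run the driver, so it does not matter where this command runs.
    spark-submit \
      --master spark://spark-master.example.com:7077 \
      --deploy-mode cluster \
      --driver-memory 2g \
      --class com.example.MyApp \
      /path/to/my-app.jar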

Re: Set the node the spark driver will be started

2016-06-29 Thread Felix Massem
Hey Mich, there is essentially no even distribution at all. Right now I have 15 applications and all 15 drivers are running on one node, and that is after giving all machines a little more memory. Before that I had about 15 applications and roughly 13 drivers running on one machine. While trying to …

Re: Set the node the spark driver will be started

2016-06-28 Thread Mich Talebzadeh
Hi Felix, in YARN cluster mode the resource manager (YARN) is expected to take care of that. Are you getting a skewed distribution of drivers created through spark-submit across the different nodes? HTH, Mich Talebzadeh
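For contrast, a YARN cluster-mode submission, which is what this reply assumes, hands driver placement to the resource manager; a minimal sketch with a made-up class and jar path, using the Spark 1.x master string:

    # YARN decides which node hosts the ApplicationMaster and driver.
    # "yarn-cluster" is the Spark 1.x form; newer releases use
    # --master yarn --deploy-mode cluster instead.
    spark-submit \
      --master yarn-cluster \
      --class com.example.MyApp \
      /path/to/my-app.jar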

Re: Set the node the spark driver will be started

2016-06-28 Thread Felix Massem
Hey Mich, thx for the fast reply. We are using it in cluster mode, with Spark version 1.5.2. Greets, Felix

Re: Set the node the spark driver will be started

2016-06-28 Thread Mich Talebzadeh
Hi Felix, what version of Spark? Are you using YARN client mode or cluster mode? HTH, Mich Talebzadeh

Set the node the spark driver will be started

2016-06-28 Thread adaman79
Hey guys, I have a memory problem because over 90% of my Spark drivers are started on one of my nine Spark nodes. So now I am looking for a way to define the node the Spark driver will be started on, either when using spark-submit or by setting it somewhere in the code. Is this possible?
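One workaround, if the goal is simply to pin each driver to a known machine, is to submit in client deploy mode from that machine: in client mode the driver runs inside the spark-submit process on the submitting host. A hedged sketch, with the master URL, class, and jar path as placeholders (note this gives up the automatic driver restarts that standalone cluster mode offers via --supervise):

    # Run this on the node that should host the driver; with deploy-mode
    # client the driver lives in this spark-submit process on this machine.
    spark-submit \
      --master spark://spark-master.example.com:7077 \
      --deploy-mode client \
      --class com.example.MyApp \
      /path/to/my-app.jar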