Hey guys,

I am running into memory problems because over 90% of my Spark drivers end
up being started on the same one of my nine Spark nodes.
So now I am looking for a way to choose the node on which the Spark driver
is started, either via spark-submit or by setting it somewhere in the code.

Is this possible? Does anyone else have this kind of problem?
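
The only workaround I have found so far is client deploy mode: as far as I
understand, the driver then runs inside the spark-submit process itself, so
it lands on whichever machine the job is submitted from. Something like
this, after logging in to the target node (the master URL and jar name here
are just placeholders):

  spark-submit \
    --master spark://master:7077 \
    --deploy-mode client \
    --driver-memory 4g \
    my-app.jar

In cluster mode, as far as I can tell, the standalone master picks the
worker for the driver on its own, and that is exactly the part I would
like to control.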

thx and best regards
Felix


