Hi,
You can set those parameters through
spark.executor.extraJavaOptions
which is documented in the configuration guide:
spark.apache.org/docs/latest/configuration.html
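For illustration, a minimal spark-submit sketch of that setting; the class and jar names are placeholders, not anything from this thread. GC and logging flags are accepted here, while heap flags (-Xms/-Xmx) are rejected, as the rest of the thread shows:

```shell
# Hedged sketch: com.example.MyApp and my-app.jar are hypothetical.
# Passes GC/logging JVM flags to each executor via extraJavaOptions.
spark-submit \
  --class com.example.MyApp \
  --conf "spark.executor.extraJavaOptions=-verbose:gc -XX:+PrintGCDetails" \
  my-app.jar
```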
On 2 Jul 2015 9:06 pm, Mulugeta Mammo mulugeta.abe...@gmail.com wrote:
Hi,
I'm running Spark 1.4.0, I want to specify …
Yes, that does appear to be the case. The documentation is very clear
about the heap settings and that they cannot be used with
spark.executor.extraJavaOptions:

spark.executor.extraJavaOptions (default: none): A string of extra JVM options
to pass to executors. For instance, GC settings or other logging.
Tried that one and it throws an error: extraJavaOptions is not allowed to
alter memory settings; use spark.executor.memory instead.
On Thu, Jul 2, 2015 at 12:21 PM, Benjamin Fradet benjamin.fra...@gmail.com
wrote:
Hi,
You can set those parameters through the
spark.executor.extraJavaOptions
You should use:
spark.executor.memory
from the docs https://spark.apache.org/docs/latest/configuration.html:
spark.executor.memory (default: 512m): Amount of memory to use per executor
process, in the same format as JVM memory strings (e.g. 512m, 2g).
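As a quick sketch of that property in use (values and names below are placeholders, not from this thread):

```shell
# Hedged sketch: com.example.MyApp and my-app.jar are hypothetical.
# --executor-memory is shorthand for setting spark.executor.memory.
spark-submit --executor-memory 2g --class com.example.MyApp my-app.jar

# Equivalently, set the property directly:
spark-submit --conf spark.executor.memory=2g --class com.example.MyApp my-app.jar
```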
-Todd
On Thu, Jul 2, 2015 at 3:36 PM, Mulugeta Mammo wrote:
Thanks, but my use case requires that I specify different start and max heap
sizes. It looks like Spark sets the start and max sizes to the same value.
On Thu, Jul 2, 2015 at 1:08 PM, Todd Nist tsind...@gmail.com wrote:
You should use:
spark.executor.memory
from the docs
Yeah, I think it's a limitation too. I looked at the source code: in
SparkConf.scala and ExecutorRunnable.scala, both -Xms and -Xmx are set to the
same value, which is spark.executor.memory.
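A minimal shell sketch of the behavior described above (not Spark's actual code): both heap flags are derived from the single spark.executor.memory value, so the start and max heap sizes cannot differ.

```shell
# Illustrative only: mimics how the executor launcher builds its JVM options.
# "2g" stands in for whatever spark.executor.memory is set to.
EXECUTOR_MEMORY=2g
JAVA_OPTS="-Xms${EXECUTOR_MEMORY} -Xmx${EXECUTOR_MEMORY}"
echo "$JAVA_OPTS"   # both flags carry the same value
```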
Thanks
On Thu, Jul 2, 2015 at 1:18 PM, Todd Nist tsind...@gmail.com wrote:
Yes, that does appear to be the case. The …