Hi there,

I've been trying to increase spark.driver.memory and
spark.executor.memory for some unit tests. Most of the information I can
find about increasing Spark's memory covers either flags to spark-submit
or settings in the spark-defaults.conf file. I'm running the unit tests
with Maven, both on a local machine and on a Jenkins box. I've edited the
.conf file and also tried setting spark.driver.memory and
spark.executor.memory on a SparkConf object in the tests' @BeforeClass
method, but I still can't change the executor's Storage Memory: it shows
the same value on every run when I check the UI during the tests. A
sketch of the setup is below.
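
For reference, here's roughly what the @BeforeClass setup looks like
(class and app names are illustrative, and "local[2]" / "4g" are just the
values I've been trying):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.junit.AfterClass;
    import org.junit.BeforeClass;

    public class MemoryConfigTest {
        private static JavaSparkContext sc;

        @BeforeClass
        public static void setUp() {
            SparkConf conf = new SparkConf()
                    .setMaster("local[2]")
                    .setAppName("memory-config-test")
                    // These two settings don't seem to take effect:
                    .set("spark.driver.memory", "4g")
                    .set("spark.executor.memory", "4g");
            sc = new JavaSparkContext(conf);
        }

        @AfterClass
        public static void tearDown() {
            if (sc != null) {
                sc.stop();
            }
        }
    }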
When Spark is invoked on a local machine rather than through spark-submit
in the shell (as during unit tests), are the memory defaults computed
some other way, perhaps from the JVM heap allocation settings?
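
If it helps, a quick way to check that guess would be to print the heap
the test JVM actually gets and compare it with the figure in the UI
(assuming that's where the Storage Memory number comes from):

    // (in the same test class; requires org.junit.Test)
    @Test
    public void reportJvmHeap() {
        // Report the max heap available to this test JVM, to compare
        // against the Storage Memory shown in the Spark UI.
        long maxHeapBytes = Runtime.getRuntime().maxMemory();
        System.out.println("JVM max heap: " + (maxHeapBytes >> 20) + " MB");
    }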

Best,
Alek
