Hello, 

Spark has the 'spark.executor.memory' property, which defines the amount of memory 
used on each computational node, and by default it is 512 MB. Is there a way to tell 
Spark to use 'all available memory minus 1 GB'? (A rough sketch of one possible 
workaround is below.) 
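
As far as I know Spark has no built-in "total minus 1 GB" setting, so a minimal 
sketch of a workaround is to compute the value yourself and pass it through 
SparkConf before creating the context. The sketch below assumes the node where 
this code runs has the same amount of RAM as the executor nodes, and uses the 
HotSpot-specific com.sun.management.OperatingSystemMXBean to read total physical 
memory; the object and app names are just placeholders. 

```scala
import java.lang.management.ManagementFactory
import com.sun.management.OperatingSystemMXBean
import org.apache.spark.{SparkConf, SparkContext}

object DynamicExecutorMemory {
  def main(args: Array[String]): Unit = {
    // Total physical RAM of the machine running this driver (HotSpot-specific bean).
    val osBean = ManagementFactory.getOperatingSystemMXBean
      .asInstanceOf[OperatingSystemMXBean]
    val totalMb = osBean.getTotalPhysicalMemorySize / (1024L * 1024L)

    // Leave 1 GB for the OS and other daemons; never go below the 512 MB default.
    val executorMb = math.max(totalMb - 1024L, 512L)

    val conf = new SparkConf()
      .setAppName("dynamic-executor-memory")
      .set("spark.executor.memory", s"${executorMb}m")

    val sc = new SparkContext(conf)
    // ... job code ...
    sc.stop()
  }
}
```

The same arithmetic could instead be done in a launch script that sets the value 
on the command line, which avoids the assumption that the driver and executor 
machines have identical RAM. 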

Thank you in advance. 
-- 
Best regards, 
Hlib Mykhailenko 
PhD student at INRIA Sophia-Antipolis Méditerranée 
2004 Route des Lucioles BP93 
06902 SOPHIA ANTIPOLIS cedex 