Hi all,
         --driver-java-options does not seem to support multiple JVM options.

The submit command is as follows:

Cores=16
sparkdriverextraJavaOptions="-XX:newsize=2096m -XX:MaxPermSize=512m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=80 -XX:GCTimeLimit=5 -XX:GCHeapFreeLimit=95"
main1=com.suning.spark.streaming.ppsc.RecommendBasedShoppingCart

spark-submit --deploy-mode cluster \
  --total-executor-cores $Cores \
  --executor-memory 8g \
  --driver-memory 16g \
  --conf spark.driver.cores=4 \
  --driver-java-options $sparkdriverextraJavaOptions \
  --class $main1 \
  hdfs:///user/bdapp/$appjars

The error is:

          Error: Unrecognized option '-XX:MaxPermSize=512m'
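It looks like the unquoted $sparkdriverextraJavaOptions may be split on whitespace by the shell before spark-submit parses its arguments, so only the first token reaches --driver-java-options and the remaining tokens are handed to spark-submit as separate options. A minimal illustration of the expansion, assuming a POSIX shell (not part of the submit script):

opts="-XX:newsize=2096m -XX:MaxPermSize=512m"

# Unquoted expansion: the shell splits on whitespace -> two separate arguments.
printf 'arg: %s\n' $opts

# Quoted expansion: the whole string stays a single argument.
printf 'arg: %s\n' "$opts"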


When I change it to:

sparkdriverextraJavaOptions="-XX:newsize=2096m,-XX:MaxPermSize=512m,-XX:+PrintGCDetails,-XX:+PrintGCTimeStamps,-XX:+UseParNewGC,-XX:+UseConcMarkSweepGC,-XX:CMSInitiatingOccupancyFraction=80,-XX:GCTimeLimit=5,-XX:GCHeapFreeLimit=95"

the driver error is:

Unrecognized VM option 'newsize=2096m,-XX:MaxPermSize=512m,-XX:+PrintGCDetails,-XX:+PrintGCTimeStamps,-XX:+UseParNewGC,-XX:+UseConcMarkSweepGC,-XX:CMSInitiatingOccupancyFraction=80,-XX:GCTimeLimit=5,-XX:GCHeapFreeLimit=95'
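Is quoting the variable the expected way to pass several options as one argument? Just a sketch of what I have in mind (untested):

# same submit as above, with the expansion quoted so it stays one argument
spark-submit --deploy-mode cluster \
  --total-executor-cores $Cores \
  --executor-memory 8g \
  --driver-memory 16g \
  --conf spark.driver.cores=4 \
  --driver-java-options "$sparkdriverextraJavaOptions" \
  --class $main1 \
  hdfs:///user/bdapp/$appjars

Or should the options be passed through --conf spark.driver.extraJavaOptions="$sparkdriverextraJavaOptions" instead?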
