[ https://issues.apache.org/jira/browse/SPARK-26606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16740694#comment-16740694 ]
Ravindra commented on SPARK-26606:
----------------------------------

10.36.67.188:7077 is what I am using; all of the other master links there are just standbys.

> parameters passed in extraJavaOptions are not being picked up
> --------------------------------------------------------------
>
>                 Key: SPARK-26606
>                 URL: https://issues.apache.org/jira/browse/SPARK-26606
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.1
>            Reporter: Ravindra
>            Priority: Major
>              Labels: java, spark
>         Attachments: Screen Shot 2019-01-09 at 4.31.01 PM.png, Screen Shot 2019-01-11 at 1.12.33 PM.png
>
> spark.driver.extraJavaOptions and spark.executor.extraJavaOptions are not being picked up. Even though the parameters appear in the Spark launch command, they are not picked up for some unknown reason; my source code throws an error stating that the Java params are empty.
>
> This is my spark-submit command:
>
> output=`spark-submit \
>   --class com.demo.myApp.App \
>   --conf 'spark.executor.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>   --conf 'spark.driver.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>   --executor-memory "$EXECUTOR_MEMORY" \
>   --executor-cores "$EXECUTOR_CORES" \
>   --total-executor-cores "$TOTAL_CORES" \
>   --driver-memory "$DRIVER_MEMORY" \
>   --deploy-mode cluster \
>   /home/spark/asm//current/myapp-*.jar 2>&1 &`

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
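Options passed with -D through spark.driver.extraJavaOptions and spark.executor.extraJavaOptions arrive in the application as ordinary JVM system properties. A minimal sketch of how the application side can verify they were applied (the class name and the "UNSET" sentinel are hypothetical, not from the reporter's code):

```java
// Hypothetical check run inside the Spark application (driver or executor)
// to confirm that -D options from extraJavaOptions actually reached the JVM.
public class AppEnvCheck {
    public static void main(String[] args) {
        // -Dapp.env=dev and -Dapp.country=US from extraJavaOptions should
        // appear here; the "UNSET" default makes a missing property obvious.
        String env = System.getProperty("app.env", "UNSET");
        String country = System.getProperty("app.country", "UNSET");
        System.out.println("app.env=" + env + " app.country=" + country);
        if ("UNSET".equals(env)) {
            throw new IllegalStateException(
                "app.env missing: extraJavaOptions were not applied to this JVM");
        }
    }
}
```

Note that with --deploy-mode cluster, as used above, spark.driver.extraJavaOptions is honored for the driver; in client mode the driver JVM has already started by the time that setting is read, so driver -D options must instead be passed via --driver-java-options or spark-defaults.conf.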