[ 
https://issues.apache.org/jira/browse/SPARK-26606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16740683#comment-16740683
 ] 

Ravindra edited comment on SPARK-26606 at 1/11/19 8:08 PM:
-----------------------------------------------------------

If you look at the image below, you can see the Spark launch command for my
job. The launch command itself contains the expected values, but the
individual parts such as "-Dapp.env=prod", "-Dapp.country=US", and
"-Dapp.banner=WMT" are missing. I suspect this is why my source code throws
an exception saying the JVM params are null.
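For illustration, a minimal sketch of how application code typically reads such JVM properties and fails when they never reach the JVM; the class and method names here are hypothetical, not from the reporter's source:

```java
// Hypothetical sketch: reading the -D properties set via extraJavaOptions.
// If the property never reached the driver/executor JVM (the failure mode
// described in this report), System.getProperty returns null.
public class AppConfig {
    static String require(String key) {
        String value = System.getProperty(key);
        if (value == null) {
            throw new IllegalStateException("Missing JVM param: " + key);
        }
        return value;
    }

    public static void main(String[] args) {
        // Throws IllegalStateException unless the JVM was started
        // with e.g. -Dapp.env=prod
        System.out.println(require("app.env"));
    }
}
```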

 



> parameters passed in extraJavaOptions are not being picked up 
> --------------------------------------------------------------
>
>                 Key: SPARK-26606
>                 URL: https://issues.apache.org/jira/browse/SPARK-26606
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.1
>            Reporter: Ravindra
>            Priority: Major
>              Labels: java, spark
>         Attachments: Screen Shot 2019-01-09 at 4.31.01 PM.png, Screen Shot 
> 2019-01-11 at 1.12.33 PM.png
>
>
> driver.extraJavaOptions and executor.extraJavaOptions are not being picked
> up. Although the parameters appear correctly in the Spark launch command,
> they are not applied for some unknown reason. My source code throws an
> error stating that the Java params are empty.
>  
> This is my spark submit command: 
>     output=`spark-submit \
>  --class com.demo.myApp.App \
>  --conf 'spark.executor.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>  --conf 'spark.driver.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>  --executor-memory "$EXECUTOR_MEMORY" \
>  --executor-cores "$EXECUTOR_CORES" \
>  --total-executor-cores "$TOTAL_CORES" \
>  --driver-memory "$DRIVER_MEMORY" \
>  --deploy-mode cluster \
>  /home/spark/asm//current/myapp-*.jar 2>&1 &`
>  
>  
> Is there any other way I can access the Java params without using
> extraJavaOptions?
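One possible workaround (not from the original report; the `ArgConfig` class and the key names are hypothetical): pass the values as plain application arguments after the jar path in spark-submit, e.g. `spark-submit ... myapp-*.jar app.env=dev app.country=US`, and have the application parse them itself instead of relying on -D JVM options:

```java
// Sketch: parse key=value pairs passed as application arguments,
// e.g. `spark-submit ... myapp.jar app.env=dev app.country=US`,
// as an alternative to -D JVM options.
import java.util.HashMap;
import java.util.Map;

public class ArgConfig {
    static Map<String, String> parse(String[] args) {
        Map<String, String> conf = new HashMap<>();
        for (String arg : args) {
            int eq = arg.indexOf('=');
            if (eq > 0) {
                conf.put(arg.substring(0, eq), arg.substring(eq + 1));
            }
        }
        return conf;
    }

    public static void main(String[] args) {
        Map<String, String> conf = parse(args);
        System.out.println(conf.getOrDefault("app.env", "<unset>"));
    }
}
```

Another option may be to pass the values as `spark.`-prefixed conf entries (e.g. `--conf spark.myapp.env=dev`), since Spark propagates those to the application, and read them back through `SparkConf.get`.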



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
