skonto edited a comment on issue #25229: [SPARK-27900][K8s] Add jvm oom flag
URL: https://github.com/apache/spark/pull/25229#issuecomment-516594015
 
 
   @dongjoon-hyun I updated the PR.
   Using the following submission cmd:
   ```
    ./bin/spark-submit --master k8s://https://192.168.2.5:8443\
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.memory=1G \
    --conf spark.kubernetes.namespace=spark \
    --conf spark.kubernetes.driverEnv.DRIVER_VERBOSE=true \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
    --conf spark.driver.memory=500m \
    --conf spark.executor.instances=2  \
    --conf spark.kubernetes.container.image.pullPolicy=Always \
    --conf spark.kubernetes.container.image=skonto/spark:oom \
 local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar \
 1000000000
   ```
   I get the flag added by default (tested with `jps -lvm` within the container):
   ```
   19 org.apache.spark.deploy.SparkSubmit --deploy-mode client --conf spark.driver.bindAddress=172.17.0.4 --properties-file /tmp/spark.properties --class org.apache.spark.examples.SparkPi --verbose local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar 1000000000 -Xmx500m -XX:OnOutOfMemoryError=kill -9 %p
   ```
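For comparison, the same flag could also be passed manually at submit time. This is only a sketch, assuming Spark's quote-aware splitting of the option string; the single quotes are there so the space-separated `kill -9 %p` stays part of the `-XX:OnOutOfMemoryError` value:

```shell
# Hypothetical manual alternative (sketch, not part of this PR): inject the
# OOM flag yourself via spark.driver.extraJavaOptions at submit time.
OOM_OPT="spark.driver.extraJavaOptions=-XX:OnOutOfMemoryError='kill -9 %p'"
./bin/spark-submit --conf "$OOM_OPT" ...   # plus the k8s conf shown above
```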
   If I additionally pass `--conf "spark.driver.extraJavaOptions=-Ds=3"`, I get, as expected:
   ```
   19 org.apache.spark.deploy.SparkSubmit --deploy-mode client --conf spark.driver.bindAddress=172.17.0.4 --properties-file /tmp/spark.properties --class org.apache.spark.examples.SparkPi --verbose local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-SNAPSHOT.jar 1000000000 -Xmx500m -XX:OnOutOfMemoryError=kill -9 %p -Ds=3
   ```
   Right now this solution is a bit verbose, but it is complete. The other alternatives I see are:
   a) don't address the issue in code; just document how the user can pass the OOM flag themselves with this command.
   b) add the flag via spark.driver.extraJavaOptions in the entrypoint script, as before, but inform the user that it will be overridden if they also pass spark.driver.extraJavaOptions in their command.
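A minimal sketch of alternative (b), assuming the entrypoint script assembles the driver JVM options from an environment variable (SPARK_DRIVER_JAVA_OPTS is a placeholder name here, not necessarily what entrypoint.sh actually uses):

```shell
# Hypothetical entrypoint.sh fragment for alternative (b): put the default
# OOM flag first and the user-supplied spark.driver.extraJavaOptions value
# (surfaced here as SPARK_DRIVER_JAVA_OPTS; name assumed) last.
OOM_FLAG='-XX:OnOutOfMemoryError=kill -9 %p'
USER_OPTS="${SPARK_DRIVER_JAVA_OPTS:-}"   # e.g. "-Ds=3" from extraJavaOptions
DRIVER_JAVA_OPTS="${OOM_FLAG}${USER_OPTS:+ ${USER_OPTS}}"
echo "driver JVM opts: ${DRIVER_JAVA_OPTS}"
```

Since HotSpot lets the last occurrence of a duplicated flag win, ordering the user options last means a user who sets the same flag overrides the default, which is exactly the caveat alternative (b) would need to document.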
