Github user koertkuipers commented on the issue:
https://github.com/apache/spark/pull/609
```OPTS+=" --driver-java-options \"-Da=b -Dx=y\""```
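To see why the escaped inner quotes matter, here is a minimal runnable sketch of the same technique. `print_args` is a stand-in for spark-submit (so the example does not need Spark installed), and the `-Da=b -Dx=y` properties are the placeholders from the comment above:

```shell
#!/usr/bin/env bash
# print_args stands in for spark-submit so the argument grouping is
# visible: it prints each argument it receives on its own line.
print_args() { for a in "$@"; do printf '%s\n' "$a"; done; }

# Accumulate options in a string; the escaped inner quotes become
# literal quote characters inside OPTS.
OPTS=""
OPTS+=" --driver-java-options \"-Da=b -Dx=y\""

# eval re-parses OPTS, so those literal quotes group both -D properties
# into a single value for --driver-java-options. Without eval, plain
# word splitting would break the value into two separate arguments.
eval print_args $OPTS
```

Running this prints `--driver-java-options` on one line and `-Da=b -Dx=y` together on the next, showing that both system properties reached the option as a single value.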
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not
Github user ganeshm25 commented on the issue:
https://github.com/apache/spark/pull/609
@koertkuipers can you share the most recent syntax that you used to achieve the
same?
---
Github user koertkuipers commented on the issue:
https://github.com/apache/spark/pull/609
@ganeshm25 it seems to work in newer Spark versions; i haven't tried it in
Spark 1.4.2. however it's still very tricky to get right, and i would prefer a
simpler solution.
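One way to sidestep the quoting trickiness (a sketch, not proposed in this thread: it assumes the standard `spark.driver.extraJavaOptions` property that spark-submit accepts via `--conf`; the stub below only prints the arguments spark-submit would receive):

```shell
#!/usr/bin/env bash
# Stub standing in for spark-submit; prints each received argument on
# its own line so the grouping is visible without Spark installed.
spark_submit_stub() { for a in "$@"; do printf '%s\n' "$a"; done; }

# With --conf, the whole key=value pair is one normally quoted word, so
# no eval or escaped quotes are needed to keep the -D properties
# together as a single configuration value.
spark_submit_stub \
  --conf "spark.driver.extraJavaOptions=-Da=b -Dx=y" \
  app.jar
```

Because the value is an ordinary quoted word, this form also composes cleanly when the command is built up inside a bash script.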
---
Github user ganeshm25 commented on the issue:
https://github.com/apache/spark/pull/609
@koertkuipers i am trying to pass multiple driver-java-options to Spark 1.4.2
from inside a bash script. is there a solution you found for this?
---