Github user srowen closed the pull request at:
https://github.com/apache/spark/pull/4188
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73323965
Yeah I don't think this is the way to do it. Scratch this one; let me have
another run at this problem from the other end.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73335694
Yeah, Yarn makes a mess out of this. Basically, it does this:
```
val args = List("abd", "def", "g h")
val cmd = args.mkString(" ")
```
So even though the arguments start out as a proper list, they are joined into one flat string, and the original token boundaries are lost.
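To make that flattening concrete, here is a minimal, hypothetical sketch (not Spark code) of what the join does to quoting:
```scala
object QuotingLoss {
  def main(args: Array[String]): Unit = {
    // Three logical arguments, one of which contains a space.
    val argList = List("abd", "def", "g h")

    // Joining them as described above yields one flat string...
    val cmd = argList.mkString(" ")
    println(cmd)                    // abd def g h

    // ...so anything that re-splits on whitespace now sees four
    // tokens instead of three; the boundaries cannot be recovered.
    println(cmd.split(" ").toList)  // List(abd, def, g, h)
  }
}
```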
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73334927
@vanzin the code in `ExecutorRunnable.scala` that handles the executor opts
still does build up `javaOpts` as a collection. You're saying it doesn't need
this treatment because YARN joins the collection into a single command string anyway?
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73318793
BTW#2, the code that handles `spark.yarn.am.extraJavaOptions` a few lines
below the code I mentioned has the same issue.
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73367478
@vanzin Got it, yes. Yes, this has to change too, to parse the arguments
and then escape them, or else escaping them does the wrong thing, and things
have to be escaped to survive YARN's re-evaluation of the command.
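A rough sketch of that parse-then-escape idea. The tokenizer and escaper below are illustrative stand-ins, not Spark's actual `Utils.splitCommandString` or YARN escaping code:
```scala
import scala.collection.mutable.ArrayBuffer

object ParseThenEscape {
  // Quote-aware tokenizer: splits on unquoted whitespace, so
  // -Dnumbers="one two three" comes back as a single token.
  def splitOpts(s: String): Seq[String] = {
    val out = ArrayBuffer.empty[String]
    val cur = new StringBuilder
    var inWord = false
    var quote: Option[Char] = None
    for (c <- s) quote match {
      case Some(q) if c == q => quote = None
      case Some(_)           => cur += c
      case None if c == '"' || c == '\'' => quote = Some(c); inWord = true
      case None if c.isWhitespace =>
        if (inWord) { out += cur.toString; cur.clear(); inWord = false }
      case None => cur += c; inWord = true
    }
    if (inWord) out += cur.toString
    out.toSeq
  }

  // POSIX-style single-quoting so each token survives one more
  // round of shell evaluation intact.
  def escapeForShell(arg: String): String =
    "'" + arg.replace("'", "'\\''") + "'"

  def main(args: Array[String]): Unit = {
    val opts = "-Dnumbers=\"one two three\" -verbose:gc"
    println(splitOpts(opts).map(escapeForShell).mkString(" "))
    // '-Dnumbers=one two three' '-verbose:gc'
  }
}
```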
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73318107
BTW I'd also recommend adding a test to `YarnClusterSuite` (or extending an
existing one).
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73316977
Hi @srowen,
I think this isn't the correct fix. I think the proper fix would be to
change this in `Client.scala`:
```
sparkConf.getOption("spark.driver.extraJavaOptions")
```
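Purely for illustration, a sketch of what that change might look like, reusing the hypothetical `splitOpts` and `escapeForShell` helpers from the sketch above (the real patch may well differ):
```scala
// Hypothetical sketch, not the merged fix: recover the individual JVM
// options from the raw string, then escape each one for the shell.
// Assumes javaOpts is a mutable buffer of command-line pieces.
sparkConf.getOption("spark.driver.extraJavaOptions").foreach { opts =>
  javaOpts ++= splitOpts(opts).map(escapeForShell)
}
```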
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73316114
@andrewor14 Yeah basically this keeps the quotes all the way through. It's
not a problem of escaping, but the opposite -- the quotes are gone from the
get-go. Yes, that's a fair point about the behavior change.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73341829
I was basing my comments on this code:
```
// Set extra Java options for the executor, if defined
sys.props.get("spark.executor.extraJavaOptions").foreach {
```
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73341528
@vanzin I don't think it does build a big command line though. I see:
```
val commands = prepareCommand(...) // List[String]
...
ctx.setCommands(commands)
```
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73305737
[Test build #26931 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/26931/consoleFull) for PR 4188 at commit `8e91cc3`.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73305744
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/26931/
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73300317
To fix the YARN issue maybe we should do something specific there, like
escaping the double quotes before passing them to YARN?
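A tiny sketch of that suggestion, with a hypothetical helper name; whether one round of escaping is enough depends on how many times YARN re-evaluates the string:
```scala
object EscapeQuotes {
  // One round of escaping: embedded double quotes become \" so that a
  // later shell evaluation restores them instead of eating them.
  def escapeDoubleQuotes(arg: String): String =
    arg.replace("\"", "\\\"")

  def main(args: Array[String]): Unit = {
    println(escapeDoubleQuotes("-Dnumbers=\"one two three\""))
    // prints: -Dnumbers=\"one two three\"
  }
}
```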
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73299916
Hey @srowen it seems like this will break existing behavior though. What if
I want to run an application with the following arguments
```
a "b c" d
```
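That example is the crux: the shell consumes those quotes during word splitting, so the program itself never sees them. A minimal sketch, assuming a hypothetical script named `PrintArgs`:
```scala
// Run as:  scala PrintArgs a "b c" d
object PrintArgs {
  def main(args: Array[String]): Unit =
    args.zipWithIndex.foreach { case (a, i) => println(s"arg $i: [$a]") }
  // Expected output -- the shell has already consumed the quotes:
  // arg 0: [a]
  // arg 1: [b c]
  // arg 2: [d]
}
```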
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-73299811
[Test build #26931 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/26931/consoleFull) for PR 4188 at commit `8e91cc3`.
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/4188#discussion_r24266009
```
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1283,6 +1291,7 @@ private[spark] object Utils extends Logging {
if (inWord || i
```
Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/4188#discussion_r24265176
```
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1283,6 +1291,7 @@ private[spark] object Utils extends Logging {
if (inWord
```
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71498551
Ok thanks for verifying. I will take a closer look later today
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71442313
PS @andrewor14 I did run my patched version on a cluster, with and without
the offending argument, using a command like that reported in SPARK-4267, and
it succeeded both times.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71403888
Yeah, meaning the quotes in `-Dnumbers="one two three"` are not escaped
properly and the setting is not treated as one argument.
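A small sketch of the failure mode being described here (the command below is illustrative, not the exact executor launch line):
```scala
object LostQuotes {
  def main(args: Array[String]): Unit = {
    // Once the quotes are gone, whitespace splitting breaks the option apart:
    val cmd = "java -Dnumbers=one two three SomeMainClass"
    println(cmd.split(" ").toList)
    // List(java, -Dnumbers=one, two, three, SomeMainClass)
    // "two" is now taken as the class to run, so the JVM fails to start.
  }
}
```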
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71403203
If you mean that YARN doesn't parse the argument like bash does, I think
that's true. But the issue here is that the quotes have disappeared by the time
the command line is assembled.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71403089
I see. That's not surprising considering how YARN runs commands as strings
rather than as a sequence of strings, from what I've heard at least.
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71402447
@andrewor14 I reproduced this on Spark 1.2.0 and verified that it was the
`-Dnumbers="one two three"` arg that triggers the problem. The executors fail
to start. I haven't yet run the patched version on a cluster.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71397184
Hey @srowen have you tested it before and after this patch to see if it
actually fixes it? If this is a problem in master, it's probably also a problem
in older branches.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71333101
[Test build #26050 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/26050/consoleFull) for PR 4188 at commit `377c3bb`.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71333105
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/26050/
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/4188#issuecomment-71330214
[Test build #26050 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/26050/consoleFull) for PR 4188 at commit `377c3bb`.
GitHub user srowen opened a pull request:
https://github.com/apache/spark/pull/4188
SPARK-4267 [CORE] Failing to launch jobs on Spark on YARN with Hadoop 2.5.0
or later
Per SPARK-4267, I think the problem here is stripping quotes in JVM
arguments before sending them on to the YARN side.