Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-65107341
@tgravescs Nishkam (who filed the bug) has most of the background info on
the performance impact. But, from a high level, installing a security manager
seems to introduce
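To make the mechanism under discussion concrete, here is a minimal sketch (illustrative only, not the actual Spark code; all names are hypothetical) of the pattern at issue: a SecurityManager installed just to trap `System.exit`. The cost comes not from the exit check itself but from the JVM consulting the manager on many other operations once one is installed.

```scala
import java.security.Permission

// Illustrative sketch only: the kind of SecurityManager an AM might
// install solely to intercept System.exit() calls from user code.
class ExitTrappingSecurityManager extends java.lang.SecurityManager {
  // Convert exit attempts into an exception the caller can catch.
  override def checkExit(status: Int): Unit =
    throw new SecurityException(s"System.exit($status) intercepted")

  // Permit everything else. Even so, once any SecurityManager is
  // installed, the JVM invokes checkPermission() for file, network,
  // reflection, and class-loading operations, which is where the
  // per-operation overhead comes from.
  override def checkPermission(perm: Permission): Unit = {}
  override def checkPermission(perm: Permission, context: Object): Unit = {}
}

object InstallSketch {
  def main(args: Array[String]): Unit = {
    System.setSecurityManager(new ExitTrappingSecurityManager)
    // From here on, every security-sensitive operation pays the check.
  }
}
```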
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-65071338
@vanzin I'd still be curious whether you have more details on the exact
performance impact.
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/3484
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64925544
Okay, I'd like to pull this in for rc1, so I'm going to merge this with the
edits that Tom suggested. Feel free to open up additional patches.
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64914293
Mostly looks good. I think we should fix the exit code before pulling it
in.
Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/3484#discussion_r21039416
--- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala ---
@@ -106,10 +106,14 @@ private[spark] class ApplicationMaster(
Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/3484#discussion_r21039238
--- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala ---
@@ -403,42 +407,6 @@ private[spark] class ApplicationMaster(a
Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/3484#discussion_r21039046
--- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala ---
@@ -106,10 +106,14 @@ private[spark] class ApplicationMaster(
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64820187
Actually, reading more of this thread, I'll hold off until you or @andrewor14
signs off.
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64813930
Hey @tgravescs, do you mind if I pull this in for a release candidate? I'm
planning to cut one today. If there is some open discussion in this issue,
though, I can defer it.
Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64791575
So I'm trying to understand exactly when this affects performance. I saw
in the JIRA that you saw it's when you use the Java word count. So is this only
an issue when things do
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64721120
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/23
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64721113
[Test build #23906 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23906/consoleFull)
for PR 3484 at commit
[`21f2502`](https://gith
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64712202
[Test build #23906 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23906/consoleFull)
for PR 3484 at commit
[`21f2502`](https://githu
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64711809
Ok, updated to match what was discussed. Tested with `System.exit(0)` and
`System.exit(1)` (both result in SUCCEEDED app).
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64704937
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/23
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64704750
Ok, until we figure out a way to avoid that, I retract my earlier LGTM,
because this is currently a regression in a different way. Thanks for digging
into this.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64704929
[Test build #23901 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23901/consoleFull)
for PR 3484 at commit
[`4198b3b`](https://gith
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64704472
Yeah, I'm not a fan of that behavior. Let me look at changing it, but I'm also
waiting to hear back from Tom.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64704154
Wait, actually that's pretty bad, right? If I run the same user code from
1.1 that calls `System.exit(0)`, it will now be re-run many times.
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64703510
Actually the current code will fail the app (not succeed) regardless of
what System.exit() says, because of the check on L108... so this actually does
what Sandy suggests.
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64701438
I see. The only thing this changes is the `System.exit(x)` case, meaning
uncaught exceptions will still be properly treated as failures. This LGTM as a
hot fix, though
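For readers following along, here is a sketch of the distinction @andrewor14 draws (illustrative only, not the actual ApplicationMaster code; the thread and status names are hypothetical): user code runs inside a try/catch, so an uncaught exception is observed and recorded as a failure, whereas `System.exit(x)` tears the JVM down without ever returning to that handler.

```scala
// Illustrative sketch only. An uncaught exception propagates to the
// catch block and is recorded as a failure; System.exit() never
// returns control here.
object UserClassRunnerSketch {
  @volatile private var status = "UNDEFINED"

  def main(args: Array[String]): Unit = {
    val userThread = new Thread(new Runnable {
      override def run(): Unit = {
        try {
          // ... invoke the user's main() via reflection here ...
          status = "SUCCEEDED"
        } catch {
          case t: Throwable =>
            status = "FAILED" // uncaught exceptions still fail the app
        }
      }
    }, "Driver")
    userThread.start()
    userThread.join()
    // If user code called System.exit(x), the JVM is already shutting
    // down and this line is never reached; only the exit path changes.
    println(s"final status: $status")
  }
}
```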
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64699874
The main reason I chose this approach is that if we do what you
suggest, a successful app will actually be retried and will eventually fail.
That sounds worse than
Github user sryza commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64699567
If this is behavior that we encourage users to avoid, I would think that
defaulting to failure when System.exit is called would be preferable. Also,
I'm guessing that user
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64697426
cluster UserClass:
`System.exit(1)` - fail, 3 tries
`System.exit(-1)` - fail, 3 tries
These will actually be "success" now. All others should remain the
Github user andrewor14 commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64697009
Hey @vanzin, quick question: which conditions listed in @tgravescs' comment
[here](https://issues.apache.org/jira/browse/SPARK-3627?focusedCommentId=14150013&page=com.atl
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64693947
[Test build #23901 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/23901/consoleFull)
for PR 3484 at commit
[`4198b3b`](https://githu
Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/3484#issuecomment-64693586
@tgravescs @andrewor14 @nishkamravi2 please take a look and let me know if
you have questions.
I tested successful and failed apps in yarn client and cluster modes.
GitHub user vanzin opened a pull request:
https://github.com/apache/spark/pull/3484
[SPARK-4584] [yarn] Remove security manager from Yarn AM.
The security manager adds a lot of overhead to the runtime of the
app, and causes a severe performance regression. Even stubbing out
a
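A minimal sketch of the shape such a change can take, under the assumption (drawn from the discussion above) that exit handling moves out of a SecurityManager and into a JVM shutdown hook; the names below are illustrative, not the actual commit:

```scala
// Illustrative sketch, not the actual commit: with the SecurityManager
// gone, a JVM shutdown hook still runs when user code calls
// System.exit(), so it can decide the final application status.
object ShutdownHookSketch {
  @volatile private var finished = false

  def main(args: Array[String]): Unit = {
    Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
      override def run(): Unit = {
        // Runs on a normal return and on System.exit() alike. If the
        // success flag was never set, report FAILED regardless of the
        // numeric exit code.
        val status = if (finished) "SUCCEEDED" else "FAILED"
        println(s"unregistering with final status: $status")
      }
    }))

    // ... run the user's main() here; it may call System.exit() ...
    finished = true // reached only if user code returns normally
  }
}
```

The design trade-off debated in the thread is exactly what such a hook should report when user code exits early: fail by default (sryza's suggestion) risks retrying and eventually failing apps that exited deliberately, while succeeding by default can mask genuine failures.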