Github user KaiXinXiaoLei commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-163186959
ok. thanks.
Github user KaiXinXiaoLei closed the pull request at:
https://github.com/apache/spark/pull/10157
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162853865
@KaiXinXiaoLei do you mind closing this PR?
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/10157#discussion_r46788170
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -177,7 +177,7 @@ private[deploy] class SparkSubmitArguments(args:
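The diff body and the inline comment are truncated in this archive. Purely as an illustrative sketch (not the actual patch), a change at that point in SparkSubmitArguments.loadEnvironmentArguments could fall back to the environment variable when resolving the executor count; env and sparkProperties below are assumed to be the existing fields of that class:

    // Illustrative sketch only -- the real diff is truncated above and may differ.
    // `env` is the Map[String, String] given to SparkSubmitArguments (sys.env by default),
    // `sparkProperties` holds values loaded from --conf and the properties file.
    numExecutors = Option(numExecutors)
      .orElse(sparkProperties.get("spark.executor.instances"))
      .orElse(env.get("SPARK_EXECUTOR_INSTANCES"))
      .orNull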
Github user jerryshao commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162415650
Looks like this is a hidden environment variable; do we need to support it
explicitly?
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162194399
It looks like this is only for YARN client mode. I actually think the
reference in `SparkConf` should be removed. It's not otherwise documented, so
no I don't think we s
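For context on the YARN-only behaviour mentioned above, here is a rough, non-verbatim sketch of how the YARN backend of that era resolved the initial executor count: an explicit spark.executor.instances setting wins, then the SPARK_EXECUTOR_INSTANCES environment variable, then a built-in default (commonly 2, which would line up with the reporter seeing two executors when the variable is not picked up):

    import org.apache.spark.SparkConf

    // Rough sketch, not verbatim Spark code: spark.executor.instances wins,
    // then SPARK_EXECUTOR_INSTANCES, then a built-in default.
    def initialExecutorCount(conf: SparkConf, default: Int = 2): Int = {
      val fromEnv = sys.env.get("SPARK_EXECUTOR_INSTANCES").map(_.toInt).getOrElse(default)
      conf.getInt("spark.executor.instances", fromEnv)
    }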
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162173968
Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162173970
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162173861
**[Test build #47220 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47220/consoleFull)**
for PR 10157 at commit `2860c8a`.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10157#issuecomment-162168151
**[Test build #47220 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47220/consoleFull)**
for PR 10157 at commit `2860c8a`.
GitHub user KaiXinXiaoLei opened a pull request:
https://github.com/apache/spark/pull/10157
[SPARK-12156] Make SPARK_EXECUTOR_INSTANCES become effective
I set SPARK_EXECUTOR_INSTANCES=3, but only two executors start. That is,
SPARK_EXECUTOR_INSTANCES does not take effect.
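Whatever happens to the environment variable, the executor count can also be requested explicitly through configuration; a minimal sketch (assuming YARN, where spark.executor.instances applies; the app name is hypothetical):

    import org.apache.spark.SparkConf

    // Setting spark.executor.instances directly (equivalent to passing
    // --num-executors 3 or --conf spark.executor.instances=3 to spark-submit)
    // avoids relying on SPARK_EXECUTOR_INSTANCES being read at all.
    val conf = new SparkConf()
      .setAppName("example")  // hypothetical app name
      .set("spark.executor.instances", "3")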