Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50565225
QA results for PR 952:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds no public classes
For more information see test output: https://amplab.cs.
Github user CrazyJvm commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50562260
OK, so I will close this PR and send another patch for the guide. Thanks for the discussion.
---
Github user CrazyJvm closed the pull request at:
https://github.com/apache/spark/pull/952
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50561997
QA tests have started for PR 952. This patch merges cleanly. View
progress:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17400/consoleFull
---
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50513331
Oh, I see. It would be better if you sent a patch to the guide then -- just
tell users to add this stuff into the .conf file.
---
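(For reference, conf/spark-defaults.conf takes one property per line, with the key and
value separated by whitespace. The exact properties under discussion in PR 952 are not
quoted in this thread, so the entries below are only an illustrative sketch; the master
host and memory value are assumptions:

    spark.master             spark://master.example.com:7077
    spark.executor.memory    2g

spark-shell and spark-submit read this file from the conf directory on the machine where
they are launched.)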
Github user CrazyJvm commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50458866
@mateiz Yes, I agree. I was motivated by
http://spark.apache.org/docs/latest/spark-standalone.html, which says "Note
that if you are running spark-shell from one
Github user mateiz commented on the pull request:
https://github.com/apache/spark/pull/952#issuecomment-50429937
Wouldn't this be fixed by having conf/spark-defaults.conf set correctly on
each cluster node? I don't think we should look at the environment here; we can
just recommend to
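(Sketching the recommendation above concretely: with conf/spark-defaults.conf present and
identical on every cluster node, spark-shell picks the settings up on whichever machine it
is launched from, without reading environment variables. A minimal illustration, assuming
the conf file entries sketched earlier in this thread:

    # run from any cluster node; spark.master and spark.executor.memory
    # are taken from conf/spark-defaults.conf, not from the environment
    ./bin/spark-shell
)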