[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-27 Thread mengxr
Github user mengxr commented on the issue: https://github.com/apache/spark/pull/17423 LGTM. Merged into master. The failed tests are unrelated to this PR; they were fixed in https://github.com/apache/spark/commit/a2ce0a2e309e70d74ae5d2ed203f7919a0f79397.

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-26 Thread yhuai
Github user yhuai commented on the issue: https://github.com/apache/spark/pull/17423 got it. Thanks :)

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-26 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17423 sure, that's just the context around it

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-25 Thread yhuai
Github user yhuai commented on the issue: https://github.com/apache/spark/pull/17423 @felixcheung `SparkContext.getOrCreate` is the preferred way to create a SparkContext. So, even though we have the check on the R side, it is still better to use `getOrCreate`.
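For reference, a minimal Scala sketch of the pattern being discussed, assuming a simplified helper (the name `createSparkContext` and its parameters here are illustrative, not the exact signature touched by the PR): `SparkContext.getOrCreate` reuses an already running context instead of constructing a second one.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkRBackendSketch {
  // Hypothetical helper mirroring the discussion: prefer SparkContext.getOrCreate
  // over `new SparkContext(conf)` so that a repeated call returns the context
  // that is already running rather than failing, since only one active
  // SparkContext is allowed per JVM.
  def createSparkContext(master: String, appName: String): SparkContext = {
    val conf = new SparkConf().setMaster(master).setAppName(appName)
    // Returns the active SparkContext if one exists, otherwise creates a new one.
    SparkContext.getOrCreate(conf)
  }
}
```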

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-25 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17423 this is already checked on the R side, and we should never call `createSparkContext` more than once.

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-25 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/17423 (I just happened to notice this does not trigger the AppVeyor build. Let me leave a build that I triggered for this with my account.) Build started: [SparkR] `ALL`

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-24 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17423 Merged build finished. Test FAILed.

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-24 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17423 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/75193/

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-24 Thread SparkQA
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/17423 **[Test build #75193 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/75193/testReport)** for PR 17423 at commit

[GitHub] spark issue #17423: [SPARK-20088] Do not create new SparkContext in SparkR c...

2017-03-24 Thread SparkQA
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/17423 **[Test build #75193 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/75193/testReport)** for PR 17423 at commit