[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Thanks, I'll try to kick off the win-builder build soon (I'm out of town till tomorrow). One more thing we might need to fix is that win-builder has a 10 or 20 minute time limit for tests (not sure

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-23 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17966 Merged to master/2.2. I think we should still check win-builder. Also, it's a bit hard to tell whether the skipped tests are actually being skipped - we might want to follow up with a trace.
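A sketch of how such a trace might look (an assumption on the editor's part, not code from this PR): wrap testthat::skip() so each skip is reported with an explicit reason. It reuses is_cran(), is_windows() and hadoop_home_set(), the helper names proposed in the 2017-05-12 messages further down this digest.

```
# Sketch only: skip Hadoop-dependent tests on a CRAN Windows check and make
# the skip visible in the testthat reporter output via an explicit message.
skip_on_cran_windows_without_hadoop <- function() {
  if (is_cran() && is_windows() && !hadoop_home_set()) {
    testthat::skip("Hadoop winutils not available on CRAN Windows")
  }
}
```

Calling this at the top of each affected test_that() block would cause the skip, and its reason, to appear in the test summary rather than passing silently.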

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-22 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17966 testing now - could you submit to https://win-builder.r-project.org?

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @felixcheung Unfortunately I'm out traveling and haven't been able to do the Windows tests yet -- would you have a chance to do that? Also, what are your thoughts on merging this while we test

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17966 Merged build finished. Test PASSed.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread SparkQA
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/17966 **[Test build #77114 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77114/testReport)** for PR 17966 at commit

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17966 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77114/

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread SparkQA
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/17966 **[Test build #77114 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77114/testReport)** for PR 17966 at commit

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @felixcheung I made the change - I'm now going to test this in my Windows VM and will update this PR with the results.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Sorry, I've been out traveling -- I'll try to update this by tonight.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17966 @shivaram have you had a chance to work on this again?

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17966 So I'd propose this:

```
is_cran <- function() {
  !identical(Sys.getenv("NOT_CRAN"), "true")
}

is_windows <- function() {
  .Platform$OS.type == "windows"
}

hadoop_home_set <-
```
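The archive truncates the snippet at `hadoop_home_set <-`. A hypothetical completion, assuming the HADOOP_HOME check discussed in the surrounding 2017-05-12 comments, could look like this (not necessarily the PR's verbatim code):

```
# Hypothetical completion: treat Hadoop as usable when HADOOP_HOME is set.
hadoop_home_set <- function() {
  !identical(Sys.getenv("HADOOP_HOME"), "")
}

# Illustrative combined guard matching the PR title: run Hadoop-dependent
# tests unless this is a CRAN check on Windows without HADOOP_HOME.
can_run_hadoop_tests <- function() {
  !is_cran() || !is_windows() || hadoop_home_set()
}
```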

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread felixcheung
Github user felixcheung commented on the issue: https://github.com/apache/spark/pull/17966 Probably - but how to check for Hadoop? See if HADOOP_HOME is set? We don't need to set that on *nix though, I think.
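A minimal sketch of that idea (an assumption, not the PR's final code): require HADOOP_HOME only on Windows, so the check is a no-op on *nix.

```
# TRUE only when we are on Windows and HADOOP_HOME is missing, i.e. the one
# case where Hadoop-dependent tests would fail for lack of winutils.exe.
windows_missing_hadoop_home <- function() {
  .Platform$OS.type == "windows" && identical(Sys.getenv("HADOOP_HOME"), "")
}
```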

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/17966 Just FYI, closing and reopening a PR is a workaround to re-trigger the build in AppVeyor, since (I assume) none of us currently has permission via the AppVeyor Web UI.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread HyukjinKwon
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/17966 Thank you for cc'ing me. I think it is primarily because a single AppVeyor account is shared across several Apache projects, but to my knowledge only one concurrent job is allowed. So, it

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @HyukjinKwon Do we know why things sometimes queue for a long time on AppVeyor? This PR has been queued for around 5 hours right now.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Actually, thinking more about this, I think we should be checking for the availability of the `hadoop` library / binaries rather than `is_cran`. For example, I just found that win-builder only runs `R CMD
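A sketch of that alternative (an assumption, not code from this PR): probe for the Hadoop binaries directly instead of relying on the NOT_CRAN environment variable.

```
# Look for the binaries Spark actually needs: winutils.exe under
# HADOOP_HOME/bin on Windows, or a `hadoop` executable on the PATH elsewhere.
hadoop_binaries_available <- function() {
  if (.Platform$OS.type == "windows") {
    home <- Sys.getenv("HADOOP_HOME")
    nzchar(home) && file.exists(file.path(home, "bin", "winutils.exe"))
  } else {
    nzchar(Sys.which("hadoop"))
  }
}
```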

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17966 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/76876/

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/17966 Merged build finished. Test PASSed.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread SparkQA
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/17966 **[Test build #76876 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/76876/testReport)** for PR 17966 at commit

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 This is SPARK-20727 - I just happened to have the other JIRA also open and pasted it incorrectly.