[GitHub] spark issue #22589: [SPARK-25572][SPARKR] test only if not cran

2018-09-29 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/22589 LGTM. Thanks @felixcheung --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail

[GitHub] spark issue #21920: [SPARK-24956][BUILD][FOLLOWUP] Upgrade Maven version to ...

2018-07-30 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21920 LGTM. Thanks @HyukjinKwon

[GitHub] spark issue #21666: [SPARK-24535][SPARKR] fix tests on java check error

2018-07-06 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21666 Thanks @felixcheung and sorry for the delay in looking at this. I think the fix looks good. Overall it looks like we need to use system2 for the java version check as otherwise it runs inside
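The `system2` point above can be sketched roughly as follows. This is an illustrative guess at the approach being discussed, not the actual SparkR implementation; the function name and parsing details are assumptions:

```r
# Hypothetical sketch: why system2 (rather than system) for the Java check.
# `java -version` writes its output to stderr, and system2 lets us capture
# stderr directly into an R character vector in the current process.
checkJavaVersionSketch <- function() {
  out <- system2("java", "-version", stdout = TRUE, stderr = TRUE)
  # A typical first line looks like: java version "1.8.0_181"
  versionLine <- grep("version", out, value = TRUE)[1]
  versionLine
}
```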

[GitHub] spark pull request #21666: [SPARK-24535][SPARKR] fix tests on java check err...

2018-06-29 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21666#discussion_r199224949 --- Diff: R/pkg/R/client.R --- @@ -61,6 +61,11 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, p

[GitHub] spark issue #21338: [SPARK-23601][build][follow-up] Keep md5 checksums for n...

2018-05-15 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21338 Can we check this with the appropriate Apache group (is it infra?)? It seems odd that the policy would require removing them when Nexus requires them

[GitHub] spark issue #21338: [SPARK-23601][build][follow-up] Keep md5 checksums for n...

2018-05-15 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21338 If I follow this correctly, this is a partial revert only for the Nexus artifacts ?

[GitHub] spark issue #21314: [SPARK-24263][R] SparkR java check breaks with openjdk

2018-05-14 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21314 Yes @felixcheung or @vanzin can you merge this ?

[GitHub] spark pull request #21314: [SPARK-24263][R] SparkR java check breaks with op...

2018-05-13 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21314#discussion_r187816811 --- Diff: R/pkg/R/client.R --- @@ -82,7 +82,7 @@ checkJavaVersion <- function() { }) javaVersionFilter <-

[GitHub] spark issue #21315: [SPARK-23780][R] Failed to use googleVis library with ne...

2018-05-13 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21315 LGTM. Let's wait till #21314 is merged ?

[GitHub] spark pull request #21314: [SPARK-24263][R] SparkR java check breaks with op...

2018-05-13 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21314#discussion_r187815762 --- Diff: R/pkg/R/client.R --- @@ -82,7 +82,7 @@ checkJavaVersion <- function() { }) javaVersionFilter <-

[GitHub] spark pull request #21314: [SPARK-24263][R] SparkR java check breaks with op...

2018-05-13 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21314#discussion_r187815145 --- Diff: R/pkg/R/client.R --- @@ -82,7 +82,7 @@ checkJavaVersion <- function() { }) javaVersionFilter <-

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-11 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 Merging this to master and branch-2.3

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-10 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187470922 --- Diff: R/pkg/R/client.R --- @@ -60,13 +60,48 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, pack combinedA

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-10 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187407193 --- Diff: R/pkg/DESCRIPTION --- @@ -13,6 +13,7 @@ Authors@R: c(person("Shivaram", "Venkataraman", role = c("aut", "

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-10 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187405804 --- Diff: R/pkg/R/client.R --- @@ -60,13 +60,48 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, pack combinedA

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-10 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187405812 --- Diff: R/pkg/R/client.R --- @@ -60,13 +60,48 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, pack combinedA

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 Looking at http://r-pkgs.had.co.nz/description.html - `... the SystemRequirements field. But this is just a plain text field and is not automatically checked.` I think using `== 8` is probably
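For context, the `SystemRequirements` field discussed above is a free-text line in the package DESCRIPTION file that CRAN does not machine-check (hence the separate runtime check). A plausible form of the change under discussion would be the fragment below; the exact wording in the merged PR may differ:

```
SystemRequirements: Java (== 8)
```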

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187169358 --- Diff: R/pkg/R/client.R --- @@ -60,13 +60,39 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, pack combinedA

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187169209 --- Diff: R/pkg/R/client.R --- @@ -60,13 +60,39 @@ generateSparkSubmitArgs <- function(args, sparkHome, jars, sparkSubmitOpts, pack combinedA

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187169097 --- Diff: R/pkg/R/sparkR.R --- @@ -163,6 +163,10 @@ sparkR.sparkContext <- function( submitOps <- getClientModeSparkSubm

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/21278#discussion_r187168952 --- Diff: R/pkg/R/utils.R --- @@ -756,7 +756,7 @@ launchScript <- function(script, combinedArgs, wait = FALSE) { # stdout = F means disc

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 Ah I know the problem with the vignettes - if you have _JAVA_OPTIONS set then the line numbers change, i.e. the output looks like ``` shivaram@localhost ~ » export _JAVA_OPTIONS="
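The _JAVA_OPTIONS issue above comes from the JVM printing an extra `Picked up _JAVA_OPTIONS: ...` line on stderr before the version string, which shifts any hard-coded line index. One defensive way to parse (an illustration, not the actual fix in the PR) is to locate the version line by pattern rather than by position:

```r
# Illustrative sketch: find the version line by pattern so that an extra
# "Picked up _JAVA_OPTIONS: ..." line does not break the parse.
parseJavaVersion <- function(lines) {
  versionLine <- grep("version", lines, value = TRUE)[1]
  # e.g. 'java version "1.8.0_181"' -> "1.8.0_181"
  sub('.*"([^"]+)".*', "\\1", versionLine)
}

parseJavaVersion(c('Picked up _JAVA_OPTIONS: -Xmx512m',
                   'java version "1.8.0_181"'))
```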

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-09 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 Ah got it - Thanks @HyukjinKwon . I'll check if `== 1.8` is supported by R syntax. @felixcheung I moved the logic into a `checkJavaVersion` function now. Let me know if this looks better

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-08 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 That's a fair question -- I initially created a script to handle Windows calls but I think we can do some of the split stuff inside R. Let me try that out. Regarding Java 9, do you

[GitHub] spark issue #21278: [SPARKR] Require Java 8 for SparkR

2018-05-08 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/21278 The need for both the Requirements field and the runtime check is documented at https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Writing-portable-packages (Search for `Make sure

[GitHub] spark pull request #21278: [SPARKR] Require Java 8 for SparkR

2018-05-08 Thread shivaram
GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/21278 [SPARKR] Require Java 8 for SparkR This change updates the SystemRequirements and also includes a runtime check if the JVM is being launched by R. The runtime check is done by querying `java

[GitHub] spark issue #20770: [SPARK-23626][CORE] DAGScheduler blocked due to JobSubmi...

2018-03-08 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20770 @AjithShetty2489 I'm not sure just changing these two maps is sufficient ? For example createResultStage could in turn create all the parent stages and the parent stages could be ShuffleMapStage

[GitHub] spark issue #20770: [SPARK-23626][CORE] DAGScheduler blocked due to JobSubmi...

2018-03-08 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20770 cc @kayousterhout @markhamstra

[GitHub] spark issue #20464: [SPARK-23291][SQL][R] R's substr should not reduce start...

2018-02-07 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20464 I think @felixcheung has the most context here, so I'd suggest we wait for his comments.

[GitHub] spark issue #20464: [SPARK-23291][SQL][R] R's substr should not reduce start...

2018-01-31 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20464 Thanks for clarifying @viirya. Is the PR description accurate ? I read it as `..SQL's substr also accepts zero-based starting position` while R uses a 1-based starting position

[GitHub] spark issue #20464: [SPARK-23291][SQL][R] R's substr should not reduce start...

2018-01-31 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20464 One thing to keep in mind is what the user's perception of the API is. If R users are going to use 1-based indexing then this might not be the right fix ? http://stat.ethz.ch/R-manual/R-devel

[GitHub] spark issue #20414: [SPARK-23243][SQL] Shuffle+Repartition on an RDD could l...

2018-01-28 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20414 @jiangxb1987 @mridulm Could we have a special case of using the sort-based approach when the RDD type is comparable ? I think that should cover a bunch of the common cases and the hash version

[GitHub] spark issue #20393: [SPARK-23207][SQL] Shuffle+Repartition on a DataFrame co...

2018-01-26 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20393 I'm fine with merging this -- I just don't want this issue to be forgotten for RDDs as I think it's a major correctness issue. @mridulm @sameeragarwal Let's continue the discussion

[GitHub] spark issue #20393: [SPARK-23207][SQL] Shuffle+Repartition on an RDD/DataFra...

2018-01-25 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20393 @sameeragarwal I think we should wait for the RDD fix for 2.3 as well ?

[GitHub] spark issue #20393: [SPARK-23207][SQL] Shuffle+Repartition on an RDD/DataFra...

2018-01-25 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20393 @jiangxb1987 If I'm not wrong this problem will also happen with RDD repartition ? Will this fix also cover

[GitHub] spark issue #20352: [SPARK-21727][R] Allow multi-element atomic vector as co...

2018-01-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20352 Thanks @neilalex - Change LGTM. Let's also see if @felixcheung has any comments.

[GitHub] spark issue #20352: [SPARK-21727][R] Allow multi-element atomic vector as co...

2018-01-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20352 @neilalex Can you add the code snippet in the PR description as a new test case ? That way we will ensure this behavior is tested going forward

[GitHub] spark issue #20352: [SPARK-21727][R] Allow multi-element atomic vector as co...

2018-01-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20352 Jenkins, ok to test

[GitHub] spark issue #19290: [SPARK-22063][R] Fixes lint check failures in R by lates...

2018-01-09 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19290 The minimum R version supported is something that we can revisit though. I think we do this for Python and Java versions as well in the project

[GitHub] spark issue #20118: [SPARK-22924][SPARKR] R API for sortWithinPartitions

2017-12-30 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20118 LGTM. I think withinPartitions sounds better than global

[GitHub] spark issue #20060: [SPARK-22889][SPARKR] Set overwrite=T when install Spark...

2017-12-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/20060 cc @felixcheung

[GitHub] spark pull request #20060: [SPARK-22889][SPARKR] Set overwrite=T when instal...

2017-12-22 Thread shivaram
GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/20060 [SPARK-22889][SPARKR] Set overwrite=T when install SparkR in tests ## What changes were proposed in this pull request? Since all CRAN checks go through the same machine

[GitHub] spark pull request #19959: [SPARK-22766] Install R linter package in spark l...

2017-12-22 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19959#discussion_r158558712 --- Diff: dev/lint-r.R --- @@ -27,10 +27,11 @@ if (! library(SparkR, lib.loc = LOCAL_LIB_LOC, logical.return = TRUE)) { # Installs lintr from Github

[GitHub] spark pull request #19959: [SPARK-22766] Install R linter package in spark l...

2017-12-14 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19959#discussion_r157028493 --- Diff: dev/lint-r.R --- @@ -27,10 +27,11 @@ if (! library(SparkR, lib.loc = LOCAL_LIB_LOC, logical.return = TRUE)) { # Installs lintr from Github

[GitHub] spark pull request #19959: [SPARK-22766] Install R linter package in spark l...

2017-12-13 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19959#discussion_r156808529 --- Diff: dev/lint-r.R --- @@ -27,10 +27,11 @@ if (! library(SparkR, lib.loc = LOCAL_LIB_LOC, logical.return = TRUE)) { # Installs lintr from Github

[GitHub] spark pull request #19959: [SPARK-22766] Install R linter package in spark l...

2017-12-12 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19959#discussion_r156564046 --- Diff: dev/lint-r.R --- @@ -27,10 +27,11 @@ if (! library(SparkR, lib.loc = LOCAL_LIB_LOC, logical.return = TRUE)) { # Installs lintr from Github

[GitHub] spark issue #19657: [SPARK-22344][SPARKR] clean up install dir if running te...

2017-11-07 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19657 AppVeyor still has an error ``` 1. Failure: traverseParentDirs (@test_utils.R#252) - `dirs` not equal to `expect`. 1/4 mismatches x[1]: "c:\\

[GitHub] spark issue #19657: [SPARK-22344][SPARKR] clean up install dir if running te...

2017-11-07 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19657 Thanks @felixcheung -- The AppVeyor test seems to have failed with the following error ``` 1. Failure: traverseParentDirs (@test_utils.R#255) - `dirs

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-07 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r149436164 --- Diff: R/pkg/tests/fulltests/test_utils.R --- @@ -236,4 +236,23 @@ test_that("basenameSansExtFromUrl", { expect_equal(basenameSansEx

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-06 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r149174223 --- Diff: R/pkg/R/install.R --- @@ -152,6 +152,9 @@ install.spark <- function(hadoopVersion = "2.7", mir

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-06 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r149140297 --- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd --- @@ -1183,3 +1183,24 @@ env | map ```{r, echo=FALSE} sparkR.session.stop

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-05 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r148979368 --- Diff: R/pkg/R/install.R --- @@ -152,6 +152,9 @@ install.spark <- function(hadoopVersion = "2.7", mir

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-05 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r148979827 --- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd --- @@ -1183,3 +1183,24 @@ env | map ```{r, echo=FALSE} sparkR.session.stop

[GitHub] spark pull request #19657: [SPARK-22344][SPARKR] clean up install dir if run...

2017-11-05 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19657#discussion_r148979924 --- Diff: R/pkg/tests/run-all.R --- @@ -60,3 +60,22 @@ if (identical(Sys.getenv("NOT_CRAN"), "true")) {

[GitHub] spark pull request #19624: [SPARKR][SPARK-22315] Warn if SparkR package vers...

2017-11-01 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19624#discussion_r148320092 --- Diff: R/pkg/R/sparkR.R --- @@ -420,6 +420,18 @@ sparkR.session <- function( enableHiveSupport) ass

[GitHub] spark pull request #19624: [SPARKR][SPARK-22315] Warn if SparkR package vers...

2017-10-31 Thread shivaram
GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/19624 [SPARKR][SPARK-22315] Warn if SparkR package version doesn't match SparkContext ## What changes were proposed in this pull request? This PR adds a check between the R package version

[GitHub] spark issue #19589: [SPARKR][SPARK-22344] Set java.io.tmpdir for SparkR test...

2017-10-29 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19589 Merging to master, branch-2.2 and branch-2.1

[GitHub] spark issue #19589: [SPARKR][SPARK-22344] Set java.io.tmpdir for SparkR test...

2017-10-27 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19589 cc @felixcheung -- It'll be great if you could independently test this as well !

[GitHub] spark pull request #19589: [SPARKR][SPARK-22344] Set java.io.tmpdir for Spar...

2017-10-27 Thread shivaram
GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/19589 [SPARKR][SPARK-22344] Set java.io.tmpdir for SparkR tests This PR sets the java.io.tmpdir for CRAN checks and also disables the hsperfdata for the JVM when running CRAN checks. Together

[GitHub] spark issue #19557: [SPARK-22281][SPARKR] Handle R method breaking signature...

2017-10-25 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19557 LGTM.

[GitHub] spark pull request #19557: [SPARK-22281][SPARKR][WIP] Handle R method breaki...

2017-10-24 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19557#discussion_r146760035 --- Diff: R/pkg/R/DataFrame.R --- @@ -3249,9 +3249,12 @@ setMethod("as.data.frame", #' @note attach since 1.6.0 setMeth

[GitHub] spark issue #19557: [SPARK-22281][SPARKR][WIP] Handle R method breaking sign...

2017-10-24 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19557 @felixcheung Are the docs on this version good ?

[GitHub] spark issue #19557: [SPARK-22281][SPARKR] Handle R method breaking signature...

2017-10-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19557 Is there a reason we can't use the same glm trick for attach ? I guess this was explained above but I'm wondering if there is a reason base::attach is not compiled in the same way

[GitHub] spark issue #19550: [SPARK-22327][SPARKR][TEST][BACKPORT-2.0] check for vers...

2017-10-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19550 LGTM. Thanks @felixcheung

[GitHub] spark issue #19514: [SPARK-21551][Python] Increase timeout for PythonRDD.ser...

2017-10-20 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19514 Good point. I'm not sure it counteracts it completely. We should run it to see the behavior I guess. I am not a big fan of mucking with Jenkins versions because it fundamentally looks

[GitHub] spark issue #19514: [SPARK-21551][Python] Increase timeout for PythonRDD.ser...

2017-10-20 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19514 We didn't foresee this but it looks like `R CMD check --as-cran` throws this error if we try to build a package with a version number older than the one uploaded to CRAN

[GitHub] spark pull request #19342: [MINOR][SparkR] minor fixes for CRAN compliance

2017-09-25 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19342#discussion_r140920250 --- Diff: R/pkg/R/DataFrame.R --- @@ -3250,6 +3250,7 @@ setMethod("attach", function(what, pos = 2, name = deparse(subst

[GitHub] spark issue #19290: [WIP][SPARK-22063][R] Upgrades lintr to latest commit sh...

2017-09-20 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19290 @HyukjinKwon Thanks for looking at this. The 5 min addition seems unfortunate though -- does that also happen with lintr-1.0.1 ? I wonder if we are seeing some specific performance slowdown

[GitHub] spark issue #19111: [SPARK-21801][SPARKR][TEST][WIP] set random seed for pre...

2017-09-05 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19111 Should we set this before the call to `test_package` ? It'll be good to have it for the CRAN tests as well
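Concretely, the suggestion above would amount to something like the following in the test runner script (a sketch under assumptions; the actual seed value and placement in `R/pkg/tests/run-all.R` may differ):

```r
# Fix the RNG seed before invoking the test runner so that runs of the
# test suite (including CRAN checks) are reproducible.
set.seed(42)
testthat::test_package("SparkR")
```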

[GitHub] spark issue #19016: [SPARK-21805][SPARKR] Disable R vignettes code on Window...

2017-08-25 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19016 That's great ! I will also run this by winbuilder later today. --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does

[GitHub] spark issue #19016: [SPARK-21805][SPARKR] Disable R vignettes code on Window...

2017-08-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19016 Sure - change LGTM. Let's see if @HyukjinKwon has any more comments ? If not we can merge to master, branch-2.2 and then do some more tests.

[GitHub] spark pull request #19016: [SPARK-21805][SPARKR] Disable R vignettes code on...

2017-08-23 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/19016#discussion_r134815709 --- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd --- @@ -27,6 +27,17 @@ vignette: > limitations under the License. --> +```{r

[GitHub] spark issue #19016: [SPARK-21805][SPARKR] Disable R vignettes code on Window...

2017-08-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19016 Ah I see. Yeah the failed tests make sense. We can also try to submit a custom tar.gz to r-hub to test it with the PDF and a different version number ?

[GitHub] spark issue #19016: [SPARK-21805][SPARKR] Disable R vignettes code on Window...

2017-08-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/19016 Thanks @felixcheung ! Are the warnings about the missing PDF unavoidable ? I see something like ``` checking package vignettes in 'inst/doc' ... WARNING Package vignette without

[GitHub] spark pull request #15471: [SPARK-17919] Make timeout to RBackend configurab...

2017-07-19 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/15471#discussion_r128301290 --- Diff: R/pkg/R/backend.R --- @@ -108,13 +108,27 @@ invokeJava <- function(isStatic, objId, methodName, ...) { conn <- get("

[GitHub] spark issue #18465: [SPARK-21093][R] Terminate R's worker processes in the p...

2017-06-30 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18465 @felixcheung are these failures happening from the gapply tests ? Also do we have a way to map the error code to an error reason ?

[GitHub] spark issue #14431: [SPARK-16258][SparkR] Automatically append the grouping ...

2017-06-30 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/14431 Compared to introducing a new API, I think @falaki's idea of adding a non-default option is better

[GitHub] spark issue #18465: [SPARK-21093][R] Terminate R's worker processes in the p...

2017-06-29 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18465 Thanks @HyukjinKwon - I will try to look at this later tonight

[GitHub] spark issue #15821: [SPARK-13534][PySpark] Using Apache Arrow to increase pe...

2017-06-26 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/15821 cc @shaneknapp

[GitHub] spark issue #18320: [SPARK-21093][R] Terminate R's worker processes in the p...

2017-06-25 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18320 LGTM. The update looks good. Thanks for the thorough testing. We could also post a note on the dev list about this change and especially ask people who use `dapply` or `gapply` or the old `RDD

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-22 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r123564501 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -30,8 +30,40 @@ port <- as.integer(Sys.getenv("SPARKR_WORKER_PORT")) inputCon <-

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-22 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r123569536 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -30,8 +30,40 @@ port <- as.integer(Sys.getenv("SPARKR_WORKER_PORT")) inputCon <-

[GitHub] spark issue #18320: [SPARK-21093][R] Terminate R's worker processes in the p...

2017-06-21 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18320 Thanks ! LGTM. Let's also wait to see if @felixcheung has anything more

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-19 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r122875289 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -30,8 +30,42 @@ port <- as.integer(Sys.getenv("SPARKR_WORKER_PORT")) inputCon <-

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-19 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r122847863 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -31,7 +31,15 @@ inputCon <- socketConnection( port = port, open = "rb", blocking =

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-19 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r122804718 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -31,7 +31,30 @@ inputCon <- socketConnection( port = port, open = "rb", blocking =

[GitHub] spark pull request #18320: [SPARK-21093][R] Terminate R's worker processes i...

2017-06-19 Thread shivaram
Github user shivaram commented on a diff in the pull request: https://github.com/apache/spark/pull/18320#discussion_r122803981 --- Diff: R/pkg/inst/worker/daemon.R --- @@ -31,7 +31,15 @@ inputCon <- socketConnection( port = port, open = "rb", blocking =

[GitHub] spark issue #14431: [SPARK-16258][SparkR] Automatically append the grouping ...

2017-06-19 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/14431 AFAIK this was dependent on #14742, but @NarineK may know better

[GitHub] spark issue #18104: [SPARK-20877][SPARKR][WIP] add timestamps to test runs

2017-05-30 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18104 LGTM. Thanks @felixcheung for the update and @marmbrus for the ping

[GitHub] spark issue #18104: [SPARK-20877][SPARKR][WIP] add timestamps to test runs

2017-05-28 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/18104 @felixcheung This is very cool. Let me try this on a windows VM and winbuilder and get back to you.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-23 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Thanks, I'll try to kick off the winbuilder build soon (I'm out of town till tomorrow). One more thing we might need to fix is that winbuilder has a 10 or 20 minute time limit for tests (not sure

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-22 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @felixcheung Unfortunately I'm out traveling and haven't been able to do the windows tests yet -- would you have a chance to do that? Also what are your thoughts on merging this while we test

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @felixcheung I made the change - I'm right now going to test this in my Windows VM. Will update this PR with the results

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-19 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Sorry I've been out traveling -- I'll try to update this by tonight

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 @HyukjinKwon Do we know why things sometimes queue for a long time on AppVeyor? Like this PR has been queued for around 5 hours right now.

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Actually thinking more about this, I think we should be checking for availability of `hadoop` library / binaries rather than `is_cran`. For example I just found that win-builder only runs `R CMD
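The suggestion above -- gating Hadoop-dependent tests on whether Hadoop is actually available rather than on an `is_cran` flag -- could be probed roughly as follows. This is only an illustrative shell sketch; the actual SparkR check would live in the R test helpers, and the command names here are assumptions:

```shell
# Hypothetical sketch: detect whether the hadoop binary is on PATH and
# branch accordingly, instead of keying the skip off an is_cran check.
# Illustrative only; not the actual SparkR test code.
if command -v hadoop >/dev/null 2>&1; then
  echo "hadoop found: running Hadoop-dependent tests"
else
  echo "hadoop not found: skipping Hadoop-dependent tests"
fi
```

The advantage of probing for the binary directly is that it also covers environments like win-builder, which are not CRAN itself but may equally lack a Hadoop installation.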

[GitHub] spark issue #17966: [SPARK-20727] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 This is SPARK-20727 - I just happened to have the other JIRA also open and pasted it incorrectly

[GitHub] spark issue #17966: [SPARK-20666] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 Sorry @vanzin I got the wrong JIRA number. Fixing it now

[GitHub] spark issue #17966: [SPARK-20666] Skip tests that use Hadoop utils on CRAN W...

2017-05-12 Thread shivaram
Github user shivaram commented on the issue: https://github.com/apache/spark/pull/17966 cc @felixcheung FWIW it might be easier to view the diff by adding `w=1` to the URL, i.e. https://github.com/apache/spark/pull/17966/files?w=1

[GitHub] spark pull request #17966: [SPARK-20666] Skip tests that use Hadoop utils on...

2017-05-12 Thread shivaram
GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/17966 [SPARK-20666] Skip tests that use Hadoop utils on CRAN Windows ## What changes were proposed in this pull request? This change skips tests that use the Hadoop libraries while running
