[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-14 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/16720





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-14 Thread felixcheung
Github user felixcheung closed the pull request at:

https://github.com/apache/spark/pull/16720





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-14 Thread felixcheung
GitHub user felixcheung reopened a pull request:

https://github.com/apache/spark/pull/16720

[SPARK-19387][SPARKR] Tests do not run with SparkR source package in CRAN 
check

## What changes were proposed in this pull request?

- this is caused by the changes in SPARK-18444 and SPARK-18643, after which we no longer 
install Spark when `master = ""` (the default); it is also related to SPARK-18449, 
since the real `master` value is not known at the time the R code in 
`sparkR.session` runs (`master` cannot default to "local" because it could be 
overridden by the spark-submit command line or Spark config)
- as a result, while running SparkR as a package in an IDE works fine, the 
CRAN check does not, since it launches SparkR via a non-interactive script
- the fix is to add a check to the beginning of each test file and to the vignettes 
(see the sketch below); the same would also work by changing `sparkR.session()` to 
`sparkR.session(master = "local")` in the tests, but I think being more explicit is better.

## How was this patch tested?

Tested this by reverting the version to 2.1, since the check needs to download a 
release jar with a matching version. But because there are changes in 2.2 
(specifically around SparkR ML) that are incompatible with 2.1, some tests fail 
in this configuration. This will need to be ported to branch-2.1 and retested 
with the 2.1 release jar.

Tested manually as:
```
# modify DESCRIPTION to revert version to 2.1.0
SPARK_HOME=/usr/spark R CMD build pkg
# run cran check without SPARK_HOME 
R CMD check --as-cran SparkR_2.1.0.tar.gz
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/felixcheung/spark rcranchecktest

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/16720.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #16720


commit 318ecc876c64b7085df46618278861c2ea9ff422
Author: Felix Cheung 
Date:   2017-01-27T19:03:15Z

make sure Spark is installed

commit f51f504acb3a64da27bf0bddbb156c68d62d89bb
Author: Felix Cheung 
Date:   2017-01-28T06:55:54Z

simplify

commit cd1394a8aab39b20c9c747b5ddfe5a003081510a
Author: Felix Cheung 
Date:   2017-02-13T00:11:18Z

change install.spark







[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-07 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r99975077
  
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -27,6 +27,9 @@ library(SparkR)
 
 We use default settings in which it runs in local mode. It auto downloads 
Spark package in the background if no previous installation is found. For more 
details about setup, see [Spark Session](#SetupSparkSession).
 
+```{r, include=FALSE}
+SparkR:::sparkCheckInstall()
--- End diff --

FWIW, these vignette changes are still needed even if we update run-all.R.





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-07 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r99975068
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

I just tested this by putting `SparkR:::sparkCheckInstall` in run-all.R (before 
calling `test_package`) and that seems to do the trick on a custom 2.1.0 build!

@felixcheung, when you get a chance, can you update the PR with that? The 
only thing I'm concerned about is calling a private function from 
`run-all.R` - we could either export this function or move some of this 
functionality into `install.spark`.
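A minimal sketch of that run-all.R change, assuming the helper keeps its current 
internal name and can be called with no arguments, as in the test-file diffs:
```
# R/pkg/tests/run-all.R - sketch; whether to keep calling the private
# helper via ::: or to export it is the open question above.
library(testthat)
library(SparkR)

# Ensure a matching Spark distribution is installed once, up front,
# before any test file tries to start a SparkSession.
SparkR:::sparkCheckInstall()

test_package("SparkR")
```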





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-04 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r99462457
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

sorry, I"m really swamped, haven't had the chance to test that out yet





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-02-03 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r99456349
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

Any luck testing this out?





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98838566
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

Ah, that's a great idea - can you see if that works (unfortunately it needs 
manual verification)?





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98837222
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

hmm, we could put it in 
https://github.com/apache/spark/blob/master/R/pkg/tests/run-all.R?






[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98837066
  
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -27,6 +27,9 @@ library(SparkR)
 
 We use default settings in which it runs in local mode. It auto downloads 
Spark package in the background if no previous installation is found. For more 
details about setup, see [Spark Session](#SetupSparkSession).
 
+```{r, include=FALSE}
+SparkR:::sparkCheckInstall()
--- End diff --

Is the Rmd file part of the install that users see? I just don't want 
to put in any code that people might copy-paste, etc. Is it not good enough to 
pass in `master=local[*]` here?
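A sketch of that alternative, for reference (the PR instead keeps the hidden 
`sparkCheckInstall()` chunk shown in the diff):
```
# Passing master explicitly would trigger a local Spark install as a
# side effect of session startup.
sparkR.session(master = "local[*]")
```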





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98836638
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

Sure - that sounds fine. I was looking to see if `testthat` had any support 
for writing a `setup` function that gets called before each test - it doesn't 
look like it has that.
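One partial workaround would be a shared `helper-*.R` file, which `testthat` 
sources before the test files in a directory - per-run setup rather than 
per-test. A hypothetical sketch, not something in this PR:
```
# R/pkg/inst/tests/testthat/helper-spark.R - hypothetical. testthat
# sources helper-*.R files once before the test files run, so this is
# per-run setup, not a per-test hook.
SparkR:::sparkCheckInstall()
```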





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98834936
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

I understand that, but as pointed out in 
https://github.com/apache/spark/pull/16720#issuecomment-275832013, some tests 
don't need a SparkSession, and some tests will create/stop one as needed - 
wouldn't having a function that does all of that just mean more complexity?
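For context, the existing per-file pattern being referred to looks roughly like 
this (an illustrative sketch of the common SparkR testthat idiom, not a quote of 
any one file):
```
# Files that need a session create one up front and stop it at the end;
# files that don't (e.g. pure utility tests) skip this entirely.
sparkSession <- sparkR.session(enableHiveSupport = FALSE)

test_that("count on a local data.frame", {
  df <- createDataFrame(mtcars)
  expect_equal(count(df), 32)
})

sparkR.session.stop()
```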






[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread felixcheung
Github user felixcheung commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98834935
  
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -27,6 +27,9 @@ library(SparkR)
 
 We use default settings in which it runs in local mode. It auto downloads 
Spark package in the background if no previous installation is found. For more 
details about setup, see [Spark Session](#SetupSparkSession).
 
+```{r, include=FALSE}
+SparkR:::sparkCheckInstall()
--- End diff --

This chunk has `include=FALSE`, so it will run but neither the code nor its 
output will appear in the vignette text.





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98833756
  
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -27,6 +27,9 @@ library(SparkR)
 
 We use default settings in which it runs in local mode. It auto downloads 
Spark package in the background if no previous installation is found. For more 
details about setup, see [Spark Session](#SetupSparkSession).
 
+```{r, include=FALSE}
+SparkR:::sparkCheckInstall()
--- End diff --

Is it OK to include a `:::` function in the vignette?





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-31 Thread shivaram
Github user shivaram commented on a diff in the pull request:

https://github.com/apache/spark/pull/16720#discussion_r98833957
  
--- Diff: R/pkg/inst/tests/testthat/test_utils.R ---
@@ -17,6 +17,9 @@
 
 context("functions in utils.R")
 
+# Ensure Spark is installed
+sparkCheckInstall()
--- End diff --

What I had in mind was to combine `sparkR.session` and this 
`sparkCheckInstall` into one function, so it's easy to remember for a new test 
file. Any thoughts on this?
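A hypothetical sketch of such a combined helper (name and signature are 
illustrative only - nothing like this exists in the PR):
```
# Hypothetical: one call per test file that both ensures Spark is
# installed and starts a session.
sparkRTestSession <- function(...) {
  SparkR:::sparkCheckInstall()
  sparkR.session(master = "local", ...)
}
```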





[GitHub] spark pull request #16720: [SPARK-19387][SPARKR] Tests do not run with Spark...

2017-01-27 Thread felixcheung
GitHub user felixcheung opened a pull request:

https://github.com/apache/spark/pull/16720

[SPARK-19387][SPARKR] Tests do not run with SparkR source package in CRAN 
check

## What changes were proposed in this pull request?

- this is caused by the changes in SPARK-18444 and SPARK-18643, after which we no longer 
install Spark when `master = ""` (the default); it is also related to SPARK-18449, 
since the real `master` value is not known at the time the R code in 
`sparkR.session` runs (`master` cannot default to "local" because it could be 
overridden by the spark-submit command line or Spark config)
- as a result, while running SparkR as a package in an IDE works fine, the 
CRAN check does not, since it launches SparkR via a non-interactive script
- the fix is to add a check to the beginning of each test file and to the vignettes

## How was this patch tested?

Tested this by reverting the version to 2.1, since the check needs to download a 
release jar with a matching version. But because there are changes in 2.2 
(specifically around SparkR ML) that are incompatible with 2.1, some tests fail 
in this configuration. This will need to be ported to branch-2.1 and retested 
with the 2.1 release jar.

Tested manually as:
```
# modify DESCRIPTION to revert version to 2.1.0
SPARK_HOME=/usr/spark R CMD build pkg
# run cran check without SPARK_HOME
R CMD check --as-cran SparkR_2.1.0.tar.gz
```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/felixcheung/spark rcranchecktest

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/16720.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #16720


commit 318ecc876c64b7085df46618278861c2ea9ff422
Author: Felix Cheung 
Date:   2017-01-27T19:03:15Z

make sure Spark is installed



