Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/10480
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212160685
Merging this to master
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212160561
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212160558
Merged build finished. Test PASSed.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212160352
**[Test build #56265 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/56265/consoleFull)** for PR 10480 at commit
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212127807
Jenkins, retest this please
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212128129
**[Test build #56265 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/56265/consoleFull)** for PR 10480 at commit
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212127912
Let's give it one more shot, I guess.
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212120117
hmm
```
[error] (docker-integration-tests/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] (streaming/test:test)
```
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212103788
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212103783
Merged build finished. Test FAILed.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212103489
**[Test build #56244 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/56244/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212048976
**[Test build #56244 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/56244/consoleFull)** for PR 10480 at commit
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-212026558
@felixcheung Could you bring this PR up to date? I think the code changes
look fine to me, and we can merge after this goes through Jenkins.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r60273460
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-210758860
@shivaram please check on this
[question](https://github.com/apache/spark/pull/10480#discussion_r50348037)
when you have a chance?
Users are running into
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r59658092
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user frreiss commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r59598252
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-195425756
Sorry for the delay @felixcheung -- I'll get back on this today
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-195205927
@shivaram please check on this
[question](https://github.com/apache/spark/pull/10480#discussion_r50348037)
when you have a chance?
Users are running into
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-176510321
@shivaram any suggestion on how to proceed?
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50353509
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50348037
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user sun-rui commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173134843
LGTM
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173041946
Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173041949
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50198797
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173043756
jenkins, retest this please
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173052052
**[Test build #49739 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49739/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173073487
**[Test build #49739 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49739/consoleFull)** for PR 10480 at commit
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50215657
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2401,4 +2401,41 @@ setMethod("str",
cat(paste0("\nDisplaying first ", ncol(localDF), " columns
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50216049
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r50218806
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2401,4 +2401,41 @@ setMethod("str",
cat(paste0("\nDisplaying first ", ncol(localDF), " columns
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173073601
Merged build finished. Test PASSed.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-173073602
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172279111
@shivaram this is ready, thanks!
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172148495
**[Test build #49514 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49514/consoleFull)** for PR 10480 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172148499
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172148498
Merged build finished. Test FAILed.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172161857
**[Test build #49523 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49523/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172148329
**[Test build #49514 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49514/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172169926
**[Test build #49523 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49523/consoleFull)** for PR 10480 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-172169964
Merged build finished. Test PASSed.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170436168
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170436166
Merged build finished. Test PASSed.
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170436106
**[Test build #49087 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49087/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170416396
**[Test build #49082 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49082/consoleFull)** for PR 10480 at commit
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170424371
jenkins, retest this please
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r49286033
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r49286036
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170415749
rebased and updated. thanks
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r49286045
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170418270
```
spark-mllib: found 0 potential binary incompatibilities (filtered 10)
sbt.ResolveException: unresolved dependency:
```
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170418047
Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170418048
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170418031
**[Test build #49082 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49082/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170425634
**[Test build #49087 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/49087/consoleFull)** for PR 10480 at commit
Github user shivaram commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-170202498
@sun-rui Are there any more comments on this PR?
@felixcheung Could you bring this up to date with `master`?
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48517423
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48517433
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48517532
--- Diff: R/pkg/R/generics.R ---
@@ -537,6 +537,12 @@ setGeneric("write.df", function(df, path, ...) { standardGeneric("write.df") })
#' @export
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48517634
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48518375
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48518467
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48472149
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48472173
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user sun-rui commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167517583
To test JDBC, could we add a helper function on the Scala side that reuses
code in JDBCSuite to start an in-memory JDBC server?
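sun-rui's suggestion above could look roughly like the following sketch. It is hypothetical, not code from the PR: the object name `RJDBCTestHelper` and the table contents are invented for illustration, and it assumes the H2 driver is on the test classpath, as Spark's JDBCSuite already does.

```scala
import java.sql.DriverManager

// Hypothetical helper in the spirit of JDBCSuite: seed an in-memory H2
// database that SparkR tests can then read via read.jdbc.
object RJDBCTestHelper {
  // DB_CLOSE_DELAY=-1 keeps the in-memory database alive until the JVM
  // exits, so later connections (e.g. from the SparkR-driven driver JVM)
  // still see the seeded tables.
  val url = "jdbc:h2:mem:testdb0;user=testUser;password=testPass;DB_CLOSE_DELAY=-1"

  def setup(): Unit = {
    val conn = DriverManager.getConnection(url)
    try {
      val stmt = conn.createStatement()
      stmt.executeUpdate("CREATE TABLE IF NOT EXISTS people (name VARCHAR(32), age INT)")
      stmt.executeUpdate("INSERT INTO people VALUES ('felix', 30)")
    } finally {
      conn.close()
    }
  }
}
```

Because H2's in-memory mode is scoped to the JVM, this only works when the R tests talk to the same driver JVM that seeded the database, which is the case for SparkR's backend.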
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48467198
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48472220
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48515129
--- Diff: R/pkg/R/generics.R ---
@@ -537,6 +537,12 @@ setGeneric("write.df", function(df, path, ...) { standardGeneric("write.df") })
#' @export
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48515179
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48515677
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48515246
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48515382
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48516227
--- Diff: R/pkg/R/SQLContext.R ---
@@ -556,3 +556,61 @@ createExternalTable <- function(sqlContext, tableName, path = NULL, source = NULL,
sdf <-
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48466540
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2272,3 +2260,40 @@ setMethod("with",
newEnv <- assignNewEnv(data)
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48466513
--- Diff: core/src/main/scala/org/apache/spark/api/r/SerDe.scala ---
@@ -355,6 +355,13 @@ private[spark] object SerDe {
writeInt(dos,
Github user sun-rui commented on a diff in the pull request:
https://github.com/apache/spark/pull/10480#discussion_r48466526
--- Diff: R/pkg/R/generics.R ---
@@ -537,6 +537,12 @@ setGeneric("write.df", function(df, path, ...) { standardGeneric("write.df") })
#' @export
GitHub user felixcheung opened a pull request:
https://github.com/apache/spark/pull/10480
[SPARK-12224][SPARKR] R support for JDBC source
Add R API for `read.jdbc`, `write.jdbc`.
Tested this quite a bit manually with different combinations of parameters.
It's not clear if
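A rough usage sketch of the `read.jdbc` / `write.jdbc` API this PR adds. The connection URL, credentials, and table names are illustrative only, and the exact parameter names should be checked against the merged API; a running Spark with the JDBC driver on the driver classpath is assumed.

```r
# Sketch only: URL, credentials, and table names are made up for
# illustration; requires a live SparkR session with a JDBC driver.
df <- read.jdbc(sqlContext, "jdbc:postgresql://localhost:5432/testdb", "people",
                partitionColumn = "id", lowerBound = 0, upperBound = 1000,
                numPartitions = 4, user = "testUser", password = "testPass")

# Append the resulting DataFrame to another table via the same connection.
write.jdbc(df, "jdbc:postgresql://localhost:5432/testdb", "people_copy",
           mode = "append", user = "testUser", password = "testPass")
```

The `partitionColumn`/`lowerBound`/`upperBound`/`numPartitions` arguments mirror the Scala `DataFrameReader.jdbc` partitioning options, splitting the read into parallel range queries.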
Github user felixcheung commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167276041
@shivaram @sun-rui
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167276610
**[Test build #48335 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48335/consoleFull)** for PR 10480 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167283619
**[Test build #48335 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48335/consoleFull)** for PR 10480 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167283792
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/10480#issuecomment-167283791
Merged build finished. Test PASSed.