Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14433#discussion_r77205301
--- Diff: core/src/main/scala/org/apache/spark/internal/Logging.scala ---
@@ -135,7 +136,12 @@ private[spark] trait Logging {
val replLevel
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14783
@sun-rui Any other comments?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14856
Thanks @keypointt for the PR and @junyangq @felixcheung for reviewing.
Merging this into master
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14783
Jenkins, retest this please
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14903
Merging this into master and branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14903
Jenkins, retest this please
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14783
Sorry I think this was a break that I just fixed in #14904
Jenkins, retest this please.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14904
Thanks - merging this to master, branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14904
cc @junyangq @felixcheung
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14903
The fix is in #14904
GitHub user shivaram opened a pull request:
https://github.com/apache/spark/pull/14904
[SPARKR][SPARK-16581] Fix JVM API tests in SparkR
## What changes were proposed in this pull request?
Remove cleanup.jobj test. Use JVM wrapper API for other test cases.
## How
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14903
@junyangq @felixcheung The test error is due to an unrelated issue caused
when we upgraded testthat on Jenkins. I'm sending a fix for it now
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14784#discussion_r77073459
--- Diff: R/pkg/R/sparkR.R ---
@@ -365,6 +365,10 @@ sparkR.session <- function(
}
overrideEnvs(sparkConfigMap, param
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14889
Yeah let's open a separate JIRA to update testthat on the Jenkins boxes.
LGTM. Merging this to master and branch-2.0
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14784#discussion_r77031075
--- Diff: R/pkg/R/sparkR.R ---
@@ -365,6 +365,10 @@ sparkR.session <- function(
}
overrideEnvs(sparkConfigMap, param
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14859
Thanks @HyukjinKwon - Some comments:
- I think we can filter commits by `[SPARKR]` in the PR title. Changes to
core/ or sql/ could of course break the SparkR tests but it'll give us some
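The `[SPARKR]`-title filter mentioned in the comment above could be sketched with `grep` (the sample commit titles below are hypothetical stand-ins for `git log --oneline` output on the Spark repo):

```shell
# Hypothetical commit titles standing in for `git log --oneline` output
log='abc1234 [SPARKR][SPARK-16581] Fix JVM API tests in SparkR
def5678 [SQL] Unrelated optimizer change
9abcdef [SPARKR][MINOR] Add Xiangrui and Felix to maintainers'

# Keep only SparkR-tagged commits; -F treats the brackets literally
printf '%s\n' "$log" | grep -F '[SPARKR]'
```

As the comment notes, a change under core/ or sql/ can still break SparkR tests without carrying the tag, so this filter is only a heuristic.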
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14889
I just ran this on one of the Jenkins machines
```
> packageVersion("testthat")
[1] ‘0.11.0’
```
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14890
LGTM. Thanks @HyukjinKwon - Merging this to master
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14889
Thanks @HyukjinKwon - This is a great catch. LGTM pending tests.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14744
Jenkins, retest this please
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14859
Good point on only running this on the master branch. We could even run it
periodically (say nightly) instead of on every commit.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14775
Thanks @felixcheung - Addressed both the comments. Let me know if this
looks good.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76654300
--- Diff: R/pkg/R/backend.R ---
@@ -25,9 +25,23 @@ isInstanceOf <- function(jobj, className) {
callJMethod(cls, "isInstanc
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76654243
--- Diff: R/pkg/R/backend.R ---
@@ -37,12 +51,42 @@ callJMethod <- function(objId, methodName, ...) {
invokeJava(isStatic = FALSE, objId
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76645788
--- Diff: R/pkg/DESCRIPTION ---
@@ -11,7 +11,7 @@ Authors@R: c(person("Shivaram", "Venkataraman", role =
c("aut", "cr
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14859
Thanks @HyukjinKwon - I for one would at least like the SparkR client to
work on Windows as I think there are quite a few R users who are on Windows.
@HyukjinKwon Could you add some
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76526127
--- Diff: R/pkg/DESCRIPTION ---
@@ -11,7 +11,7 @@ Authors@R: c(person("Shivaram", "Venkataraman", role =
c("aut", "cr
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14775
@felixcheung Good point about having a wrapper -- That will make it easier
to update the methods going forward. I added a new file `jvm.R` with the
wrapper functions.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76523010
--- Diff: R/pkg/R/jobj.R ---
@@ -82,7 +82,20 @@ getClassName.jobj <- function(x) {
callJMethod(cls, "getName")
}
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522998
--- Diff: R/pkg/inst/tests/testthat/test_jvm_api.R ---
@@ -0,0 +1,41 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522995
--- Diff: R/pkg/R/jobj.R ---
@@ -82,7 +82,20 @@ getClassName.jobj <- function(x) {
callJMethod(cls, "getName")
}
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522987
--- Diff: R/pkg/R/jobj.R ---
@@ -82,7 +82,20 @@ getClassName.jobj <- function(x) {
callJMethod(cls, "getName")
}
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522986
--- Diff: R/pkg/R/backend.R ---
@@ -37,12 +51,42 @@ callJMethod <- function(objId, methodName, ...) {
invokeJava(isStatic = FALSE, objId
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522977
--- Diff: R/pkg/R/backend.R ---
@@ -37,12 +51,42 @@ callJMethod <- function(objId, methodName, ...) {
invokeJava(isStatic = FALSE, objId
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522953
--- Diff: R/pkg/R/backend.R ---
@@ -37,12 +51,42 @@ callJMethod <- function(objId, methodName, ...) {
invokeJava(isStatic = FALSE, objId
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522957
--- Diff: R/pkg/R/backend.R ---
@@ -37,12 +51,42 @@ callJMethod <- function(objId, methodName, ...) {
invokeJava(isStatic = FALSE, objId
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522952
--- Diff: R/pkg/R/backend.R ---
@@ -25,9 +25,23 @@ isInstanceOf <- function(jobj, className) {
callJMethod(cls, "isInstanc
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522940
--- Diff: R/pkg/R/backend.R ---
@@ -25,9 +25,23 @@ isInstanceOf <- function(jobj, className) {
callJMethod(cls, "isInstanc
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76522943
--- Diff: R/pkg/R/backend.R ---
@@ -25,9 +25,23 @@ isInstanceOf <- function(jobj, className) {
callJMethod(cls, "isInstanc
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14775#discussion_r76444282
--- Diff: R/pkg/NAMESPACE ---
@@ -363,4 +363,9 @@ S3method(structField, jobj)
S3method(structType, jobj)
S3method(structType, structField
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/13584
Yeah I was going to say that we need to handle cases where `labels_output`
is also used. We could just add a numeric suffix, maybe?
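The numeric-suffix idea could look like this minimal sketch (the function and argument names here are made up for illustration, not SparkR's actual code):

```python
def unique_name(base, existing):
    """Return `base`, or `base_1`, `base_2`, ... until the name is unused."""
    if base not in existing:
        return base
    i = 1
    while f"{base}_{i}" in existing:
        i += 1
    return f"{base}_{i}"

# e.g. if "labels_output" is already taken, fall back to "labels_output_1"
```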
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14774
LGTM. Thanks @wangmiao1981
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/13584
Hmm - the problem still seems to be relevant. @mengxr @junyangq Would one
of you be able to look at this?
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/13584
@keypointt Is this PR still relevant?
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14046
@sun-rui Is this PR still active? Can we close it if it's not relevant?
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14744
LGTM. I agree that using SparkConf is preferable over environment variables
-- but it would be good to make the documentation clear on when each option is
used.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14744#discussion_r76083596
--- Diff: docs/configuration.md ---
@@ -1752,6 +1752,13 @@ showDF(properties, numRows = 200, truncate = FALSE)
Executable for executing R scripts
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14783
Jenkins, ok to test
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14783
Thanks @clarkfitzg -- I'll take a look at this tomorrow
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14775
cc @felixcheung @olarayej
GitHub user shivaram opened a pull request:
https://github.com/apache/spark/pull/14775
[SPARK-16581][SPARKR] Make JVM backend calling functions public
## What changes were proposed in this pull request?
This change exposes a public API in SparkR to create objects, call
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14741
The value of `what` shouldn't change if we are retrying the read. If it was
`integer` it should remain `integer`. There is a corner case of what happens if
we, say, read the first byte
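The invariant described above — the requested type (R's `what`) is fixed once, and retries only change how many bytes remain — can be sketched in Python (a hypothetical helper, not SparkR's actual backend code):

```python
def read_fully(recv, n):
    """Read exactly n bytes, retrying on short reads.

    `recv(k)` returns up to k bytes (like socket.recv). The caller's
    interpretation of the result, e.g. decoding it as an integer,
    is decided once up front and never changes between retries.
    """
    buf = b""
    while len(buf) < n:
        chunk = recv(n - len(buf))
        if not chunk:
            raise EOFError("connection closed before %d bytes arrived" % n)
        buf += chunk
    return buf
```

The corner case mentioned in the comment — reading only the first byte of, say, a 4-byte integer — is handled by looping again for the remaining 3 bytes rather than re-deciding the type.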
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14447
@keypointt The error is happening because of the `href` line in the
`spark.mlp` documentation. If you move the opening brace to the previous line
the error goes away. i.e. the code should look
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14759
Merging this to master, branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14758
Hmm, that's a good catch. Unfortunately it looks like CRAN only wants one
person / one email address to be in the Maintainer field. This is also
documented at http://r-pkgs.had.co.nz
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14759#discussion_r75781286
--- Diff: R/run-tests.sh ---
@@ -26,14 +26,35 @@ rm -f $LOGFILE
SPARK_TESTING=1 $FWDIR/../bin/spark-submit --driver-java-options
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14734
It was mostly manual -- I ran `dev/merge_spark_pr` and it prompted me to
say the cherry-pick has conflicts. I then opened another terminal and found
that the conflicts were in 3 files. I manually
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14734
I just did the merge in
https://github.com/apache/spark/commit/b65b041af8b64413c7d460d4ea110b2044d6f36e
-- Will be great if you can run CRAN checks using this and make sure I didn't
miss anything
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14759#discussion_r75776464
--- Diff: R/run-tests.sh ---
@@ -26,14 +26,35 @@ rm -f $LOGFILE
SPARK_TESTING=1 $FWDIR/../bin/spark-submit --driver-java-options
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14734
Hmm - let me give it a shot. If not I will open a fresh PR for `branch-2.0`
after doing a manual edit per file
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14759
So running `R/check-cran.sh` out of the box on master branch will fail.
However if you run `NO_TESTS=1 R/check-cran.sh` it should pass. Basically we
can't use check-cran.sh to run unit tests
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75772452
--- Diff: R/pkg/NAMESPACE ---
@@ -1,5 +1,9 @@
# Imports from base R
-importFrom(methods, setGeneric, setMethod, setOldClass)
+# Do not include
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14759
The new check I added doesn't run unit tests using `R CMD check` -- it only
checks the documentation etc. and thus works on the master branch as well.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14759
@junyangq I guess we could - but it should be pretty simple to figure out
what the problem is given the entire log file? FWIW the error we have right
now will be fixed when we merge #14734
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14759#discussion_r75766425
--- Diff: R/run-tests.sh ---
@@ -26,14 +26,30 @@ rm -f $LOGFILE
SPARK_TESTING=1 $FWDIR/../bin/spark-submit --driver-java-options
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14759#discussion_r75758514
--- Diff: R/run-tests.sh ---
@@ -26,14 +26,30 @@ rm -f $LOGFILE
SPARK_TESTING=1 $FWDIR/../bin/spark-submit --driver-java-options
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14759#discussion_r75758122
--- Diff: R/run-tests.sh ---
@@ -26,14 +26,30 @@ rm -f $LOGFILE
SPARK_TESTING=1 $FWDIR/../bin/spark-submit --driver-java-options
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14758
Merging to master, branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14759
cc @felixcheung @junyangq
GitHub user shivaram opened a pull request:
https://github.com/apache/spark/pull/14759
[SPARK-16577][SPARKR] Add CRAN documentation checks to run-tests.sh
## What changes were proposed in this pull request?
(Please fill in changes proposed in this fix)
## How
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14735
Leaving it out of branch-2.0 sounds good to me.
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14758
cc @mengxr @felixcheung
FYI - This is mostly to ensure that we can have more maintainers who can
update the CRAN submissions. This shouldn't affect anything else on the
development side
GitHub user shivaram opened a pull request:
https://github.com/apache/spark/pull/14758
[SPARKR][MINOR] Add Xiangrui and Felix to maintainers
## What changes were proposed in this pull request?
This change adds Xiangrui Meng and Felix Cheung to the maintainers field
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14735
@felixcheung I didn't look at the code very closely, but will this change
be required in `branch-2.0` as well? If so the merge might be hard to
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75720021
--- Diff: R/pkg/R/DataFrame.R ---
@@ -3058,7 +3057,7 @@ setMethod("str",
#' @note drop since 2.0.0
setMethod("drop",
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14734
LGTM. I had a couple of minor comments inline.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75718809
--- Diff: R/pkg/R/generics.R ---
@@ -1339,7 +1339,6 @@ setGeneric("spark.naiveBayes", function(data,
formula, ...) { standardGeneric("s
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75718694
--- Diff: R/pkg/R/generics.R ---
@@ -1339,7 +1339,6 @@ setGeneric("spark.naiveBayes", function(data,
formula, ...) { standardGeneric("s
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75718243
--- Diff: R/pkg/R/DataFrame.R ---
@@ -3058,7 +3057,7 @@ setMethod("str",
#' @note drop since 2.0.0
setMethod("drop",
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75717898
--- Diff: R/pkg/NAMESPACE ---
@@ -1,5 +1,9 @@
# Imports from base R
-importFrom(methods, setGeneric, setMethod, setOldClass)
+# Do not include
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14734
@junyangq Could you take one more look ? I will also do a pass now
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14743
BTW LGTM. Merging this PR into master, branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14743
Thanks @HyukjinKwon -- this is a bit surprising, as it was only recently
that you fixed the Windows tests in
https://github.com/apache/spark/commit/1c403733b89258e57daf7b8b0a2011981ad7ed8a
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14743
@HyukjinKwon Can you test this on Windows?
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14705
I cherry-picked this PR into `branch-2.0` in
https://github.com/apache/spark/commit/0297896119e11f23da4b14f62f50ec72b5fac57f
-- The merge was a little awkward, but I think I got it to work
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14705
Yeah so we can do a couple of things. One is we try to cherry-pick this PR
to branch-2.0 and then fix all the merge conflicts that are thrown. I think
that should handle cases where the method
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14735
This seems like a big enough change that it might be good to have a JIRA for
this?
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14489
The warning I got in `branch-2.0` before this PR was
```
Duplicated \argument entries in documentation object 'coltypes':
‘x’
```
With the backport this warning goes away
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14705
Yeah I think we will be more careful about adding new algorithms to
override existing R methods -- but given that `glm` is already exposed I'd
think we can make an exception for just this one
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14705
Thanks @junyangq -- I just ran the check on `branch-2.0` with this PR and
in addition to `glm` there were two warnings for `...` in `first` and
`unpersist` that we can fix in this PR
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14489
@felixcheung FYI I'm also backporting this to branch-2.0 as it gets rid of
one of the CRAN warnings.
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14639#discussion_r75344423
--- Diff: R/pkg/R/sparkR.R ---
@@ -344,6 +344,7 @@ sparkRHive.init <- function(jsc = NULL) {
#' @note sparkR.session since 2.0.0
sparkR.sess
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14639
@zjffdu Thanks for clarifying -- I now remember that in the YARN cluster
mode there is no `SPARK_HOME` set. However in this case the JVM comes up first
and the R process then connects to it. So
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14626
LGTM. Thanks @felixcheung - Merging this to master and branch-2.0
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14641
Thanks - Merging this to master
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14626#discussion_r74866185
--- Diff: R/pkg/R/generics.R ---
@@ -152,9 +146,9 @@ setGeneric("getNumPartitions", function(x) {
standardGeneric("getNumPartitions&q
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14182
This looks fine to me - @felixcheung feel free to merge this when you think
it's good to go
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14522
Yeah LGTM. Merging this to master, branch-2.0 -- Thanks @junyangq
Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/14626
I had one minor question about partitionBy -- otherwise change LGTM. Thanks
@felixcheung
Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/14626#discussion_r74803446
--- Diff: R/pkg/R/generics.R ---
@@ -152,9 +146,9 @@ setGeneric("getNumPartitions", function(x) {
standardGeneric("getNumPartitions&q