[
https://issues.apache.org/jira/browse/SPARK-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15581268#comment-15581268
]
Sun Rui edited comment on SPARK-11524 at 10/17/16 5:31 AM:
---
for cluster mode,
[
https://issues.apache.org/jira/browse/SPARK-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15581268#comment-15581268
]
Sun Rui commented on SPARK-11524:
-
for cluster mode, the R script needs to be transferred to the slave node
Sun Rui created SPARK-17968:
---
Summary: Support using 3rd-party R packages on Mesos
Key: SPARK-17968
URL: https://issues.apache.org/jira/browse/SPARK-17968
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-11524?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15580960#comment-15580960
]
Sun Rui commented on SPARK-11524:
-
great, go ahead. Please look at the linked JIRA issues as references.
Sun Rui created SPARK-17966:
---
Summary: Support Spark packages with R code on Mesos
Key: SPARK-17966
URL: https://issues.apache.org/jira/browse/SPARK-17966
Project: Spark
Issue Type: Sub-task
Sun Rui created SPARK-17965:
---
Summary: Enable SparkR with Mesos cluster mode
Key: SPARK-17965
URL: https://issues.apache.org/jira/browse/SPARK-17965
Project: Spark
Issue Type: Sub-task
Sun Rui created SPARK-17964:
---
Summary: Enable SparkR with Mesos client mode
Key: SPARK-17964
URL: https://issues.apache.org/jira/browse/SPARK-17964
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-17904?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15571785#comment-15571785
]
Sun Rui commented on SPARK-17904:
-
this is a little bit tricky. You don't know the exact number of nodes
[
https://issues.apache.org/jira/browse/SPARK-17522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15500078#comment-15500078
]
Sun Rui commented on SPARK-17522:
-
yes. It can be tuned by a config option according to the workload.
[
https://issues.apache.org/jira/browse/SPARK-17522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-17522:
Description:
The MesosCoarseGrainedSchedulerBackend launches executors in a round-robin way
among accepted
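The round-robin placement the description refers to can be sketched roughly as follows (an illustrative Python sketch, not the actual MesosCoarseGrainedSchedulerBackend code; the offer shape and function name are assumptions for illustration):

```python
# Illustrative sketch: distribute executors across accepted Mesos offers
# round-robin, one executor per offer per pass, until the requested count
# is placed or no offer has capacity left for another executor.
def round_robin_place(offers, executors_wanted, cpus_per_executor):
    """offers: hypothetical dict of agent_id -> available cpus."""
    placement = {agent: 0 for agent in offers}
    remaining = dict(offers)
    placed = 0
    while placed < executors_wanted:
        progress = False
        for agent in offers:               # one pass over all accepted offers
            if placed == executors_wanted:
                break
            if remaining[agent] >= cpus_per_executor:
                placement[agent] += 1
                remaining[agent] -= cpus_per_executor
                placed += 1
                progress = True
        if not progress:                   # nothing can fit another executor
            break
    return placement
```

Compared with filling one offer completely before moving to the next, this per-pass rotation is what spreads executors more evenly across agents.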
Sun Rui created SPARK-17522:
---
Summary: [MESOS] More even distribution of executors on Mesos
cluster
Key: SPARK-17522
URL: https://issues.apache.org/jira/browse/SPARK-17522
Project: Spark
Issue
Sun Rui created SPARK-17519:
---
Summary: [MESOS] Enhance robustness when ExternalShuffleService is
broken
Key: SPARK-17519
URL: https://issues.apache.org/jira/browse/SPARK-17519
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15475593#comment-15475593
]
Sun Rui edited comment on SPARK-17428 at 9/9/16 1:52 AM:
-
I don't understand the
[
https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15475593#comment-15475593
]
Sun Rui edited comment on SPARK-17428 at 9/9/16 1:50 AM:
-
I don't understand the
[
https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15475593#comment-15475593
]
Sun Rui commented on SPARK-17428:
-
I don't understand the meaning of exact version control. I think a
[
https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15474162#comment-15474162
]
Sun Rui commented on SPARK-17428:
-
For your point 1, if we specify a normal temporary directory for
[
https://issues.apache.org/jira/browse/SPARK-17428?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15469806#comment-15469806
]
Sun Rui commented on SPARK-17428:
-
[~yanboliang] Allowing passing dependent R packages to executors is a
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15447791#comment-15447791
]
Sun Rui commented on SPARK-13525:
-
yes, if Spark fails to launch the R worker as a process, it should throw
[
https://issues.apache.org/jira/browse/SPARK-13573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15447710#comment-15447710
]
Sun Rui commented on SPARK-13573:
-
[~chipsenkbeil], we have made public the method for creating Java
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15444579#comment-15444579
]
Sun Rui commented on SPARK-13525:
-
Could you help to do some debugging by modifying R code?
The steps
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15440505#comment-15440505
]
Sun Rui commented on SPARK-13525:
-
What's your spark cluster deployment mode? yarn or standalone?
>
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15440476#comment-15440476
]
Sun Rui commented on SPARK-13525:
-
Another guess: could you check "localhost" works for local TCP
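The suggested check can be done with a small self-contained probe (a hedged sketch of what checking that "localhost" works for local TCP might look like; this is a debugging aid, not part of Spark):

```python
# Bind a listening socket on the loopback address, then connect to it via
# the name "localhost" to confirm that local name resolution and loopback
# TCP connectivity both work on this machine.
import socket

def localhost_tcp_ok(timeout=2.0):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))          # OS picks a free port
    server.listen(1)
    port = server.getsockname()[1]
    try:
        client = socket.create_connection(("localhost", port), timeout=timeout)
        client.close()
        return True
    except OSError:                         # resolution or connect failed
        return False
    finally:
        server.close()
```

If this returns False, "localhost" is likely misconfigured (e.g. missing from /etc/hosts), which would also break the JVM-to-R-worker socket handshake.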
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15438331#comment-15438331
]
Sun Rui commented on SPARK-13525:
-
sorry, due to missing log information at that point, it is hard to
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15438316#comment-15438316
]
Sun Rui commented on SPARK-13525:
-
I checked the code and realized that SocketTimeoutException means the
[
https://issues.apache.org/jira/browse/SPARK-16581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15425789#comment-15425789
]
Sun Rui commented on SPARK-16581:
-
sure. go ahead. But note that the discussion is continuing. There are
[
https://issues.apache.org/jira/browse/SPARK-17054?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15422423#comment-15422423
]
Sun Rui commented on SPARK-17054:
-
{code}
# do not download if it is run in the sparkR shell
if
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411319#comment-15411319
]
Sun Rui edited comment on SPARK-16944 at 8/8/16 5:29 AM:
-
I don't think it can be
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411320#comment-15411320
]
Sun Rui commented on SPARK-16944:
-
I don't think it can be improved without dynamic allocation. because
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411319#comment-15411319
]
Sun Rui commented on SPARK-16944:
-
I don't think it can be improved without dynamic allocation. because
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-16944:
Comment: was deleted
(was: I don't think it can be improved without dynamic allocation. because
without
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411313#comment-15411313
]
Sun Rui commented on SPARK-16944:
-
Not quite sure. I think the Mesos scheduler backend can get the
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411297#comment-15411297
]
Sun Rui commented on SPARK-16944:
-
[~jerryshao] [~mgummelt]
> [MESOS] Improve data locality when
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-16944:
Comment: was deleted
(was: [~jerryshao] [~mgummelt])
> [MESOS] Improve data locality when launching new
[
https://issues.apache.org/jira/browse/SPARK-16944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15411295#comment-15411295
]
Sun Rui commented on SPARK-16944:
-
[~jerryshao] [~mgummelt]
> [MESOS] Improve data locality when
Sun Rui created SPARK-16944:
---
Summary: [MESOS] Improve data locality when launching new
executors when dynamic allocation is enabled
Key: SPARK-16944
URL: https://issues.apache.org/jira/browse/SPARK-16944
[
https://issues.apache.org/jira/browse/SPARK-16522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15409514#comment-15409514
]
Sun Rui commented on SPARK-16522:
-
sure. The PR is under review
> [MESOS] Spark application throws
[
https://issues.apache.org/jira/browse/SPARK-11977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-11977:
Summary: Support accessing a DataFrame column using its name without
backticks if the name contains '.'
[
https://issues.apache.org/jira/browse/SPARK-16464?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui closed SPARK-16464.
---
Resolution: Duplicate
> withColumn() allows illegal creation of duplicate column names on DataFrame
>
[
https://issues.apache.org/jira/browse/SPARK-16464?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15385418#comment-15385418
]
Sun Rui commented on SPARK-16464:
-
This bug in SparkR has been fixed in Spark 2.0, but exists in 1.6.x.
[
https://issues.apache.org/jira/browse/SPARK-16581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15384187#comment-15384187
]
Sun Rui commented on SPARK-16581:
-
I am thinking we are going to make public the following methods:
[
https://issues.apache.org/jira/browse/SPARK-16581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15380307#comment-15380307
]
Sun Rui commented on SPARK-16581:
-
I can work on this if this is not so urgent:)
> Making JVM backend
[
https://issues.apache.org/jira/browse/SPARK-15799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15378919#comment-15378919
]
Sun Rui commented on SPARK-15799:
-
1. Making some methods for the JVM backend public will benefit 3rd
[
https://issues.apache.org/jira/browse/SPARK-16509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15376204#comment-15376204
]
Sun Rui commented on SPARK-16509:
-
I will send the PR
> Rename window.partitionBy and window.orderBy
>
[
https://issues.apache.org/jira/browse/SPARK-16522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15376169#comment-15376169
]
Sun Rui commented on SPARK-16522:
-
[~mgummelt] The commit is e50efd53f073890d789a8448f850cc219cca7708
>
[
https://issues.apache.org/jira/browse/SPARK-16522?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15374757#comment-15374757
]
Sun Rui commented on SPARK-16522:
-
I checked SPARK-12967, this issue is different. SPARK-12967 is for the
Sun Rui created SPARK-16522:
---
Summary: [MESOS] Spark application throws exception on exit
Key: SPARK-16522
URL: https://issues.apache.org/jira/browse/SPARK-16522
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-16509?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15374259#comment-15374259
]
Sun Rui commented on SPARK-16509:
-
yes, let's rename them to pass the R package check. Maybe windowOrderBy
Sun Rui created SPARK-16366:
---
Summary: Time comparison failures in SparkR unit tests
Key: SPARK-16366
URL: https://issues.apache.org/jira/browse/SPARK-16366
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-15799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15356829#comment-15356829
]
Sun Rui commented on SPARK-15799:
-
we can refer to spark_install in sparklyr package.
{code}
[
https://issues.apache.org/jira/browse/SPARK-16326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15356828#comment-15356828
]
Sun Rui commented on SPARK-16326:
-
[~rxin] [~shivaram] [~davies] [~felixcheung] [~mengxr]
> Evaluate
Sun Rui created SPARK-16326:
---
Summary: Evaluate sparklyr package from RStudio
Key: SPARK-16326
URL: https://issues.apache.org/jira/browse/SPARK-16326
Project: Spark
Issue Type: Brainstorming
Sun Rui created SPARK-16300:
---
Summary: Capture errors from R workers in daemon.R to avoid
deletion of R session temporary directory
Key: SPARK-16300
URL: https://issues.apache.org/jira/browse/SPARK-16300
Sun Rui created SPARK-16299:
---
Summary: Capture errors from R workers in daemon.R to avoid
deletion of R session temporary directory
Key: SPARK-16299
URL: https://issues.apache.org/jira/browse/SPARK-16299
[
https://issues.apache.org/jira/browse/SPARK-12172?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15343215#comment-15343215
]
Sun Rui commented on SPARK-12172:
-
Currently spark.lapply() internally depends on RDD; we have to change
[
https://issues.apache.org/jira/browse/SPARK-12173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15343212#comment-15343212
]
Sun Rui commented on SPARK-12173:
-
[~rxin] yes, R doesn't need compile-time type safety, but map/reduce
[
https://issues.apache.org/jira/browse/SPARK-16055?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-16055:
Description:
This is an issue reported in the Spark user mailing list. Refer to
Sun Rui created SPARK-16055:
---
Summary: sparkR.init() can not load sparkPackages when executing
an R file
Key: SPARK-16055
URL: https://issues.apache.org/jira/browse/SPARK-16055
Project: Spark
Sun Rui created SPARK-16012:
---
Summary: add gapplyCollect() for SparkDataFrame
Key: SPARK-16012
URL: https://issues.apache.org/jira/browse/SPARK-16012
Project: Spark
Issue Type: Sub-task
Sun Rui created SPARK-15908:
---
Summary: Add varargs-type dropDuplicates() function in SparkR
Key: SPARK-15908
URL: https://issues.apache.org/jira/browse/SPARK-15908
Project: Spark
Issue Type: New
[
https://issues.apache.org/jira/browse/SPARK-15857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15326153#comment-15326153
]
Sun Rui commented on SPARK-15857:
-
+1 for this feature
> Add Caller Context in Spark
>
[
https://issues.apache.org/jira/browse/SPARK-15799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15326146#comment-15326146
]
Sun Rui commented on SPARK-15799:
-
This request has been asked before. The question is that SparkR needs
[
https://issues.apache.org/jira/browse/SPARK-10043?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui resolved SPARK-10043.
-
Resolution: Fixed
Fix Version/s: 1.6.0
> Add window functions into SparkR
>
[
https://issues.apache.org/jira/browse/SPARK-15521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15299483#comment-15299483
]
Sun Rui edited comment on SPARK-15521 at 5/25/16 5:30 AM:
--
cc [~felixcheung],
[
https://issues.apache.org/jira/browse/SPARK-15521?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15299483#comment-15299483
]
Sun Rui commented on SPARK-15521:
-
cc [~felixcheung]
> Add high level APIs based on dapply and gapply
Sun Rui created SPARK-15521:
---
Summary: Add high level APIs based on dapply and gapply for easier
usage
Key: SPARK-15521
URL: https://issues.apache.org/jira/browse/SPARK-15521
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-10053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui closed SPARK-10053.
---
Resolution: Won't Fix
> SparkR isn't exporting lapply
> -
>
>
[
https://issues.apache.org/jira/browse/SPARK-10053?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15299475#comment-15299475
]
Sun Rui commented on SPARK-10053:
-
as SparkDataFrame now supports dapply(), similar to lapplyPartition,
[
https://issues.apache.org/jira/browse/SPARK-15196?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui closed SPARK-15196.
---
Resolution: Not A Problem
> Add a wrapper for dapply(repartition(col,...), ... )
>
[
https://issues.apache.org/jira/browse/SPARK-15196?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15299473#comment-15299473
]
Sun Rui commented on SPARK-15196:
-
As discussed in https://github.com/apache/spark/pull/12966, this has
[
https://issues.apache.org/jira/browse/SPARK-14751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15281516#comment-15281516
]
Sun Rui commented on SPARK-14751:
-
if the key type is integer or long, toString works.
[~shivaram] how
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15281105#comment-15281105
]
Sun Rui edited comment on SPARK-15159 at 5/12/16 2:22 AM:
--
maybe this can be
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15281105#comment-15281105
]
Sun Rui commented on SPARK-15159:
-
maybe this can be split into several subtasks.
> SparkSession R API
>
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15281104#comment-15281104
]
Sun Rui commented on SPARK-15159:
-
@Vijary Parmar, I changed the scope of this PR. You can refer to the
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-15159:
Summary: SparkSession R API (was: Remove usage of HiveContext in SparkR.)
> SparkSession R API
>
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15281100#comment-15281100
]
Sun Rui commented on SPARK-15159:
-
ok, let's do more to match pyspark:)
> Remove usage of HiveContext in
[
https://issues.apache.org/jira/browse/SPARK-14162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15277682#comment-15277682
]
Sun Rui commented on SPARK-14162:
-
We met the same error. The cause is that in one worker node, mysql
[
https://issues.apache.org/jira/browse/SPARK-15237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15277454#comment-15277454
]
Sun Rui commented on SPARK-15237:
-
SparkR supports two types of corr(), something like
[
https://issues.apache.org/jira/browse/SPARK-15202?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-15202:
Issue Type: Sub-task (was: New Feature)
Parent: SPARK-6817
> add dapplyCollect() method for
Sun Rui created SPARK-15202:
---
Summary: add dapplyCollect() method for DataFrame in SparkR
Key: SPARK-15202
URL: https://issues.apache.org/jira/browse/SPARK-15202
Project: Spark
Issue Type: New
Sun Rui created SPARK-15201:
---
Summary: Handle integer overflow correctly in hash code computation
Key: SPARK-15201
URL: https://issues.apache.org/jira/browse/SPARK-15201
Project: Spark
Issue Type:
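For context on the overflow concern named in the summary, the classic case is a Java-style string hash (h = 31*h + c), which is defined to wrap around at 32 bits; a hash implemented with plain arithmetic overflows instead of wrapping, so the truncation has to be explicit. A hypothetical illustration (not SparkR's actual implementation):

```python
# Java-style string hash with explicit 32-bit wrap-around. In languages
# without native 32-bit wrapping integer arithmetic, the masking below is
# what "handling integer overflow correctly" amounts to.
def hash32(s):
    h = 0
    for ch in s:
        h = (31 * h + ord(ch)) & 0xFFFFFFFF   # keep low 32 bits, wrapping
    # reinterpret as a signed 32-bit value, matching Java's int hashCode
    return h - 0x100000000 if h >= 0x80000000 else h
```

Without the mask, hashes of longer strings silently diverge from their 32-bit counterparts computed elsewhere (e.g. on the JVM side).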
[
https://issues.apache.org/jira/browse/SPARK-14365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15273664#comment-15273664
]
Sun Rui commented on SPARK-14365:
-
[~dselivanov] Could you verify if SPARK-15110 can solve your problem?
[
https://issues.apache.org/jira/browse/SPARK-14365?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui closed SPARK-14365.
---
Resolution: Duplicate
> Repartition by column
> -
>
> Key: SPARK-14365
>
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15273655#comment-15273655
]
Sun Rui commented on SPARK-15159:
-
[~felixcheung], I guess you are talking about SQLContext, not
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-15159:
Description: HiveContext is to be deprecated in 2.0. Replace them with
SparkSession.withHiveSupport in
[
https://issues.apache.org/jira/browse/SPARK-15159?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sun Rui updated SPARK-15159:
Summary: Remove usage of HiveContext in SparkR. (was: Remove usage of
HiveContext in SparkR unit test
Sun Rui created SPARK-15159:
---
Summary: Remove usage of HiveContext in SparkR unit test cases.
Key: SPARK-15159
URL: https://issues.apache.org/jira/browse/SPARK-15159
Project: Spark
Issue Type:
Sun Rui created SPARK-15091:
---
Summary: Fix warnings and a failure in SparkR test cases with
testthat version 1.0.1
Key: SPARK-15091
URL: https://issues.apache.org/jira/browse/SPARK-15091
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-14995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15266031#comment-15266031
]
Sun Rui commented on SPARK-14995:
-
[~felixcheung] I think no need to add "spark". Just something like
[
https://issues.apache.org/jira/browse/SPARK-14995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15266031#comment-15266031
]
Sun Rui edited comment on SPARK-14995 at 5/2/16 1:07 AM:
-
[~felixcheung] I think
[
https://issues.apache.org/jira/browse/SPARK-14995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15263352#comment-15263352
]
Sun Rui commented on SPARK-14995:
-
cc [~felixcheung], [~Narine], [~shivaram].
[~felixcheung] are you
Sun Rui created SPARK-14995:
---
Summary: Add "since" tag in Roxygen documentation for SparkR API
methods
Key: SPARK-14995
URL: https://issues.apache.org/jira/browse/SPARK-14995
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-12922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15261339#comment-15261339
]
Sun Rui commented on SPARK-12922:
-
[~Narine] does AppendColumns logical operator
[
https://issues.apache.org/jira/browse/SPARK-14751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15255511#comment-15255511
]
Sun Rui edited comment on SPARK-14751 at 4/24/16 6:28 AM:
--
One possible
[
https://issues.apache.org/jira/browse/SPARK-14751?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15255511#comment-15255511
]
Sun Rui commented on SPARK-14751:
-
One possible workaround is on scala side if the key type is not
[
https://issues.apache.org/jira/browse/SPARK-13178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15253518#comment-15253518
]
Sun Rui commented on SPARK-13178:
-
This is fixed as the SparkR unit tests can pass after removing the
[
https://issues.apache.org/jira/browse/SPARK-13525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15251888#comment-15251888
]
Sun Rui commented on SPARK-13525:
-
It is expected that read.df() has no such issue, because R worker
[
https://issues.apache.org/jira/browse/SPARK-14803?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15251862#comment-15251862
]
Sun Rui commented on SPARK-14803:
-
cc [~cloud_fan]
I will submit a PR for this, but not sure if it is the
Sun Rui created SPARK-14803:
---
Summary: A bug in EliminateSerialization rule in Catalyst
Optimizer
Key: SPARK-14803
URL: https://issues.apache.org/jira/browse/SPARK-14803
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-12922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15251196#comment-15251196
]
Sun Rui commented on SPARK-12922:
-
cool. If possible, could you make it a WIP PR so that I can take a
[
https://issues.apache.org/jira/browse/SPARK-14746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15251152#comment-15251152
]
Sun Rui commented on SPARK-14746:
-
fine.
> Support transformations in R source code for