[ https://issues.apache.org/jira/browse/SPARK-12922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15262583#comment-15262583 ]
Narine Kokhlikyan commented on SPARK-12922:
-------------------------------------------

Hi [~sunrui],

I've pushed my changes. Here is the link: https://github.com/apache/spark/compare/master...NarineK:gapply

There are some things I can reuse from dapply; I've copied those in for now, but will remove them after merging with dapply.

I think we can use AppendColumnsWithObject, but it fails at the assertion on line 76 of sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/object.scala:

assert(child.output.length == 1)

Not quite sure why. Could you please verify the part that serializes and deserializes the rows?

Thank you,
Narine

> Implement gapply() on DataFrame in SparkR
> -----------------------------------------
>
>                 Key: SPARK-12922
>                 URL: https://issues.apache.org/jira/browse/SPARK-12922
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>    Affects Versions: 1.6.0
>            Reporter: Sun Rui
>
> gapply() applies an R function to groups of rows, grouped by one or more
> columns of a DataFrame, and returns a DataFrame. It is like
> GroupedDataset.flatMapGroups() in the Dataset API.
> Two API styles are supported:
> 1.
> {code}
> gd <- groupBy(df, col1, ...)
> gapply(gd, function(grouping_key, group) {}, schema)
> {code}
> 2.
> {code}
> gapply(df, grouping_columns, function(grouping_key, group) {}, schema)
> {code}
> R function input: the grouping key values and a local data.frame of the
> grouped data.
> R function output: a local data.frame.
> The schema specifies the Row format of the R function's output and must
> match it.
> Note that map-side combination (partial aggregation) is not supported; users
> can do map-side combination via dapply() (see the sketch below).
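To make the first API style concrete, here is a minimal usage sketch under the signature described above; the data set, column names, schema, and the key[[1]] access pattern are illustrative assumptions, not from the issue:

{code}
# Hypothetical example of API style 1: group mtcars by cylinder count and
# compute the mean mpg per group.
df <- createDataFrame(sqlContext, mtcars)
gd <- groupBy(df, df$cyl)

# The output schema must match the shape of the data.frame the function returns.
schema <- structType(structField("cyl", "double"),
                     structField("avg_mpg", "double"))

# The function receives the grouping key values and a local data.frame of the
# group's rows, and returns a local data.frame.
result <- gapply(gd, function(key, group) {
  data.frame(cyl = key[[1]], avg_mpg = mean(group$mpg))
}, schema)

head(collect(result))
{code}

Note that each group is materialized as a local data.frame in the R worker, so this style suits groups that fit in memory.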
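And since the description leaves partial aggregation to the user, here is a sketch of how map-side combination might look with dapply(), assuming dapply() takes a function from a local data.frame (one partition) to a local data.frame plus an output schema; again, all names are hypothetical:

{code}
# Hypothetical map-side combine: collapse each partition to per-cyl partial
# sums with dapply(), so the subsequent gapply() only merges small partials.
partialSchema <- structType(structField("cyl", "double"),
                            structField("mpg", "double"))
partials <- dapply(df, function(part) {
  aggregate(mpg ~ cyl, data = part, FUN = sum)  # one row per cyl per partition
}, partialSchema)

totals <- gapply(groupBy(partials, partials$cyl), function(key, group) {
  data.frame(cyl = key[[1]], total_mpg = sum(group$mpg))
}, structType(structField("cyl", "double"), structField("total_mpg", "double")))
{code}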