[ 
https://issues.apache.org/jira/browse/SPARK-16679?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15388942#comment-15388942
 ] 

Narine Kokhlikyan commented on SPARK-16679:
-------------------------------------------

The two R helper methods on the Scala side are:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala#L2087
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/RelationalGroupedDataset.scala#L407

A Python helper method is:
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala#L2533

Are there any specific Python methods you'd like to move to a helper 
class? [~rxin], [~shivaram].

Also, in some cases the R helper methods access private fields in Dataset and 
RelationalGroupedDataset; when we move those methods into a helper class, we will 
need a way to access those fields or find another solution.
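One option for the field-access concern: if the helper object lives in the same package as Dataset, it can still reach `private[sql]` members. A minimal sketch of that pattern, using hypothetical names (`PythonSQLHelper`, `mapPartitionsInRStub`, a toy `Dataset`), not the actual Spark code:

```scala
// Toy illustration of relocating a private[sql] method into a helper
// object. Note: private[sql] members compile to public methods in the
// generated Java bytecode, which is why Python/R backends can call them.
package org.example.sql {
  class Dataset(private[sql] val logicalPlan: String) {
    // Before: a private[sql] helper sitting on the public Dataset API.
    private[sql] def mapPartitionsInRStub(): String =
      s"mapPartitionsInR over $logicalPlan"
  }

  // After: the same helper relocated into a single helper object.
  // Because PythonSQLHelper is declared in the same package, it can
  // still access private[sql] members such as `logicalPlan`.
  object PythonSQLHelper {
    def mapPartitionsInRStub(ds: Dataset): String =
      s"mapPartitionsInR over ${ds.logicalPlan}"
  }
}
```

A helper object placed outside the package would need package-private accessors instead, which is the trade-off mentioned above.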

cc [~sunrui]

>  Move `private[sql]` methods in public APIs used for Python/R into a single 
> ‘helper class’
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16679
>                 URL: https://issues.apache.org/jira/browse/SPARK-16679
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR, SQL
>            Reporter: Narine Kokhlikyan
>            Priority: Minor
>
> Based on our discussions in:
> https://github.com/apache/spark/pull/12836#issuecomment-225403054
> We’d like to move/relocate the `private[sql]` methods in public APIs used for 
> Python/R into a single ‘helper class’, 
> since these methods are public in the generated Java code and are hard to refactor.
> For instance: the private[sql] def mapPartitionsInR(…) method in Dataset.scala



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
