[jira] [Assigned] (SPARK-18335) Add a numSlices parameter to SparkR's createDataFrame

2017-01-08 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-18335?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18335:


Assignee: (was: Apache Spark)

> Add a numSlices parameter to SparkR's createDataFrame
> -----------------------------------------------------
>
> Key: SPARK-18335
> URL: https://issues.apache.org/jira/browse/SPARK-18335
> Project: Spark
>  Issue Type: Improvement
>  Components: SparkR
>Reporter: Shixiong Zhu
>
> SparkR's `createDataFrame` doesn't have a `numSlices` parameter, so the user 
> cannot set the number of partitions when converting a large R data.frame to a 
> SparkR DataFrame. A workaround is calling `repartition`, but that requires a 
> shuffle stage. It would be better to support a `numSlices` parameter in the 
> `createDataFrame` method directly.
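
A sketch of the workaround described above versus the proposed API. The `numSlices` argument to `createDataFrame` is assumed from this issue's request, not from a released SparkR API; the partition count 16 is an arbitrary example:

```r
library(SparkR)
sparkR.session()

large_r_df <- data.frame(x = seq_len(1e6), y = runif(1e6))

# Current workaround: create the DataFrame, then repartition.
# repartition() triggers a full shuffle stage over the data.
df <- createDataFrame(large_r_df)
df <- repartition(df, 16)

# Proposed API (hypothetical signature, per this issue): set the
# partition count at creation time and avoid the extra shuffle.
# df <- createDataFrame(large_r_df, numSlices = 16)
```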



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-18335) Add a numSlices parameter to SparkR's createDataFrame

2017-01-08 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-18335?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18335:


Assignee: Apache Spark

> Add a numSlices parameter to SparkR's createDataFrame
> -----------------------------------------------------
>
> Key: SPARK-18335
> URL: https://issues.apache.org/jira/browse/SPARK-18335
> Project: Spark
>  Issue Type: Improvement
>  Components: SparkR
>Reporter: Shixiong Zhu
>Assignee: Apache Spark
>
> SparkR's `createDataFrame` doesn't have a `numSlices` parameter, so the user 
> cannot set the number of partitions when converting a large R data.frame to a 
> SparkR DataFrame. A workaround is calling `repartition`, but that requires a 
> shuffle stage. It would be better to support a `numSlices` parameter in the 
> `createDataFrame` method directly.


