[jira] [Assigned] (SPARK-18855) Add RDD flatten function

2016-12-13 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-18855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18855:


Assignee: (was: Apache Spark)

> Add RDD flatten function
> 
>
> Key: SPARK-18855
> URL: https://issues.apache.org/jira/browse/SPARK-18855
> Project: Spark
>  Issue Type: New Feature
>  Components: Spark Core
>Reporter: Linbo
>  Labels: flatten, rdd
>
> A new RDD flatten function would be similar to the flatten function of Scala
> collections:
> {code:title=spark-shell|borderStyle=solid}
> scala> val rdd = sc.makeRDD(List(List(1, 2, 3), List(4, 5), List(6)))
> rdd: org.apache.spark.rdd.RDD[List[Int]] = ParallelCollectionRDD[0] at
> makeRDD at <console>:24
> scala> rdd.flatten.collect
> res0: Array[Int] = Array(1, 2, 3, 4, 5, 6)
> {code}
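For reference, the proposed behavior can already be approximated with flatMap, since flatten on a collection of collections is conventionally defined as flatMap(identity). A minimal sketch using plain Scala collections (the RDD version would be analogous, e.g. rdd.flatMap(identity), assuming an extension method is added for RDD[TraversableOnce[T]]):

```scala
object FlattenSketch {
  def main(args: Array[String]): Unit = {
    val nested = List(List(1, 2, 3), List(4, 5), List(6))

    // flatten expands each inner collection into the result in order;
    // it is equivalent to flatMap(identity).
    val viaFlatten = nested.flatten
    val viaFlatMap = nested.flatMap(identity)

    assert(viaFlatten == viaFlatMap)
    println(viaFlatten) // List(1, 2, 3, 4, 5, 6)
  }
}
```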



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-18855) Add RDD flatten function

2016-12-13 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-18855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18855:


Assignee: Apache Spark

> Add RDD flatten function
> 
>
> Key: SPARK-18855
> URL: https://issues.apache.org/jira/browse/SPARK-18855
> Project: Spark
>  Issue Type: New Feature
>  Components: Spark Core
>Reporter: Linbo
>Assignee: Apache Spark
>  Labels: flatten, rdd
>
> A new RDD flatten function would be similar to the flatten function of Scala
> collections:
> {code:title=spark-shell|borderStyle=solid}
> scala> val rdd = sc.makeRDD(List(List(1, 2, 3), List(4, 5), List(6)))
> rdd: org.apache.spark.rdd.RDD[List[Int]] = ParallelCollectionRDD[0] at
> makeRDD at <console>:24
> scala> rdd.flatten.collect
> res0: Array[Int] = Array(1, 2, 3, 4, 5, 6)
> {code}


