[jira] [Updated] (SPARK-2331) SparkContext.emptyRDD should return RDD[T] not EmptyRDD[T]
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-2331:
-------------------------------
    Issue Type: Sub-task  (was: Bug)
        Parent: SPARK-11806

> SparkContext.emptyRDD should return RDD[T] not EmptyRDD[T]
> ----------------------------------------------------------
>
>                 Key: SPARK-2331
>                 URL: https://issues.apache.org/jira/browse/SPARK-2331
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Ian Hummel
>            Priority: Minor
>
> The return type for SparkContext.emptyRDD is EmptyRDD[T].
> It should be RDD[T]. That means you have to add extra type annotations on
> code like the below (which creates a union of RDDs over some subset of paths
> in a folder)
> {code}
> val rdds = Seq("a", "b", "c").foldLeft[RDD[String]](sc.emptyRDD[String]) {
>   (rdd, path) ⇒
>     rdd.union(sc.textFile(path))
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
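The annotation in the reporter's snippet is needed because `foldLeft` infers its accumulator type from the initial value: with `emptyRDD` typed as `EmptyRDD[String]`, the fold is pinned to `EmptyRDD[String]`, while `union` returns the supertype `RDD[String]`. A minimal sketch of the failure and two workarounds, assuming a live `SparkContext` named `sc` (the helper names `unionOfPaths`/`unionOfPaths2` are illustrative, not from the ticket):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Does not compile: foldLeft infers the accumulator type EmptyRDD[String]
// from the seed, but rdd.union(...) returns the supertype RDD[String].
//
//   val rdds = Seq("a", "b", "c").foldLeft(sc.emptyRDD[String]) {
//     (rdd, path) => rdd.union(sc.textFile(path))
//   }

// Workaround 1: pin the accumulator type with an explicit type argument.
def unionOfPaths(sc: SparkContext, paths: Seq[String]): RDD[String] =
  paths.foldLeft[RDD[String]](sc.emptyRDD[String]) { (rdd, path) =>
    rdd.union(sc.textFile(path))
  }

// Workaround 2: upcast the seed once, so inference picks RDD[String].
def unionOfPaths2(sc: SparkContext, paths: Seq[String]): RDD[String] =
  paths.foldLeft(sc.emptyRDD[String]: RDD[String]) { (rdd, path) =>
    rdd.union(sc.textFile(path))
  }
```

If `emptyRDD` were declared to return `RDD[T]`, as the ticket requests, the plain `foldLeft` call would infer the right accumulator type with no annotation.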
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-2331:
-------------------------------
    Description:
The return type for SparkContext.emptyRDD is EmptyRDD[T].
It should be RDD[T]. That means you have to add extra type annotations on code like the below (which creates a union of RDDs over some subset of paths in a folder)
{code}
val rdds = Seq("a", "b", "c").foldLeft[RDD[String]](sc.emptyRDD[String]) {
  (rdd, path) ⇒
    rdd.union(sc.textFile(path))
}
{code}

  was:
The return type for SparkContext.emptyRDD is EmptyRDD[T].
It should be RDD[T]. That means you have to add extra type annotations on code like the below (which creates a union of RDDs over some subset of paths in a folder)
val rdds = Seq("a", "b", "c").foldLeft[RDD[String]](sc.emptyRDD[String]) {
  (rdd, path) ⇒
    rdd.union(sc.textFile(path))
}
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-2331:
-----------------------------
            Priority: Minor  (was: Major)
    Target Version/s: 2+
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2331:
-----------------------------------
    Component/s: Spark Core
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell updated SPARK-2331:
-----------------------------------
    Summary: SparkContext.emptyRDD should return RDD[T] not EmptyRDD[T]  (was: SparkContext.emptyRDD has wrong return type)