[ https://issues.apache.org/jira/browse/SPARK-16613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-16613.
---------------------------------
    Resolution: Fixed
      Assignee: Sean Owen
 Fix Version/s: 2.1.0
                2.0.1

> RDD.pipe returns values for empty partitions
> --------------------------------------------
>
>                 Key: SPARK-16613
>                 URL: https://issues.apache.org/jira/browse/SPARK-16613
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Alex Krasnyansky
>            Assignee: Sean Owen
>             Fix For: 2.0.1, 2.1.0
>
>
> Suppose we have the following Spark code:
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
>
> object PipeExample {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(new SparkConf().setAppName("PipeExample").setMaster("local[*]"))
>     val fstRdd = sc.parallelize(List("hi", "hello", "how", "are", "you"))
>     val pipeRdd = fstRdd.pipe("/Users/finkel/spark-pipe-example/src/main/resources/len.sh")
>     pipeRdd.collect.foreach(println)
>     sc.stop()
>   }
> }
> {code}
> It uses a bash script to convert each string to its length:
> {code}
> #!/bin/sh
> read input
> len=${#input}
> echo $len
> {code}
> So far so good, but when I run the code, it prints incorrect output. For example:
> {code}
> 0
> 2
> 0
> 5
> 3
> 0
> 3
> 3
> {code}
> I expect to see
> {code}
> 2
> 5
> 3
> 3
> 3
> {code}
> which is the correct output for the app. I think this is a bug: only positive integers should appear, with no zeros. The zeros are produced for empty partitions, since the script is launched once per partition and an empty partition yields a length of 0.
> Environment:
> 1. Spark version is 1.6.2
> 2. Scala version is 2.11.6

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
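
For readers without a Spark setup handy, the per-partition behavior that produces the zeros can be simulated outside Spark. The sketch below (Python, purely illustrative; the helper names and the partitioning scheme are assumptions, not Spark internals) launches the reporter's script once per "partition", feeding that partition's elements on stdin, and shows that with 8 partitions and only 5 elements the 3 empty partitions each emit "0". Passing skip_empty=True mimics the result expected after the fix in 2.0.1/2.1.0 (no output for empty partitions); how Spark itself implements that is not shown here.

```python
import os
import subprocess
import tempfile

# The reporter's script, verbatim: reads one line, prints its length.
SCRIPT = "read input\nlen=${#input}\necho $len\n"

def simulate_pipe(data, num_partitions, skip_empty=False):
    """Illustrative stand-in for RDD.pipe: the script runs once per
    partition and receives that partition's elements on stdin.
    skip_empty=True mimics the fixed behavior (no process, hence no
    output, for an empty partition)."""
    # Strided partitioning; the exact scheme is an assumption. What
    # matters is that 8 partitions of 5 elements leave 3 partitions empty.
    partitions = [data[i::num_partitions] for i in range(num_partitions)]
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(SCRIPT)
        script_path = f.name
    try:
        out = []
        for part in partitions:
            if skip_empty and not part:
                continue  # fixed behavior: skip empty partitions entirely
            proc = subprocess.run(
                ["/bin/sh", script_path],
                input="".join(line + "\n" for line in part),
                capture_output=True,
                text=True,
            )
            # On empty stdin, `read` fails, $input stays empty, and the
            # script prints "0" -- the spurious values from the report.
            out.extend(proc.stdout.split())
        return out
    finally:
        os.unlink(script_path)

buggy = simulate_pipe(["hi", "hello", "how", "are", "you"], 8)
fixed = simulate_pipe(["hi", "hello", "how", "are", "you"], 8, skip_empty=True)
print(buggy)  # contains three "0" entries, one per empty partition
print(fixed)  # ["2", "5", "3", "3", "3"]
```

This reproduces the report's symptom without Spark: the external command is a per-partition contract, so any partition with no data still produces one line of output until the process is skipped for empty partitions.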