I'm not sure I understand what you are suggesting is wrong. It prints the
result of the last command; in the second case that is the whole pasted
block, so you see 19.
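
If the goal is to see every value when pasting, one workaround is to bind or
print each result yourself, since paste mode compiles the block as a single
unit and only the final expression gets a res binding. A rough sketch,
assuming a standard spark-shell session (sc predefined, README.md in the
working directory):

val textFile = sc.textFile("README.md")
val total = textFile.count()             // bound to a val, so its value is reported
val firstLine = textFile.first()         // likewise reported
val linesWithSpark = textFile.filter(line => line.contains("Spark"))
val sparkCount = linesWithSpark.count()
println(s"total=$total, first=$firstLine, sparkCount=$sparkCount")

Pasting that block reports total, firstLine, sparkCount, and the println
output, which matches what the line-by-line run shows.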
On Apr 16, 2015 11:37 AM, "vinodkc" <vinod.kc...@gmail.com> wrote:

> Hi All,
>
> I faced the issue below while working with Spark. It seems the spark-shell
> paste mode is not consistent.
>
> Example code
> ---------------
> val textFile = sc.textFile("README.md")
> textFile.count()
> textFile.first()
> val linesWithSpark = textFile.filter(line => line.contains("Spark"))
> textFile.filter(line => line.contains("Spark")).count()
>
> Step 1: Run the above code in spark-shell, line by line
> --------
>
> scala> val textFile = sc.textFile("README.md")
> textFile: org.apache.spark.rdd.RDD[String] = README.md MapPartitionsRDD[1]
> at textFile at <console>:21
>
> scala> textFile.count()
> res0: Long = 98
>
> scala> textFile.first()
> res1: String = # Apache Spark
>
> scala> val linesWithSpark = textFile.filter(line => line.contains("Spark"))
> linesWithSpark: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[2] at
> filter at <console>:23
>
> scala> textFile.filter(line => line.contains("Spark")).count()
> res2: Long = 19
>
> Result 1: The following actions are all evaluated and their results shown:
> textFile.count(), textFile.first(), textFile.filter(line =>
> line.contains("Spark")).count()
> res0: Long = 98, res1: String = # Apache Spark, res2: Long = 19
>
> Step 2: Run the above code in spark-shell paste mode
> scala> :p
> // Entering paste mode (ctrl-D to finish)
>
> val textFile = sc.textFile("README.md")
> textFile.count()
> textFile.first()
> val linesWithSpark = textFile.filter(line => line.contains("Spark"))
> textFile.filter(line => line.contains("Spark")).count()
>
> // Exiting paste mode, now interpreting.
>
> textFile: org.apache.spark.rdd.RDD[String] = README.md MapPartitionsRDD[1]
> at textFile at <console>:21
> linesWithSpark: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[2] at
> filter at <console>:24
> res0: Long = 19
>
> scala>
>
> Result 2: Only one action's result is reported
> textFile.filter(line => line.contains("Spark")).count()
> res0: Long = 19
>
> Expected result: Result 1 and Result 2 should be the same.
>
> I feel this is an issue with spark-shell. I fixed and verified it
> locally. If the community also thinks it needs to be handled, I can make a
> PR.
>
> Thanks
> Vinod KC
>
>
>