HI All,
Please find below the fix info and the respective solution, for users who are
following the mail chain of this issue:
*reduceByKey: Non-working snippet*
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
val conf = new SparkConf()
val sc = new SparkContext(conf)
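For readers following along, a minimal, self-contained sketch of the same kind of
job, using the data and the sum-per-key logic from the problem statement further
down the thread; the object name, app name and master are illustrative placeholders,
not the author's actual settings:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._ // pair-RDD functions (needed on Spark 1.2.x)

object ReduceByKeySketch {
  def main(args: Array[String]): Unit = {
    // Placeholder configuration; when run inside spark-shell, the shell already
    // provides sc and no new context should be created.
    val conf = new SparkConf().setAppName("ReduceByKeySketch").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val pairs = sc.makeRDD(Seq((0, 1), (0, 2), (1, 20), (1, 30), (2, 40)))
    val summed = pairs.reduceByKey((x, y) => x + y) // sum of values per key
    summed.collect().foreach(println)               // expect (0,3), (1,50), (2,40)

    sc.stop()
  }
}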
HI All,
Currently using DSE 4.7 and Spark version 1.2.2.
Regards,
Satish
On Fri, Aug 21, 2015 at 7:30 PM, java8964 java8...@hotmail.com wrote:
What version of Spark are you using, or does it come with DSE 4.7?
We just cannot reproduce it in Spark.
yzhang@localhost$ more test.spark
val pairs = sc.makeRDD(Seq((0,1),(0,2),(1,20),(1,30),(2,40)))
pairs.reduceByKey((x,y) => x + y).collect
I believe spark-shell -i scriptFile is there; we also use it, at least in
Spark 1.3.1.
dse spark just wraps the spark-shell command; underneath, it is just invoking
spark-shell.
I don't know too much about the original problem, though.
Yong
HI All,
Any inputs on the actual problem statement?
Regards,
Satish
On Fri, Aug 21, 2015 at 5:57 PM, Jeff Zhang zjf...@gmail.com wrote:
Yong, Thanks for your reply.
I tried spark-shell -i script-file, and it works fine for me. Not sure what the
difference is with
dse spark --master local --jars
You had:
RDD.reduceByKey((x,y) => x+y)
RDD.take(3)
Maybe try:
val rdd2 = RDD.reduceByKey((x,y) => x+y)
rdd2.take(3)
-Abhishek-
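The point behind this suggestion: reduceByKey is a transformation, so it returns a
new RDD and leaves the original untouched, and nothing runs until an action such as
take or collect is called. A small sketch with the data from this thread (variable
names are illustrative):

val pairs = sc.makeRDD(Seq((0, 1), (0, 2), (1, 20), (1, 30), (2, 40)))
val summed = pairs.reduceByKey((x, y) => x + y) // lazy: builds a new RDD, computes nothing yet
summed.take(3)                                  // action: triggers the computation
// expected (ordering not guaranteed): Array((0,3), (1,50), (2,40))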
On Aug 20, 2015, at 3:05 AM, satish chandra j jsatishchan...@gmail.com wrote:
HI All,
I have data in an RDD as mentioned below:
RDD : Array[(Int, Int)] = Array((0,1), (0,2), (1,20), (1,30), (2,40))
HI Abhishek,
I have tried that as well, but rdd2 is empty.
Regards,
Satish
On Fri, Aug 21, 2015 at 6:47 PM, Abhishek R. Singh
abhis...@tetrationanalytics.com wrote:
You had:
RDD.reduceByKey((x,y) => x+y)
RDD.take(3)
Maybe try:
val rdd2 = RDD.reduceByKey((x,y) => x+y)
rdd2.take(3)
What version of Spark are you using, or does it come with DSE 4.7?
We just cannot reproduce it in Spark.
yzhang@localhost$ more test.spark
val pairs = sc.makeRDD(Seq((0,1),(0,2),(1,20),(1,30),(2,40)))
pairs.reduceByKey((x,y) => x + y).collect
yzhang@localhost$ ~/spark/bin/spark-shell --master local -i test.spark
Yes, DSE 4.7
Regards,
Satish Chandra
On Fri, Aug 21, 2015 at 3:06 PM, Robin East robin.e...@xense.co.uk wrote:
Not sure, never used dse - it’s part of DataStax Enterprise right?
On 21 Aug 2015, at 10:07, satish chandra j jsatishchan...@gmail.com
wrote:
HI Robin,
Yes, below mentioned
HI Robin,
Yes, it is DSE, but the issue is related to Spark only.
Regards,
Satish Chandra
On Fri, Aug 21, 2015 at 3:06 PM, Robin East robin.e...@xense.co.uk wrote:
Not sure, never used dse - it’s part of DataStax Enterprise right?
On 21 Aug 2015, at 10:07, satish chandra j jsatishchan...@gmail.com
Hi Satish,
I don't see where Spark supports -i, so I suspect it is provided by DSE. In
that case, it might be a bug in DSE.
On Fri, Aug 21, 2015 at 6:02 PM, satish chandra j jsatishchan...@gmail.com
wrote:
HI Robin,
Yes, it is DSE, but the issue is related to Spark only.
Regards,
Satish Chandra
HI Robin,
Yes, the below-mentioned piece of code works fine in the Spark shell, but when the
same is placed in a script file and executed with -i <file name>, it creates an
empty RDD.
scala> val pairs = sc.makeRDD(Seq((0,1),(0,2),(1,20),(1,30),(2,40)))
pairs: org.apache.spark.rdd.RDD[(Int, Int)] =
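One way to narrow down where the data disappears when the file is run with -i is to
count at each step and print explicitly. A diagnostic sketch, assuming the
shell-provided sc is used; the script name in the comment is hypothetical:

// Contents of a diagnostic script, run e.g. with: dse spark --master local -i check.scala
val pairs = sc.makeRDD(Seq((0, 1), (0, 2), (1, 20), (1, 30), (2, 40)))
println("pairs.count = " + pairs.count())   // expect 5

val summed = pairs.reduceByKey((x, y) => x + y)
println("summed.count = " + summed.count()) // expect 3
println(summed.collect().mkString(", "))    // expect (0,3), (1,50), (2,40) in some order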
HI All,
I have data in an RDD as mentioned below:
RDD : Array[(Int, Int)] = Array((0,1), (0,2), (1,20), (1,30), (2,40))
I am expecting output as Array((0,3), (1,50), (2,40)), i.e. just a sum of the
values for each key.
Code:
RDD.reduceByKey((x,y) => x+y)
RDD.take(3)
Result in console:
RDD:
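The expected output quoted above can also be sanity-checked without Spark at all;
a quick sketch with plain Scala collections:

// Plain-Scala check of the expected per-key sums (no Spark involved).
val data = List((0, 1), (0, 2), (1, 20), (1, 30), (2, 40))
val sums = data.groupBy(_._1).mapValues(_.map(_._2).sum).toList.sorted
println(sums) // List((0,3), (1,50), (2,40))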
HI All,
Could anybody let me know what it is that I am missing here? It should work, as it
is a basic transformation.
Please let me know if any additional information is required.
Regards,
Satish
On Thu, Aug 20, 2015 at 3:35 PM, satish chandra j jsatishchan...@gmail.com
wrote:
HI All,
I have data in RDD