It works fine on my *Spark 1.1.0*
Thanks
Best Regards
On Mon, Oct 27, 2014 at 12:22 AM, octavian.ganea octavian.ga...@inf.ethz.ch
wrote:
Hi Akhil,
Please see this related message.
http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-td17263.html
I am curious if this works for you also.
Just tried the code below and it works for me; not sure why sparkContext is
being sent inside the mapPartitions function in your case. Can you try with a
simple map() instead of mapPartitions?
val ac = sc.accumulator(0)
val or = sc.parallelize(1 to 1)
val ps = or.map(x => (x, x + 2)).map(x => ac += 1)
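One thing worth noting as an aside (not stated in the thread itself): map() is lazy in Spark, so accumulator updates only take effect once an action runs. A minimal sketch of the pattern, assuming a live SparkContext named sc and the Spark 1.x accumulator API:

```scala
// Minimal sketch (assumes an existing SparkContext `sc`; Spark 1.x API).
val ac = sc.accumulator(0)          // driver-side accumulator
val rdd = sc.parallelize(1 to 100)
rdd.foreach(x => ac += 1)           // foreach is an action, so the updates actually run here
println(ac.value)                   // read the accumulated total back on the driver
```

With only a lazy map() and no action, ac.value would still read 0 on the driver.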
Hi all,
I tried to use accumulators without any success so far.
My code is simple:
val sc = new SparkContext(conf)
val accum = sc.accumulator(0)
val partialStats = sc.textFile(f.getAbsolutePath())
  .map(line => { val key = line.split("\t").head; (key, line) })