Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-27 Thread Akhil Das
It works fine on my *Spark 1.1.0*. Thanks, Best Regards. On Mon, Oct 27, 2014 at 12:22 AM, octavian.ganea octavian.ga...@inf.ethz.ch wrote: Hi Akhil, Please see this related message. http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-td17263.html I am curious if this
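
[Editor's note: for readers landing on this thread, the exception in the subject line typically means the task closure captured an object that (transitively) holds the SparkContext. A minimal sketch of the failure mode and a common workaround, assuming the Spark 1.x API; the class name StatsJob is hypothetical:

    import org.apache.spark.SparkContext

    // Hypothetical illustration: a lambda that reads a field of the
    // enclosing class captures `this`, and `this` holds `sc`, so Spark
    // tries to serialize the SparkContext and throws
    // NotSerializableException.
    class StatsJob(sc: SparkContext) {
      val accum = sc.accumulator(0)

      def runBroken(): Unit = {
        val data = sc.parallelize(1 to 100)
        // BAD: `accum` is a field, so the closure captures `this`,
        // dragging the non-serializable SparkContext into the task.
        data.foreach(x => accum += 1)
      }

      def runFixed(): Unit = {
        val data = sc.parallelize(1 to 100)
        val localAccum = accum // copy to a local val on the driver
        // OK: the closure now captures only the serializable accumulator.
        data.foreach(x => localAccum += 1)
      }
    }

Copying the field into a local val is the usual fix; marking the field holding the SparkContext as @transient is another common variant.]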

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread Akhil Das
Just tried the code below and it works for me; not sure why the sparkContext is being sent inside the mapPartitions function in your case. Can you try a simple map() instead of mapPartitions? val ac = sc.accumulator(0) val or = sc.parallelize(1 to 1) val ps = or.map(x => (x, x+2)).map(x => ac += 1)
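
[Editor's note: a runnable sketch of that snippet, assuming the Spark 1.x API; the range 1 to 100 and the local master are illustrative choices, not from the original message. Note that map is lazy, so the accumulator only updates once an action runs:

    import org.apache.spark.{SparkConf, SparkContext}

    object AccumulatorDemo {
      def main(args: Array[String]): Unit = {
        // local[2] just for a quick single-machine test
        val conf = new SparkConf().setAppName("AccumulatorDemo").setMaster("local[2]")
        val sc = new SparkContext(conf)
        val ac = sc.accumulator(0)
        val or = sc.parallelize(1 to 100)
        val ps = or.map(x => (x, x + 2)).map(x => ac += 1)
        ps.count()          // map is lazy; an action must run before ac changes
        println(ac.value)   // read on the driver; prints 100
        sc.stop()
      }
    }
]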

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread octavian.ganea
Hi Akhil, Please see this related message. http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-td17263.html I am curious if this works for you also.

Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-25 Thread octavian.ganea
Hi all, I tried to use accumulators without any success so far. My code is simple: val sc = new SparkContext(conf) val accum = sc.accumulator(0) val partialStats = sc.textFile(f.getAbsolutePath()).map(line => { val key = line.split("\t").head; (key, line) })
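
[Editor's note: based on the fragment above, a sketch of how this pipeline can run without capturing the SparkContext. The key is that everything referenced inside map must be serializable, so non-serializable handles are resolved to plain values on the driver first; conf and f appear in the original post, while the file path and the trailing action are assumptions:

    import java.io.File
    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical reconstruction of the poster's setup.
    val conf = new SparkConf().setAppName("PartialStats")
    val sc = new SparkContext(conf)
    val accum = sc.accumulator(0)
    val f = new File("input.txt")       // assumed input file
    val path = f.getAbsolutePath()      // resolve to a plain String on the driver
    val partialStats = sc.textFile(path)
      .map(line => { val key = line.split("\t").head; (key, line) })
      .map { case (key, line) => accum += 1; (key, line) }
    partialStats.count()                // force evaluation so the accumulator updates
    println(accum.value)
    sc.stop()

If this code lives inside a class or object whose other fields hold the SparkContext, copying accum into a local val before the map (as in the sketch earlier in this thread) avoids serializing the enclosing instance.]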