That seems to work fine. Add to your example

def foo(i: Int, a: Accumulator[Int]) = a += i

and add an action at the end to get the expression to evaluate:

sc.parallelize(Array(1, 2, 3, 4)).map(x => foo(x,accum)).foreach(println)

and it works: accum ends up with the value 10 once the action completes.
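For completeness, the two snippets above can be combined into one self-contained sketch. This assumes a local SparkContext and the Spark 1.x accumulator API; the `local[*]` master and app name are illustrative, not from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.Accumulator

object AccumulatorExample {
  // Passing the accumulator as a parameter, as in the thread's example
  def foo(i: Int, a: Accumulator[Int]): Unit = a += i

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("accum-example"))
    val accum = sc.accumulator(0)

    // map is lazy; foreach is the action that forces evaluation,
    // so the accumulator is only updated once the job actually runs
    sc.parallelize(Array(1, 2, 3, 4)).map(x => foo(x, accum)).foreach(_ => ())

    // Accumulator values are only reliable on the driver, after the action
    println(accum.value)
    sc.stop()
  }
}
```

Note that only the accumulator itself is serialized into the closure, not the SparkContext, which is why this does not hit a NotSerializableException.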

The similar example at
http://spark.apache.org/docs/latest/programming-guide.html#accumulators
also works.

You say AFAIK -- are you actually able to reproduce this?

On Sat, Nov 22, 2014 at 7:01 PM, octavian.ganea
<octavian.ga...@inf.ethz.ch> wrote:
> One month later, the same problem. I think that someone (e.g. inventors of
> Spark) should show us a big example of how to use accumulators. I can start
> telling that we need to see an example of the following form:
>
> val accum = sc.accumulator(0)
> sc.parallelize(Array(1, 2, 3, 4)).map(x => foo(x,accum))
>
> Passing accum as a parameter to function foo will require it to be
> serializable, but, AFAIK, any accumulator encapsulates the spark context
> sc, which is not serializable and which leads to a
> "java.io.NotSerializableException: SparkContext" exception.
>
> I am really curious to see a real application that uses accumulators.
> Otherwise, the accumulator code should be changed so that the above issue
> no longer appears.
>
> Best,
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-tp17263p19567.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
