Perhaps the closure ends up including the "main" object, which is not
defined as serializable... try making it a "case object" or "object main
extends Serializable".
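Something along these lines might work (an untested sketch - the app name,
input path, and accumulator name below are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of the suggestion: make the enclosing object serializable and
    // let the closure pick the accumulator up from a local val.
    object Main extends Serializable {   // or: case object Main
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("accumulator-example"))
        val totalLetters = sc.accumulator(0, "letters")   // named accumulator
        // The closure below only needs totalLetters; if it also touched members
        // of a non-serializable enclosing class, the task would fail to serialize.
        sc.textFile("input.txt").foreach(line => totalLetters += line.length)
        println("total letters: " + totalLetters.value)   // read back on the driver
        sc.stop()
      }
    }

If the accumulator lives in a field of the enclosing class or object rather than
a local val, that object gets pulled into the closure, which is why making it
serializable (or a case object) can make the error go away.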

On Sat, Nov 22, 2014 at 4:16 PM, lordjoe <lordjoe2...@gmail.com> wrote:

> I posted several examples in java at http://lordjoesoftware.blogspot.com/
>
> Generally code like this works and I show how to accumulate more complex
> values.
>
>     // Make an accumulator (the blog post shows how to accumulate more complex values)
>     final Accumulator<Integer> totalLetters = ctx.accumulator(0, "ttl");
>     JavaRDD<String> lines = ...
>
>     JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
>         @Override
>         public Iterable<String> call(final String s) throws Exception {
>             // Handle accumulator here
>             totalLetters.add(s.length()); // count all letters
>             ....
>         }
>     });
>     ....
>     Integer numberLetters = totalLetters.value();
>
> I believe the mistake is to pass the accumulator to the function rather than
> letting the function find the accumulator - in this case I do that by
> capturing it in a final local variable.
>
>
>
