As Mohit said, making Main extend Serializable should fix this example. In
general, it's also not a bad idea to mark the fields you don't want serialized
(e.g., sc and conf here) as @transient, though that is not the issue in this
case.

Note that this problem would not have arisen in your very specific example
if you had used a while loop instead of a for-each loop, but that's more of
a happy coincidence than something to rely on, since nested lambdas are
virtually unavoidable in Scala.
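
To make the loop point concrete, here is a rough illustration (reusing sc from
the sketch above; rdd and acc are again made-up names). A Scala for-each loop
desugars into a foreach call that takes the loop body as a function, so a Spark
lambda inside the body becomes a nested lambda, while a while loop keeps the
body inline in the enclosing method:

    val rdd = sc.parallelize(1 to 100)
    val acc = sc.accumulator(0)

    // for-each: compiles roughly as (1 to 3).foreach(i => rdd.foreach(...)),
    // so the Spark lambda is nested inside the loop's own lambda
    for (i <- 1 to 3) {
      rdd.foreach(x => acc += x)
    }

    // while: the body stays inline, so only the Spark lambda itself is
    // serialized and shipped to the executors
    var i = 0
    while (i < 3) {
      rdd.foreach(x => acc += x)
      i += 1
    }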

On Sat, Nov 22, 2014 at 5:16 PM, Mohit Jaggi <mohitja...@gmail.com> wrote:

> Perhaps the closure ends up including the "main" object, which is not
> defined as serializable... try making it a "case object" or "object main
> extends Serializable".
>
>
> On Sat, Nov 22, 2014 at 4:16 PM, lordjoe <lordjoe2...@gmail.com> wrote:
>
>> I posted several examples in Java at http://lordjoesoftware.blogspot.com/
>>
>> Generally, code like this works, and I show there how to accumulate more
>> complex values.
>>
>>         // Make an accumulator to count letters (the blog posts show how
>>         // to accumulate more complex values such as Statistics)
>>         final Accumulator<Integer> totalLetters = ctx.accumulator(0, "ttl");
>>         JavaRDD<String> lines = ...
>>
>>         JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
>>             @Override
>>             public Iterable<String> call(final String s) throws Exception {
>>                 // Handle the accumulator here
>>                 totalLetters.add(s.length()); // count all letters
>>                 ....
>>         });
>>         ....
>>         Integer numberCalls = totalLetters.value();
>>
>> I believe the mistake is to pass the accumulator to the function as an
>> argument rather than letting the function find the accumulator on its own.
>> Here I do that by capturing a final local variable inside the anonymous
>> function.
>>
>>
>>
