Great! Worked like a charm :)

On Mon, Nov 24, 2014 at 9:56 PM, Shixiong Zhu <zsxw...@gmail.com> wrote:

> int overflow? If so, you can use BigInt like this:
>
> scala> import org.apache.spark.AccumulatorParam
> import org.apache.spark.AccumulatorParam
>
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> implicit object BigIntAccumulatorParam extends AccumulatorParam[BigInt] {
>   def addInPlace(t1: BigInt, t2: BigInt) = t1 + t2
>   def zero(initialValue: BigInt) = BigInt(0)
> }
> // Exiting paste mode, now interpreting.
>
> defined module BigIntAccumulatorParam
>
> scala> val accu = sc.accumulator(BigInt(0))
> accu: org.apache.spark.Accumulator[scala.math.BigInt] = 0
>
> scala> accu += 100
>
> scala> accu.value
> res1: scala.math.BigInt = 100
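>
> For reference, a minimal standalone sketch (plain Scala, no Spark needed) of why the Int version goes negative: Int addition wraps on overflow in two's complement, so a running sum that passes Int.MaxValue comes back as a large negative number, while BigInt is arbitrary precision:

```scala
// Int addition wraps silently on overflow (two's complement), which is
// why a large accumulated count can end up negative.
object OverflowDemo {
  def main(args: Array[String]): Unit = {
    val wrapped: Int = Int.MaxValue + 1   // wraps around
    println(wrapped)                      // -2147483648

    // BigInt has arbitrary precision, so the same sum stays exact.
    val exact: BigInt = BigInt(Int.MaxValue) + 1
    println(exact)                        // 2147483648
  }
}
```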
>
>
> Best Regards,
> Shixiong Zhu
>
> 2014-11-25 10:31 GMT+08:00 Peter Thai <thai.pe...@gmail.com>:
>
>> Hello!
>>
>> Does anyone know why I may be receiving negative final accumulator values?
>>
>> Thanks!
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Negative-Accumulators-tp19706.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
