Sanity check: is it possible that `threshold_var` is negative?
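
If it is, min() would replace every non-negative count with the negative
threshold, so the sum would come out negative. A minimal sketch of that
effect (plain Scala, no Spark needed; the names and values below are made
up for illustration):

  object MinSanityCheck extends App {
    // Hypothetical: suppose threshold_var ended up negative,
    // e.g. from an Int overflow somewhere upstream.
    val threshold_var = BigInt(-5)
    val counts = Seq(3, 7, 0, 12).map(BigInt(_))
    // min() against a negative threshold clamps every element
    // down to the threshold, so the total goes negative.
    val total = counts.map(_.min(threshold_var)).sum
    println(total)  // prints -20, not 22
  }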


—
FG

On Fri, Jan 30, 2015 at 5:06 PM, Peter Thai <thai.pe...@gmail.com> wrote:

> Hello,
> I am seeing negative values for accumulators. Here's my implementation in a
> standalone app in Spark 1.1.1rc:
>   implicit object BigIntAccumulatorParam extends AccumulatorParam[BigInt] {
>     def addInPlace(t1: Int, t2: BigInt) = BigInt(t1) + t2
>     def addInPlace(t1: BigInt, t2: BigInt) = t1 + t2
>     def zero(initialValue: BigInt) = BigInt(0)
>   }
> val capped_numpings_accu = sc.accumulator(BigInt(0))(BigIntAccumulatorParam)
> myRDD.foreach(x => { capped_numpings_accu += BigInt(x._1).min(threshold_var) })
> When I remove the min() call, I no longer see negative values.
> Thanks!
