Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Efe Selcuk
I should have noted that I understand the notation of 0E-18 (exponential form, I think) and that in a normal case it is no different from 0; I just wanted to make sure that there wasn't something tricky going on, since the representation was seemingly changing. Michael, that's a fair point.

Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Jakob Odersky
Yes, thanks for elaborating, Michael. The other thing that I wanted to highlight was that in this specific case the value is actually exactly zero (0E-18 = 0 * 10^(-18) = 0).
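
A minimal sketch (plain Scala, no Spark needed) that shows the two representations compare equal; the variable names here are just for illustration:

    val a = BigDecimal("0E-18")              // unscaled value 0, scale 18
    val b = BigDecimal(0)                    // unscaled value 0, scale 0

    a == b                                   // true: scala.math.BigDecimal compares by numeric value
    a.compare(b)                             // 0
    a.bigDecimal.equals(b.bigDecimal)        // false: java.math.BigDecimal.equals also compares scale
    a.bigDecimal.compareTo(b.bigDecimal)     // 0

The only thing that differs between the two is the scale carried along with the value, which is what shows up in the printed form.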

Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Michael Matsko
Efe, I think Jakob's point is that there is no problem. When you deal with real numbers, you don't get exact representations of numbers. There is always some slop in the representations; things don't ever cancel out exactly. Testing reals for equality to zero will almost never work.
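
As a general illustration of that advice (a hypothetical sketch, not code from the thread), the usual pattern is to compare against a small tolerance rather than test for exact equality:

    val epsilon = 1e-9
    def isEffectivelyZero(x: Double): Boolean = math.abs(x) < epsilon

    (0.1 + 0.2 - 0.3) == 0.0             // false: the Double sum is about 5.55e-17
    isEffectivelyZero(0.1 + 0.2 - 0.3)   // true

(BigDecimal addition and subtraction are exact, so the representation question in this thread is separate from that, but the advice applies whenever binary floating point is involved.)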

Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Efe Selcuk
Okay, so this isn't contributing to any kind of imprecision. I suppose I need to go digging further then. Thanks for the quick help.

Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Jakob Odersky
What you're seeing is merely a strange representation; 0E-18 is zero. The E-18 represents the precision that Spark uses to store the decimal.
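
A sketch of where the E-18 comes from (plain Java/Scala, no Spark needed to see it): Spark maps a Scala BigDecimal field to DecimalType(38, 18) by default, i.e. 18 digits after the decimal point, and a zero carried at scale 18 prints in exponent form:

    val z = java.math.BigDecimal.ZERO.setScale(18)
    z.toString        // "0E-18"
    z.toPlainString   // "0.000000000000000000"
    z.signum          // 0, so the value itself is exactly zero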

Re: [Spark 2] BigDecimal and 0

2016-10-24 Thread Jakob Odersky
An even smaller example that demonstrates the same behaviour: Seq(Data(BigDecimal(0))).toDS.head
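
Spelled out as a self-contained sketch (Data is assumed from context to be a one-field case class wrapping the BigDecimal; the session setup is just for the repro):

    import org.apache.spark.sql.SparkSession

    case class Data(value: BigDecimal)

    object Repro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("bigdecimal-zero")
          .getOrCreate()
        import spark.implicits._

        val head = Seq(Data(BigDecimal(0))).toDS.head
        println(head.value)                   // prints 0E-18
        println(head.value == BigDecimal(0))  // true: still numerically zero
        spark.stop()
      }
    }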

[Spark 2] BigDecimal and 0

2016-10-24 Thread Efe Selcuk
I’m trying to track down what seems to be a very slight imprecision in our Spark application; two of our columns, which should be netting out to exactly zero, are coming up with very small fractions of non-zero value. The only thing that I’ve found out of place is that a case class entry into a Dataset shows a BigDecimal field of 0 as 0E-18.