Hello all,

I've been hitting a divide-by-zero error in Parquet through Spark, detailed
(and fixed) here: https://github.com/apache/incubator-parquet-mr/pull/102

Is anyone else hitting this error? I hit it frequently.
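For reference, the kind of code path where I hit it is ordinary Spark-to-Parquet
I/O. The sketch below is generic (the case class, data, and output path are
placeholders, not my actual job) and is not a guaranteed repro; it just shows the
Spark 1.2 SchemaRDD write/read path that ends up in parquet-mr:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Placeholder schema; my real schema and data are different.
case class Record(id: Long, value: String)

object ParquetSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("parquet-sketch"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[Product] => SchemaRDD

    val records = sc.parallelize(1L to 1000L).map(i => Record(i, "row-" + i))

    // The divide-by-zero surfaces somewhere in the parquet-mr layer under
    // Spark's Parquet support; whether a simple write/read like this triggers
    // it probably depends on the data, which is part of what I'm asking about.
    records.saveAsParquetFile("/tmp/parquet-sketch")
    val back = sqlContext.parquetFile("/tmp/parquet-sketch")
    back.count()

    sc.stop()
  }
}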

It looks like the Parquet team is preparing to release 1.6.0 and, since they
have been completely unresponsive, I'm assuming it's going to ship with this
bug (without the fix). The divide-by-zero mistake itself is obvious, so perhaps
the conditions under which it occurs are rare and I'm doing something wrong.

Has anyone else hit this, and if so, how did you resolve it?

Thanks
Jim



