kazuyukitanimura commented on PR #3829: URL: https://github.com/apache/datafusion-comet/pull/3829#issuecomment-4157255825
Thanks @andygrove. The existing implementation is correct; this PR adds tests to confirm that. The original analysis of #1036 is not quite right in my opinion, as I mentioned in https://github.com/apache/datafusion-comet/issues/1036#issuecomment-4146308038. The real reason for #1036 (at least on the current main branch) is #3825.

> Could you confirm the tests actually catch the negative zero case?

Yes, they do. `castTest` runs these tests.

> The Rust format! macro still formats -0.0 as "-0.0" and I don't see a code change that normalizes it.

Spark does not normalize when casting float/double to String. You can quickly see that the following test passes.

```scala
test("cast negative zeros") {
  withTable("t1") {
    Seq(0.0f, -0.0f, 0.0f, -0.0f).toDF("a").write.saveAsTable("t1")
    val df = sql("SELECT CAST(a AS STRING) FROM t1")
    checkSparkAnswerAndOperator(df)
  }
}
```

> It would be helpful to see the test output

The following is the printout of `castTest`:

```
!== Correct Answer - 10013 ==   == Spark Answer - 10013 ==
 struct<a:float,a:string>       struct<a:float,a:string>
...
 [-0.0,-0.0]                    [-0.0,-0.0]
 [0.0,0.0]                      [0.0,0.0]
...
```

> SELECT CAST(CAST('-0.0' AS FLOAT) AS STRING)

I do not think you meant this. `CAST('-0.0' AS FLOAT)` is not part of this PR's scope.
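As a side note, the "no normalization" behavior can be seen outside Spark entirely: on the JVM, `Float.toString`/`Double.toString` preserve the sign bit of negative zero, which is consistent with Spark's `CAST(a AS STRING)` producing `"-0.0"` above. A minimal standalone sketch (plain Scala, no Spark involved):

```scala
// Plain-JVM sketch: java.lang.Float.toString keeps the sign of -0.0,
// so no normalization happens when a float is rendered as a string.
object NegativeZeroToString {
  def main(args: Array[String]): Unit = {
    assert((-0.0f).toString == "-0.0") // sign preserved for Float
    assert((0.0f).toString == "0.0")
    assert((-0.0d).toString == "-0.0") // same for Double

    // ...even though -0.0 and 0.0 compare equal under IEEE 754:
    assert(-0.0f == 0.0f)

    println("negative zero formatting checks passed")
  }
}
```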
