sarutak opened a new pull request, #4:
URL: https://github.com/apache/spark-connect-rust/pull/4
# Description
This PR fixes `test_df_unpivot`, which fails when run against Spark 4.0:
```
---- dataframe::tests::test_df_unpivot stdout ----
SparkSession Setup
thread 'dataframe::tests::test_df_unpivot' panicked at
crates/connect/src/dataframe.rs:2463:9:
assertion `left == right` failed
left: RecordBatch { schema: Schema { fields: [Field { name: "id",
data_type: Int64, nullable: false, dict_id: 0, dict_is_ordered: false,
metadata: {} }, Field { name: "var", data_type: Utf8, nullable: false, dict_id:
0, dict_is_ordered: false, metadata: {} }, Field { name: "val", data_type:
Float32, nullable: false, dict_id: 0, dict_is_ordered: false, metadata: {} }],
metadata: {} }, columns: [PrimitiveArray<Int64>
[
1,
1,
2,
2,
], StringArray
[
"int",
"float",
"int",
"float",
], PrimitiveArray<Float32>
[
11.0,
1.1,
12.0,
1.2,
]], row_count: 4 }
right: RecordBatch { schema: Schema { fields: [Field { name: "id",
data_type: Int64, nullable: false, dict_id: 0, dict_is_ordered: false,
metadata: {} }, Field { name: "var", data_type: Utf8, nullable: false, dict_id:
0, dict_is_ordered: false, metadata: {} }, Field { name: "val", data_type:
Float64, nullable: false, dict_id: 0, dict_is_ordered: false, metadata: {} }],
metadata: {} }, columns: [PrimitiveArray<Int64>
[
1,
1,
2,
2,
], StringArray
[
"int",
"float",
"int",
"float",
], PrimitiveArray<Float64>
[
11.0,
1.100000023841858,
12.0,
1.2000000476837158,
]], row_count: 4 }
```
As of Spark 4.0, ANSI mode is enabled by default, but this test doesn't account
for that. With ANSI mode on, `unpivot` resolves the common type of the `int` and
`float` columns to `Double` rather than `Float`, so the `val` column comes back
as `Float64` instead of the expected `Float32` (as the assertion output above
shows).
To fix this issue, I tweaked the test so that it passes whether ANSI mode is
enabled or not.
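The odd-looking values in the actual result (`1.100000023841858` instead of
`1.1`) are just the usual binary floating-point widening artifact: when a value
stored as `Float32` is promoted to `Float64`, the rounding error of the original
`f32` becomes visible. A minimal Rust sketch, independent of Spark and arrow,
reproducing what the log shows:

```rust
fn main() {
    // 1.1 has no exact binary floating-point representation. Stored as
    // f32 and then widened to f64, the f32's rounding error shows up in
    // the printed value -- the same digits as in the failing assertion.
    let v: f32 = 1.1;
    let widened = v as f64;
    println!("{widened}"); // prints 1.100000023841858

    // A value that was f64 all along round-trips to the short form.
    let exact: f64 = 1.1;
    println!("{exact}"); // prints 1.1
}
```

This is why simply comparing the ANSI-mode result against `Float32` expected
values can never match: the widened doubles are not equal to the original
floats' nearest-decimal forms.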
## Related Issue(s)
SPARK-53001
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]