Josh Rosen created SPARK-54493:
----------------------------------
Summary: PySpark's assertSchemaEqual doesn't compare MapType key
and value types when ignoreNullable=True
Key: SPARK-54493
URL: https://issues.apache.org/jira/browse/SPARK-54493
Project: Spark
Issue Type: Improvement
Components: PySpark
Affects Versions: 4.0.0
Reporter: Josh Rosen
Similar to SPARK-51062, the {{pyspark.testing.assertSchemaEqual}} function
doesn't properly handle {{MapType}} when {{ignoreNullable=True}}.
For arrays, decimals, and structs, the [existing
code|https://github.com/apache/spark/blame/d14209c6ffba991b1d8ca9580708dea1ee37920e/python/pyspark/testing/utils.py#L590-L603]
compares the types' parameters (e.g. element type, precision, scale, etc.);
this comparison needs to be extended to also handle {{MapType}} key and value types.
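As a rough illustration only (the helper name and structure below are hypothetical and
do not reflect the actual code in {{python/pyspark/testing/utils.py}}), the recursive
comparison could grow a {{MapType}} branch alongside the existing array/struct/decimal
branches:
{code:python}
from pyspark.sql.types import (
    ArrayType, DataType, DecimalType, MapType, StructType
)


def _compare_datatypes_ignore_nullable(dt1: DataType, dt2: DataType) -> bool:
    # Hypothetical helper; only illustrates the shape of the recursive comparison.
    if type(dt1) is not type(dt2):
        return False
    if isinstance(dt1, ArrayType):
        # Existing behavior: recurse into element types, ignoring containsNull.
        return _compare_datatypes_ignore_nullable(dt1.elementType, dt2.elementType)
    if isinstance(dt1, MapType):
        # Proposed addition: recurse into key and value types as well.
        return _compare_datatypes_ignore_nullable(
            dt1.keyType, dt2.keyType
        ) and _compare_datatypes_ignore_nullable(dt1.valueType, dt2.valueType)
    if isinstance(dt1, StructType):
        # Existing behavior: compare fields by name and data type.
        return len(dt1.fields) == len(dt2.fields) and all(
            f1.name == f2.name
            and _compare_datatypes_ignore_nullable(f1.dataType, f2.dataType)
            for f1, f2 in zip(dt1.fields, dt2.fields)
        )
    if isinstance(dt1, DecimalType):
        # Existing behavior: compare precision and scale.
        return dt1.precision == dt2.precision and dt1.scale == dt2.scale
    return True
{code}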
The following should fail with an assertion but currently (incorrectly) passes:
{code:python}
from pyspark.sql.types import (
    StructType, StructField, MapType, StringType, IntegerType
)
from pyspark.testing import assertSchemaEqual
# Note swapped key and value types:
s1 = StructType([StructField("m", MapType(StringType(), IntegerType()), True)])
s2 = StructType([StructField("m", MapType(IntegerType(), StringType()), True)])
# Should raise, does not:
assertSchemaEqual(s1, s2, ignoreNullable=True)
{code}
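With the comparison extended, a regression test along these lines would be expected to
pass (sketch only; the test name is made up, and it assumes the mismatch is reported via
{{PySparkAssertionError}}, which is what {{assertSchemaEqual}} raises for schema
differences):
{code:python}
import pytest

from pyspark.errors import PySparkAssertionError
from pyspark.sql.types import StructType, StructField, MapType, StringType, IntegerType
from pyspark.testing import assertSchemaEqual


def test_map_key_value_types_compared_when_ignore_nullable():
    # Hypothetical regression test for the expected post-fix behavior.
    s1 = StructType([StructField("m", MapType(StringType(), IntegerType()), True)])
    s2 = StructType([StructField("m", MapType(IntegerType(), StringType()), True)])
    with pytest.raises(PySparkAssertionError):
        assertSchemaEqual(s1, s2, ignoreNullable=True)
{code}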