[ https://issues.apache.org/jira/browse/SPARK-41835?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-41835:
---------------------------------
    Fix Version/s:     (was: 3.4.0)

> Implement `transform_keys` function
> -----------------------------------
>
>                 Key: SPARK-41835
>                 URL: https://issues.apache.org/jira/browse/SPARK-41835
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect, PySpark
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Priority: Major
>
> {code:java}
> File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/functions.py", line 1611, in pyspark.sql.connect.functions.transform_keys
> Failed example:
>     df.select(transform_keys(
>         "data", lambda k, _: upper(k)).alias("data_upper")
>     ).show(truncate=False)
> Exception raised:
>     Traceback (most recent call last):
>       File "/usr/local/Cellar/python@3.10/3.10.8/Frameworks/Python.framework/Versions/3.10/lib/python3.10/doctest.py", line 1350, in __run
>         exec(compile(example.source, filename, "single",
>       File "<doctest pyspark.sql.connect.functions.transform_keys[1]>", line 1, in <module>
>         df.select(transform_keys(
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 534, in show
>         print(self._show_string(n, truncate, vertical))
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 423, in _show_string
>         ).toPandas()
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 1031, in toPandas
>         return self._session.client.to_pandas(query)
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 413, in to_pandas
>         return self._execute_and_fetch(req)
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 573, in _execute_and_fetch
>         self._handle_error(rpc_error)
>       File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/client.py", line 619, in _handle_error
>         raise SparkConnectAnalysisException(
>     pyspark.sql.connect.client.SparkConnectAnalysisException: [DATATYPE_MISMATCH.UNEXPECTED_INPUT_TYPE] Cannot resolve "transform_keys(data, lambdafunction(upper(x_11), x_11, y_12))" due to data type mismatch: Parameter 1 requires the "MAP" type, however "data" has the type "STRUCT<bar: DOUBLE, foo: DOUBLE>".
>     Plan: 'Project [transform_keys(data#4493, lambdafunction('upper(lambda 'x_11), lambda 'x_11, lambda 'y_12, false)) AS data_upper#4496]
>     +- Project [0#4488L AS id#4492L, 1#4489 AS data#4493]
>        +- LocalRelation [0#4488L, 1#4489]
> {code}
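
The failure reported above boils down to the doctest's example DataFrame exposing "data" as a STRUCT<bar: DOUBLE, foo: DOUBLE> column, while transform_keys requires its first argument to be a MAP. For reference, here is a minimal sketch of the intended behaviour against a properly typed MAP column, written with the classic PySpark API rather than Spark Connect; the column names and literal values are illustrative and not taken from the doctest.

{code:python}
from pyspark.sql import SparkSession
from pyspark.sql.functions import transform_keys, upper

spark = SparkSession.builder.getOrCreate()

# Use an explicit DDL schema so "data" is materialised as MAP<STRING, DOUBLE>
# instead of being inferred as a struct.
df = spark.createDataFrame(
    [(1, {"foo": 1.0, "bar": 2.0})],
    "id LONG, data MAP<STRING, DOUBLE>",
)

# Uppercase every key of the map; the value parameter of the lambda is unused.
df.select(
    transform_keys("data", lambda k, _: upper(k)).alias("data_upper")
).show(truncate=False)
# Expected result (key order may vary): {FOO -> 1.0, BAR -> 2.0}
{code}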