[ https://issues.apache.org/jira/browse/SPARK-38130?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-38130:
------------------------------------

    Assignee: Apache Spark

> array_sort does not allow non-orderable datatypes
> -------------------------------------------------
>
>                 Key: SPARK-38130
>                 URL: https://issues.apache.org/jira/browse/SPARK-38130
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.1
>         Environment:
>            Reporter: Steven Aerts
>            Assignee: Apache Spark
>            Priority: Major
>
> {{array_sort}} has a check that verifies the entries it has to sort are orderable. I think this check should be removed, because even entries which are not orderable on their own can be sorted by the lambda function passed to {{array_sort}}.
> {code:java}
> Seq((Array[Map[String, Int]](Map("a" -> 1), Map()), "x")).toDF("a", "b")
>   .selectExpr("array_sort(a, (x,y) -> cardinality(x) - cardinality(y))")
> {code}
> fails with:
> {code:java}
> org.apache.spark.sql.AnalysisException: cannot resolve 'array_sort(`a`, lambdafunction((cardinality(namedlambdavariable()) - cardinality(namedlambdavariable())), namedlambdavariable(), namedlambdavariable()))' due to data type mismatch: array_sort does not support sorting array of type map<string,int> which is not orderable
> {code}
> Meanwhile, the case where this check is actually relevant fails with a different error, triggered earlier in the code path:
> {code:java}
> Seq((Array[Map[String, Int]](Map("a" -> 1), Map()), "x")).toDF("a", "b")
>   .selectExpr("array_sort(a)")
> {code}
> fails with:
> {code:java}
> org.apache.spark.sql.AnalysisException: cannot resolve '(namedlambdavariable() < namedlambdavariable())' due to data type mismatch: LessThan does not support ordering on type map<string,int>; line 1 pos 0;
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
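The reporter's point, that elements with no natural ordering become sortable once an explicit comparator is supplied, can be sketched outside Spark in plain Java (an illustrative analogue only; the class and method names below are hypothetical, not Spark APIs):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.Map;

public class ArraySortAnalogy {
    // Maps have no natural ordering, so a plain sort of List<Map<...>> is
    // impossible -- but sorting with an explicit comparator works fine,
    // playing the same role as the lambda in
    // array_sort(a, (x,y) -> cardinality(x) - cardinality(y)).
    public static List<Map<String, Integer>> sortByCardinality(List<Map<String, Integer>> maps) {
        List<Map<String, Integer>> sorted = new ArrayList<>(maps);
        sorted.sort(Comparator.comparingInt(Map::size));
        return sorted;
    }

    public static void main(String[] args) {
        // Mirrors the issue's example data: one one-entry map, one empty map.
        List<Map<String, Integer>> maps = List.of(Map.of("a", 1), Map.of());
        System.out.println(sortByCardinality(maps)); // empty map sorts first
    }
}
```

The analogy suggests the orderability check belongs only on the no-comparator path, since a user-supplied lambda defines its own ordering regardless of the element type.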