[ https://issues.apache.org/jira/browse/SPARK-25522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dilip Biswal updated SPARK-25522:
---------------------------------
Description:
In ElementAt, when the first argument is MapType, we should coerce the key type and the second argument based on findTightestCommonType. This is not happening currently.

Also, when the first argument is ArrayType, the second argument should be an integer type or a smaller integral type that can be safely cast to an integer type. Currently we may perform an unsafe cast.

{code:java}
spark-sql> select element_at(array(1,2), 1.24);
1{code}
{code:java}
spark-sql> select element_at(map(1,"one", 2, "two"), 2.2);
two{code}


was:
In ElementAt, when the first argument is MapType, we should coerce the key type and the second argument based on findTightestCommonType. This is not happening currently.

Also, when the first argument is ArrayType, the second argument should be an integer type or a smaller integral type that can be safely cast to an integer type. Currently we may perform an unsafe cast.


> Improve type promotion for input arguments of elementAt function
> -----------------------------------------------------------------
>
>                 Key: SPARK-25522
>                 URL: https://issues.apache.org/jira/browse/SPARK-25522
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Dilip Biswal
>            Assignee: Dilip Biswal
>            Priority: Major
>             Fix For: 2.4.0
>
>
> In ElementAt, when the first argument is MapType, we should coerce the key type and the second argument based on findTightestCommonType. This is not happening currently.
> Also, when the first argument is ArrayType, the second argument should be an integer type or a smaller integral type that can be safely cast to an integer type. Currently we may perform an unsafe cast.
> {code:java}
> spark-sql> select element_at(array(1,2), 1.24);
> 1{code}
> {code:java}
> spark-sql> select element_at(map(1,"one", 2, "two"), 2.2);
> two{code}


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
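The array case of the fix boils down to an argument-type check: the index argument of element_at must be an integral type that can be widened to an int without loss, so a fractional literal like 1.24 is rejected instead of being truncated. A minimal sketch of that rule in Java follows; the SqlType enum and the isSafeArrayIndexType helper are hypothetical illustrations, not Spark's actual TypeCoercion code.

```java
public class ElementAtCoercion {
    // Hypothetical stand-ins for SQL data types; not Spark's DataType classes.
    enum SqlType { BYTE, SHORT, INT, LONG, FLOAT, DOUBLE, DECIMAL }

    // True only for integral types that widen safely to INT.
    // Fractional types (e.g. the 1.24 in the bug report) are rejected,
    // rather than silently cast and truncated.
    static boolean isSafeArrayIndexType(SqlType t) {
        switch (t) {
            case BYTE:
            case SHORT:
            case INT:
                return true;
            default:
                return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isSafeArrayIndexType(SqlType.SHORT));   // true
        System.out.println(isSafeArrayIndexType(SqlType.DECIMAL)); // false
    }
}
```

Under this rule, `element_at(array(1,2), 1.24)` would fail analysis instead of returning 1; the map case is handled separately by coercing the key type and the lookup argument through findTightestCommonType.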