[ https://issues.apache.org/jira/browse/SPARK-28298?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Zhu, Lipeng updated SPARK-28298:
--------------------------------
    Description:
Executing the SQL below in Spark returns "abcdef", but other DBMSs return "abc" (which I think is more sensible).
{code:sql}
select cast("abcdef" as char(3));
{code}
I then checked the source code; it seems char/varchar are only used during DDL parsing.
{code:java}
/**
 * Hive char type. Similar to other HiveStringType's, these datatypes should only used for
 * parsing, and should NOT be used anywhere else. Any instance of these data types should be
 * replaced by a [[StringType]] before analysis.
 */
case class CharType(length: Int) extends HiveStringType {
  override def simpleString: String = s"char($length)"
}

/**
 * Hive varchar type. Similar to other HiveStringType's, these datatypes should only used for
 * parsing, and should NOT be used anywhere else. Any instance of these data types should be
 * replaced by a [[StringType]] before analysis.
 */
case class VarcharType(length: Int) extends HiveStringType {
  override def simpleString: String = s"varchar($length)"
}
{code}
Is this behavior expected?


> Strange behavior of CAST string to char/varchar
> -----------------------------------------------
>
>                 Key: SPARK-28298
>                 URL: https://issues.apache.org/jira/browse/SPARK-28298
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Zhu, Lipeng
>            Priority: Major
>


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
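For illustration, the char(n) semantics the reporter expects from other DBMSs (truncate strings longer than n, pad shorter ones with trailing spaces) can be sketched in plain Scala. `CharCastSketch` and `castToChar` are hypothetical names, not part of Spark; this is only a model of the standard behavior, not how Spark would implement it.

```scala
// Hypothetical sketch of SQL-standard CAST(x AS char(n)) string semantics:
// values longer than n are truncated, shorter ones are padded with spaces.
object CharCastSketch {
  def castToChar(s: String, n: Int): String =
    if (s.length >= n) s.take(n)          // truncate to n characters
    else s + " " * (n - s.length)         // pad with trailing spaces to length n

  def main(args: Array[String]): Unit = {
    println(castToChar("abcdef", 3)) // prints "abc", matching other DBMSs
  }
}
```

Under this model, `cast("abcdef" as char(3))` yields "abc" rather than "abcdef", which is the discrepancy reported above.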