[ https://issues.apache.org/jira/browse/SPARK-3683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14187563#comment-14187563 ]
Davies Liu commented on SPARK-3683:
-----------------------------------

This commit removed the special case for "NULL":

{code}
commit cf989601d0e784e1c3507720e64636891fe28292
Author: Cheng Lian <lian.cs....@gmail.com>
Date:   Fri May 30 22:13:11 2014 -0700

    [SPARK-1959] String "NULL" shouldn't be interpreted as null value

    JIRA issue: [SPARK-1959](https://issues.apache.org/jira/browse/SPARK-1959)

    Author: Cheng Lian <lian.cs....@gmail.com>

    Closes #909 from liancheng/spark-1959 and squashes the following commits:

    306659c [Cheng Lian] [SPARK-1959] String "NULL" shouldn't be interpreted as null value

diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveOperators.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveOperators.scala
index f141139..d263c31 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveOperators.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveOperators.scala
@@ -113,7 +113,6 @@ case class HiveTableScan(
   }

   private def unwrapHiveData(value: Any) = value match {
-    case maybeNull: String if maybeNull.toLowerCase == "null" => null
     case varchar: HiveVarchar => varchar.getValue
     case decimal: HiveDecimal => BigDecimal(decimal.bigDecimalValue)
     case other => other
{code}

So this should be a bug in Hive itself.

> PySpark Hive query generates "NULL" instead of None
> ---------------------------------------------------
>
>                 Key: SPARK-3683
>                 URL: https://issues.apache.org/jira/browse/SPARK-3683
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 1.1.0
>            Reporter: Tamas Jambor
>            Assignee: Davies Liu
>
> When I run a Hive query in Spark SQL, I get back the new Row object, where it does not convert a Hive NULL into Python None; instead it keeps the string 'NULL'. It's only an issue with the String type; it works with other types.
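To make the behavior change concrete, below is a minimal, self-contained Scala sketch of unwrapHiveData before and after the commit quoted above. This is not the actual Spark code: the real method also unwraps HiveVarchar and HiveDecimal, which are omitted here, and the object and method names are invented for illustration.

{code}
object UnwrapNullSketch {
  // Pre-SPARK-1959 behavior (sketch): any String whose lowercase form is
  // "null" was coerced to a real null, so a legitimate "NULL" string was lost.
  def unwrapBefore(value: Any): Any = value match {
    case maybeNull: String if maybeNull.toLowerCase == "null" => null
    case other => other
  }

  // Post-SPARK-1959 behavior (sketch): strings pass through untouched;
  // a genuine Hive NULL must already arrive here as null.
  def unwrapAfter(value: Any): Any = value

  def main(args: Array[String]): Unit = {
    assert(unwrapBefore("NULL") == null)   // literal string silently dropped
    assert(unwrapAfter("NULL") == "NULL")  // literal string preserved
  }
}
{code}

Since the removal, a string "NULL" coming out of the Hive layer is preserved verbatim rather than coerced to null, which would explain why the reporter sees the string 'NULL' instead of None in PySpark when Hive emits NULL values as that text.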