[ https://issues.apache.org/jira/browse/SPARK-17620?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li resolved SPARK-17620.
-----------------------------
    Resolution: Fixed
    Fix Version/s: 2.1.0

Issue resolved by pull request 15495
[https://github.com/apache/spark/pull/15495]

> hive.default.fileformat=orc does not set OrcSerde
> -------------------------------------------------
>
>                 Key: SPARK-17620
>                 URL: https://issues.apache.org/jira/browse/SPARK-17620
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Brian Cho
>            Assignee: Dilip Biswal
>            Priority: Minor
>             Fix For: 2.1.0
>
> Setting {{hive.default.fileformat=orc}} does not set OrcSerde. This behavior
> is inconsistent with {{STORED AS ORC}}. This means we cannot set a default
> behavior for creating tables using ORC.
> The behavior using {{STORED AS}}:
> {noformat}
> scala> spark.sql("CREATE TABLE tmp_stored_as(id INT) STORED AS ORC")
> res0: org.apache.spark.sql.DataFrame = []
>
> scala> spark.sql("DESC FORMATTED tmp_stored_as").collect.foreach(println)
> ...
> [# Storage Information,,]
> [SerDe Library:,org.apache.hadoop.hive.ql.io.orc.OrcSerde,]
> [InputFormat:,org.apache.hadoop.hive.ql.io.orc.OrcInputFormat,]
> [OutputFormat:,org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat,]
> ...
> {noformat}
> The behavior setting the default conf (the SerDe Library is not set properly):
> {noformat}
> scala> spark.sql("SET hive.default.fileformat=orc")
> res2: org.apache.spark.sql.DataFrame = [key: string, value: string]
>
> scala> spark.sql("CREATE TABLE tmp_default(id INT)")
> res3: org.apache.spark.sql.DataFrame = []
>
> scala> spark.sql("DESC FORMATTED tmp_default").collect.foreach(println)
> ...
> [# Storage Information,,]
> [SerDe Library:,org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,]
> [InputFormat:,org.apache.hadoop.hive.ql.io.orc.OrcInputFormat,]
> [OutputFormat:,org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat,]
> ...
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
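[Editor's note] On versions affected by this issue (before the 2.1.0 fix), the transcripts above suggest a workaround: declare the file format explicitly in the DDL rather than relying on {{hive.default.fileformat}}. A minimal sketch (the table name is illustrative):

{noformat}
-- Explicit STORED AS ORC sets OrcSerde, OrcInputFormat, and OrcOutputFormat,
-- independent of the hive.default.fileformat setting
CREATE TABLE my_orc_table(id INT) STORED AS ORC;
{noformat}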