[jira] [Commented] (SPARK-27555) cannot create table by using the hive default fileformat in both hive-site.xml and spark-defaults.conf
[ https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16832999#comment-16832999 ]

Sandeep Katta commented on SPARK-27555:
---------------------------------------

[~hyukjin.kwon] I think the assignee name was set incorrectly by mistake; could you please update it?

> cannot create table by using the hive default fileformat in both
> hive-site.xml and spark-defaults.conf
> ----------------------------------------------------------------
>
>                 Key: SPARK-27555
>                 URL: https://issues.apache.org/jira/browse/SPARK-27555
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.2
>            Reporter: Hui WANG
>            Assignee: Hyukjin Kwon
>            Priority: Major
>             Fix For: 3.0.0
>
>         Attachments: Try.pdf
>
>
> *You can see details in the attachment called Try.pdf.*
>
> I have already seen https://issues.apache.org/jira/browse/SPARK-17620
> and https://issues.apache.org/jira/browse/SPARK-18397, and I checked the
> Spark source code for the change behind setting
> "spark.sql.hive.convertCTAS=true", which makes Spark use
> "spark.sql.sources.default" (parquet) as the storage format in the
> "create table as select" scenario.
>
> But my case is a plain CREATE TABLE without a SELECT. When I set
> hive.default.fileformat=parquet in hive-site.xml, or set
> spark.hadoop.hive.default.fileformat=parquet in spark-defaults.conf, the
> table I then create still uses the textfile fileformat.
>
> It seems HiveSerDe reads the value of the hive.default.fileformat
> parameter from SQLConf. The parameter values in SQLConf are copied from
> the SparkContext's SparkConf at SparkSession initialization, while the
> configuration parameters in hive-site.xml are loaded into the
> SparkContext's hadoopConfiguration by SharedState, and all configs with
> the "spark.hadoop" prefix are set on that Hadoop configuration, so the
> setting does not take effect.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
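The propagation gap the report describes can be illustrated with a small toy model (plain Python, not actual Spark code; `build_confs` and `default_fileformat` are illustrative names that mimic the flow described above, where SQLConf is seeded only from SparkConf while hive-site.xml and `spark.hadoop.*` keys land in the Hadoop configuration that HiveSerDe never consults):

```python
# Toy model of the config flow described in the report (not real Spark classes).

def build_confs(spark_defaults, hive_site):
    """Mimic how SparkSession splits settings between SQLConf and hadoopConf."""
    spark_conf = dict(spark_defaults)   # entries from spark-defaults.conf
    hadoop_conf = dict(hive_site)       # hive-site.xml, loaded by SharedState
    sql_conf = {}
    prefix = "spark.hadoop."
    for key, value in spark_conf.items():
        if key.startswith(prefix):
            # spark.hadoop.* keys are stripped of the prefix and pushed
            # only into the Hadoop configuration.
            hadoop_conf[key[len(prefix):]] = value
        else:
            # everything else is copied into SQLConf at SparkSession init.
            sql_conf[key] = value
    return sql_conf, hadoop_conf

def default_fileformat(sql_conf):
    # Mimics HiveSerDe looking up hive.default.fileformat in SQLConf only.
    return sql_conf.get("hive.default.fileformat", "textfile")

sql_conf, hadoop_conf = build_confs(
    {"spark.hadoop.hive.default.fileformat": "parquet"},  # spark-defaults.conf
    {"hive.default.fileformat": "parquet"},               # hive-site.xml
)
print(default_fileformat(sql_conf))            # -> textfile, despite both settings
print(hadoop_conf["hive.default.fileformat"])  # -> parquet, but never consulted
```

The model makes the symptom visible: both configuration routes end up in the Hadoop configuration, yet the lookup happens against SQLConf, so the default stays `textfile`.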
[ https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16825772#comment-16825772 ]

Hui WANG commented on SPARK-27555:
----------------------------------

OK, I have attached the reproduction details in the attachment area as Try.pdf.
[ https://issues.apache.org/jira/browse/SPARK-27555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16825747#comment-16825747 ]

Hyukjin Kwon commented on SPARK-27555:
--------------------------------------

Can you post a self-contained reproducer, please?
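A self-contained reproducer along the lines requested might look like the following. This is a hedged sketch, not re-verified: it assumes a Hive-enabled Spark 2.3.x build with `spark-sql` on the PATH, and the table names `t1`/`t2` are illustrative.

```
# 1. Set the Hive default via the spark.hadoop.* prefix -- per the report,
#    this lands only in the Hadoop configuration and is not seen by HiveSerDe:
spark-sql --conf spark.hadoop.hive.default.fileformat=parquet \
  -e "CREATE TABLE t1 (id INT); DESCRIBE FORMATTED t1;"
# Reported symptom: t1 still shows the textfile input/output format.

# 2. Setting the key directly in the SQL session goes through SQLConf,
#    which HiveSerDe does read, so this is the likely workaround:
spark-sql -e "SET hive.default.fileformat=parquet;
              CREATE TABLE t2 (id INT); DESCRIBE FORMATTED t2;"
```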