[ https://issues.apache.org/jira/browse/SPARK-9944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14705315#comment-14705315 ]
Yin Huai commented on SPARK-9944:
---------------------------------

OK. I guess {{/user/ec2-user/warehouse}} is not the one you set, right? I took a look, and here is my finding. In Spark SQL, if you do not specify a database name when you create a table, the table is created under the current database, and we use the location of the current database as the parent dir of your table dir. So if you create a table in the {{default}} database, we use the location of the {{default}} database as the parent dir of your table. Once the {{default}} database has been created, {{hive.metastore.warehouse.dir}} can no longer override its recorded location. Hive, however, does allow you to use {{hive.metastore.warehouse.dir}} as the parent dir of your tables if you are using the {{default}} database and you do not specify a location for your table. See https://github.com/apache/hive/blob/release-1.2.1/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java#L4473-L4479 (this behavior was added by https://issues.apache.org/jira/browse/HIVE-6374).

> hive.metastore.warehouse.dir is not respected
> ---------------------------------------------
>
>                 Key: SPARK-9944
>                 URL: https://issues.apache.org/jira/browse/SPARK-9944
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0, 1.4.1
>            Reporter: Manku Timma
>
> In 1.3.1, {{hive.metastore.warehouse.dir}} was honoured and table data was stored there. In 1.4.0, it is no longer used; instead, {{DBS.DB_LOCATION_URI}} from the metastore is always used. This breaks use cases where the parameter is used to override the warehouse location.
> To reproduce the issue, start spark-shell with {{hive.metastore.warehouse.dir}} set in hive-site.xml and run {{df.saveAsTable("x")}}. You will see that the parameter is not honoured.
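For illustration, here is a minimal spark-shell sketch of the repro described above, plus one possible workaround that follows from the explanation (create a database with an explicit location). The table name {{x}}, database name {{mydb}}, and the {{/custom/warehouse}} path are assumptions for the example, not values from this report.

{code:scala}
// Spark 1.4.x spark-shell (sc is provided by the shell); assumes hive-site.xml
// on the classpath sets hive.metastore.warehouse.dir=/custom/warehouse.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
val df = hiveContext.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")

// As reported: with no database specified, the table dir is resolved against
// the current database's location stored in the metastore (DBS.DB_LOCATION_URI),
// so hive.metastore.warehouse.dir appears to be ignored.
df.saveAsTable("x")

// Workaround sketch (an assumption, not a fix from this ticket): create a
// database with an explicit LOCATION and make it current, so table dirs land
// under a path you choose regardless of the default database's location.
hiveContext.sql("CREATE DATABASE IF NOT EXISTS mydb LOCATION '/custom/warehouse/mydb.db'")
hiveContext.sql("USE mydb")
df.saveAsTable("x") // written under /custom/warehouse/mydb.db/x
{code}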