[ 
https://issues.apache.org/jira/browse/SPARK-11778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15009251#comment-15009251
 ] 

Huaxin Gao commented on SPARK-11778:
------------------------------------

I can reproduce the problem.

In hiveContext.table("db_name.table"), the call goes through
SqlParser.parseTableIdentifier(tableName), so the table name "db_name.table"
is resolved to the identifier 'db_name'.'table'. Later, when building the
qualified table name, the database name resolves to db_name and the table
name to table, so the qualified table name is constructed correctly.

In hiveContext.read.table("db_name.table"), the table name does not go
through SqlParser, so "db_name.table" remains as-is. Later, when building
the qualified table name, the database name resolves to the current default
database and the table name stays "db_name.table", so the qualified table
name cannot be constructed correctly.
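The difference between the two code paths can be sketched roughly as follows. This is a simplified model, not Spark's actual classes: the case class, helper names, and the "default" database are illustrative assumptions, but the split-versus-no-split behavior mirrors the explanation above.

```scala
// Simplified sketch of the two lookup paths described above.
// TableIdentifier here is a stand-in, not Spark's internal class.
case class TableIdentifier(table: String, database: Option[String])

// Path taken by hiveContext.table: the name is parsed, so any
// database prefix is split out before the lookup.
def parseTableIdentifier(name: String): TableIdentifier =
  name.split('.') match {
    case Array(db, tbl) => TableIdentifier(tbl, Some(db))
    case _              => TableIdentifier(name, None)
  }

// Path taken by hiveContext.read.table (per this bug): the raw
// string is used as the table name, with no database split out.
def unparsedIdentifier(name: String): TableIdentifier =
  TableIdentifier(name, None)

// Qualifying falls back to the current database when none was parsed.
def qualified(id: TableIdentifier, currentDb: String): String =
  s"${id.database.getOrElse(currentDb)}.${id.table}"

// Parsed path: qualifies to "db_name.table", which exists.
val good = qualified(parseTableIdentifier("db_name.table"), "default")
// Unparsed path: qualifies to "default.db_name.table", which does
// not exist, hence the NoSuchTableException.
val bad = qualified(unparsedIdentifier("db_name.table"), "default")
```

This also explains why running hiveContext.sql("use db_name") first makes hiveContext.read.table("table") work: the unparsed name "table" then qualifies against the correct current database.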


> HiveContext.read.table does not support user-specified database names
> ---------------------------------------------------------------------
>
>                 Key: SPARK-11778
>                 URL: https://issues.apache.org/jira/browse/SPARK-11778
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>            Reporter: Stanislav Hadjiiski
>            Priority: Minor
>
> If we have defined a HiveContext instance
> {quote}
> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
> {quote}
> then
> {quote}
> hiveContext.table("db_name.table")
> {quote}
> works but
> {quote}
> hiveContext.read.table("db_name.table")
> {quote}
> throws an {{org.apache.spark.sql.catalyst.analysis.NoSuchTableException}}
> However,
> {quote}
> hiveContext.sql("use db_name")
> hiveContext.read.table("table")
> {quote}
> works as expected



