[ https://issues.apache.org/jira/browse/SPARK-11683?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15003251#comment-15003251 ]

Xin Wu commented on SPARK-11683:
--------------------------------

It seems that this issue is resolved in 1.6.0:
{code}
scala> val df = sqlContext.sql("select 1")
df: org.apache.spark.sql.DataFrame = [_c0: int]

scala> df.write.insertInto("my_db.some_table")

scala> sqlContext.sql("select * from my_db.some_table")
res3: org.apache.spark.sql.DataFrame = [c1: int]

scala> sqlContext.sql("select * from my_db.some_table").show
+---+
| c1|
+---+
|  1|
+---+
{code}
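
For reference, the example above assumes the target database and table already exist. A minimal setup along those lines (just a sketch, reusing the names from the example) could be:
{code}
// Assumed setup for the example above: create the database and the target
// table through the HiveContext before calling insertInto().
sqlContext.sql("CREATE DATABASE IF NOT EXISTS my_db")
sqlContext.sql("CREATE TABLE IF NOT EXISTS my_db.some_table (c1 INT)")
{code}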

In 1.5.1, this issue is still present:
{code}
scala> val df = sqlContext.sql("select 2")
df: org.apache.spark.sql.DataFrame = [_c0: int]

scala> df.write.insertInto("my_db.some_table")
org.apache.spark.sql.AnalysisException: no such table my_db.some_table;

{code}
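Until an upgrade to 1.6.0 is possible, a couple of workarounds might help on 1.5.x, since querying the qualified name through sqlContext.sql() does work. These are sketches only, not verified here; they assume a HiveContext, and the temp table name "tmp" is arbitrary:
{code}
// Possible workarounds on 1.5.x (sketches, not verified here):
// 1) Route the insert through HiveQL, which resolves the qualified name:
df.registerTempTable("tmp")
sqlContext.sql("INSERT INTO TABLE my_db.some_table SELECT * FROM tmp")
// 2) Or switch the current database first and use the unqualified name:
sqlContext.sql("USE my_db")
df.write.insertInto("some_table")
{code}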

> DataFrameWriter.insertInto() fails on non-default schema
> --------------------------------------------------------
>
>                 Key: SPARK-11683
>                 URL: https://issues.apache.org/jira/browse/SPARK-11683
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>            Reporter: Luis Antonio Torres
>            Priority: Minor
>
> On Spark 1.4.1, using the DataFrameWriter.insertInto() method on a Hive table 
> created in a non-default schema, like so:
> `myDF.write.insertInto("my_db.some_table")`
> throws the following exception:
> `org.apache.spark.sql.AnalysisException: no such table my_db.some_table;`
> The table exists, because I can query it with 
> `sqlContext.sql("SELECT * FROM my_db.some_table")`.
> However, when some_table is created in the default schema and the call to 
> insertInto becomes:
> `myDF.write.insertInto("some_table")`
> it works fine.


