[ https://issues.apache.org/jira/browse/SPARK-13342?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

neo updated SPARK-13342:
------------------------
    Priority: Critical  (was: Major)

> Cannot run INSERT statements in Spark
> -------------------------------------
>
>                 Key: SPARK-13342
>                 URL: https://issues.apache.org/jira/browse/SPARK-13342
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.5.1, 1.6.0
>            Reporter: neo
>            Priority: Critical
>
> I cannot run an INSERT statement using spark-sql. I tried with both versions
> 1.5.1 and 1.6.0 without any luck, but the same statement runs fine in Hive.
> These are the steps I took.
> 1) Launch Hive, create the table, and insert a record:
> CREATE DATABASE test;
> USE test;
> CREATE TABLE stgTable
> (
> sno string,
> total bigint
> );
> INSERT INTO TABLE stgTable VALUES ('12', 12);
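> A quick sanity check (not in the original steps, assuming the same Hive
> session) to confirm the row is there:
> SELECT * FROM stgTable;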
> 2) Launch spark-sql (1.5.1 or 1.6.0).
> 3) Try inserting a record from the shell:
> INSERT INTO TABLE stgTable SELECT 'sno2', 224 FROM stgTable LIMIT 1;
> I got this error message:
> "Invalid method name: 'alter_table_with_cascade'"
> I tried changing the Hive version inside the spark-sql shell using the SET
> command. I changed the Hive version
> from
> SET spark.sql.hive.version=1.2.1  (this is the default setting for my Spark
> installation)
> to
> SET spark.sql.hive.version=0.14.0
> but that did not help either.
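>
> A minimal sketch of a possible workaround, assuming the error comes from a
> mismatch between the metastore client Spark uses and the Hive metastore
> service: spark.sql.hive.version only reports the built-in Hive version, while
> the metastore client version is controlled by spark.sql.hive.metastore.version,
> which has to be passed when spark-sql is launched rather than changed via SET.
> Assuming a 0.14.x metastore:
>
> # hypothetical launch command; the 0.14.0 version and maven jar resolution are assumptions
> spark-sql \
>   --conf spark.sql.hive.metastore.version=0.14.0 \
>   --conf spark.sql.hive.metastore.jars=maven
>
> After relaunching, the failing INSERT above can be retried from the new session.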



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
