[jira] [Created] (SPARK-28674) Spark should support SELECT ... INTO ... FROM ... WHERE as PostgreSQL does

Fri, 09 Aug 2019 00:00:07 -0700

ABHISHEK KUMAR GUPTA created SPARK-28674:
--------------------------------------------

             Summary: Spark should support select <columnname> into <table> from <table> where <condition> as PostgreSQL does
                 Key: SPARK-28674
                 URL: https://issues.apache.org/jira/browse/SPARK-28674
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 2.4.0
            Reporter: ABHISHEK KUMAR GUPTA


Spark should support select <columnname> into <table> from <table> where <condition> as PostgreSQL does:

create table dup(id int);
insert into dup values(1);
insert into dup values(2);
select id into test_dup from dup where id=1;
select * from test_dup;

Result: succeeds in PostgreSQL.

But in Spark, select id into test_dup from dup where id=1; throws a ParseException:
scala> sql("show tables").show();
+--------+---------+-----------+
|database|tableName|isTemporary|
+--------+---------+-----------+
|    func|      dup|      false|
+--------+---------+-----------+


scala> sql("select id into test_dup from dup where id=1").show()
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'test_dup' expecting <EOF>(line 1, pos 15)

== SQL ==
select id into test_dup from dup where id=1
---------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:241)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:117)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:69)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
  ... 49 elided
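Until SELECT ... INTO is parsed, the same result can be expressed with CREATE TABLE ... AS SELECT (CTAS), which Spark SQL already supports: sql("CREATE TABLE test_dup AS SELECT id FROM dup WHERE id = 1"). A minimal sketch of the equivalence, using SQLite here only because it runs without a Spark cluster (CTAS is standard SQL; table names mirror the report):

```python
import sqlite3

# In-memory database standing in for the Spark catalog in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dup(id INTEGER)")
conn.execute("INSERT INTO dup VALUES (1)")
conn.execute("INSERT INTO dup VALUES (2)")

# CTAS: the portable equivalent of PostgreSQL's SELECT ... INTO.
conn.execute("CREATE TABLE test_dup AS SELECT id FROM dup WHERE id = 1")

rows = conn.execute("SELECT * FROM test_dup").fetchall()
print(rows)  # [(1,)]
```

Both forms create test_dup from the filtered rows of dup; the feature request is only about accepting the PostgreSQL-style INTO syntax for the same operation.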

--
This message was sent by Atlassian JIRA
(v7.6.14#76016)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
