[ https://issues.apache.org/jira/browse/SPARK-20962?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Takeshi Yamamuro updated SPARK-20962:
-------------------------------------
    Description: 
Currently, we do not support column aliases for subqueries in the FROM clause:
{code}
scala> sql("SELECT * FROM (SELECT 1 AS col1, 1 AS col2) t(a, b)").show
org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input '(' expecting {<EOF>, ',', 'WHERE', 'GROUP', 'ORDER', 'HAVING', 'LIMIT', 'JOIN', 'CROSS', 'INNER', 'LEFT', 'RIGHT', 'FULL', 'NATURAL', 'LATERAL', 'WINDOW', 'UNION', 'EXCEPT', 'MINUS', 'INTERSECT', 'SORT', 'CLUSTER', 'DISTRIBUTE', 'ANTI'}(line 1, pos 45)

== SQL ==
SELECT * FROM (SELECT 1 AS col1, 1 AS col2) t(a, b)
---------------------------------------------^^^

  at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:217)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:114)
  at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
  at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:68)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
{code}
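In the meantime, the same result can be obtained by aliasing the columns inside the subquery itself; a minimal sketch (works on 2.1.1):
{code}
// Workaround: alias the columns inside the subquery instead of via t(a, b)
scala> sql("SELECT a, b FROM (SELECT 1 AS a, 1 AS b) t").show
+---+---+
|  a|  b|
+---+---+
|  1|  1|
+---+---+
{code}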
We could support this by following the FROM-clause subquery column alias syntax documented for Redshift:
http://docs.aws.amazon.com/redshift/latest/dg/r_FROM_clause30.html
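For reference, a sketch of the behavior we would expect once this is supported (the aliases in t(a, b) bind positionally to the subquery output, as in Redshift):
{code}
// Expected (not current) behavior: t(a, b) renames col1 -> a, col2 -> b
scala> sql("SELECT a, b FROM (SELECT 1 AS col1, 2 AS col2) t(a, b)").show
+---+---+
|  a|  b|
+---+---+
|  1|  2|
+---+---+
{code}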

  was:
Currently, we do not support subquery aliases; (the reproduction steps and reference link were otherwise identical to the updated description above)


> Support subquery column aliases in FROM clause
> ----------------------------------------------
>
>                 Key: SPARK-20962
>                 URL: https://issues.apache.org/jira/browse/SPARK-20962
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.1.1
>            Reporter: Takeshi Yamamuro
>             Fix For: 2.3.0
>


