[ https://issues.apache.org/jira/browse/SPARK-20916?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16028694#comment-16028694 ]
Liang-Chi Hsieh commented on SPARK-20916:
-----------------------------------------

I will look into this. Thanks.

> Improve error message for unaliased subqueries in FROM clause
> -------------------------------------------------------------
>
>                 Key: SPARK-20916
>                 URL: https://issues.apache.org/jira/browse/SPARK-20916
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Josh Rosen
>
> The following query parses in branch-2.2, but doesn't parse correctly as of
> today's master:
> {code}
> SELECT x FROM (SELECT 1 AS x)
> {code}
> It still parses if you name the subquery in the FROM clause:
> {code}
> SELECT x FROM (SELECT 1 AS x) t
> {code}
> In master, this gives the following error:
> {code}
> scala> sql("""SELECT x FROM (SELECT 1 AS x)""")
> org.apache.spark.sql.catalyst.parser.ParseException:
> mismatched input 'FROM' expecting {<EOF>, 'WHERE', 'GROUP', 'ORDER',
> 'HAVING', 'LIMIT', 'LATERAL', 'WINDOW', 'UNION', 'EXCEPT', 'MINUS',
> 'INTERSECT', 'SORT', 'CLUSTER', 'DISTRIBUTE'}(line 1, pos 9)
>
> == SQL ==
> SELECT x FROM (SELECT 1 AS x)
> ---------^^^
>
>   at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:217)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:114)
>   at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:48)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:68)
>   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
>   ... 48 elided
> {code}
> It looks like this change is intentional due to SPARK-20690, but the error
> message that we give here isn't very clear. I think we should improve it so
> as not to confuse users.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
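Besides naming the derived table (the {{... (SELECT 1 AS x) t}} form shown above), a common rewrite that sidesteps the alias question is a common table expression. This is a hedged sketch, not from the ticket, using standard SQL that Spark SQL also accepts:

{code}
-- Name the subquery up front as a CTE instead of aliasing it in FROM
WITH t AS (SELECT 1 AS x)
SELECT x FROM t
{code}

Whether the parser should accept the unaliased inline form, or merely explain the alias requirement better, is exactly what this issue is about.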