[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user yhuai commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202562672 https://github.com/apache/spark/pull/12010 --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user yhuai commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202559495 let me try to fix it.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user yhuai commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202558809 looks like we are missing antlr4 maven plugin.
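For context on the fix being discussed: ANTLR4 grammars are compiled into parser/visitor sources by the `antlr4-maven-plugin` during `generate-sources`. A typical configuration looks like the sketch below; the version and source directory are illustrative, not the exact values used in Spark's pom.xml:

```xml
<plugin>
  <groupId>org.antlr</groupId>
  <artifactId>antlr4-maven-plugin</artifactId>
  <version>4.5.2</version>
  <executions>
    <execution>
      <goals>
        <goal>antlr4</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <!-- Generate the *BaseVisitor classes that AstBuilder extends. -->
    <visitor>true</visitor>
    <sourceDirectory>src/main/antlr4</sourceDirectory>
  </configuration>
</plugin>
```

Without this plugin registered, `SqlBase.g4` is never compiled, so generated types such as `SqlBaseParser` and `SqlBaseBaseVisitor` are missing when the Scala sources build.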
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user zsxwing commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202555774 Seems maven stuff is not set correctly?
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user yhuai commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202555657 https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Compile/job/spark-master-compile-maven-hadoop-2.3/956/consoleFull
```
[error] /home/jenkins/workspace/spark-master-compile-maven-hadoop-2.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala:31: object SqlBaseParser is not a member of package org.apache.spark.sql.catalyst.parser.ng
[error] import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
[error]        ^
[error] /home/jenkins/workspace/spark-master-compile-maven-hadoop-2.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala:42: not found: type SqlBaseBaseVisitor
[error] class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with Logging {
[error]                          ^
[error] /home/jenkins/workspace/spark-master-compile-maven-hadoop-2.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala:46: type mismatch;
[error]  found   : org.apache.spark.sql.catalyst.parser.ng.AstBuilder
[error]  required: org.antlr.v4.runtime.tree.ParseTreeVisitor[_]
[error] ctx.accept(this).asInstanceOf[T]
[error]            ^
[error] /home/jenkins/workspace/spark-master-compile-maven-hadoop-2.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala:49: not found: type SingleStatementContext
[error] override def visitSingleStatement(ctx: SingleStatementContext): LogicalPlan = withOrigin(ctx) {
[error]                                        ^
[error] /home/jenkins/workspace/spark-master-compile-maven-hadoop-2.3/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala:53: not found: type SingleExpressionContext
[error] override def visitSingleExpression(ctx: SingleExpressionContext): Expression = withOrigin(ctx) {
[error]                                        ^
```
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user yhuai commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202555073 @hvanhovell Seems it breaks the build...
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/11557
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user rxin commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202547496 Merging this in master. Thanks.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202495324 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54320/
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202495312 Merged build finished. Test PASSed.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202494895 **[Test build #54320 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54320/consoleFull)** for PR 11557 at commit [`6f1c535`](https://github.com/apache/spark/commit/6f1c535162397f01acf0405bdc80b8c4c141fc64).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202481762 Merged build finished. Test FAILed.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202481765 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54318/
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202481169 **[Test build #54318 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54318/consoleFull)** for PR 11557 at commit [`4200ad6`](https://github.com/apache/spark/commit/4200ad670d6c6d137f2f43470d744b1d9087cbed).
* This patch **fails Spark unit tests**.
* This patch merges cleanly.
* This patch adds the following public classes _(experimental)_:
  * `implicit class EnhancedLogicalPlan(val plan: LogicalPlan) extends AnyVal`
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202429698 **[Test build #54320 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54320/consoleFull)** for PR 11557 at commit [`6f1c535`](https://github.com/apache/spark/commit/6f1c535162397f01acf0405bdc80b8c4c141fc64).
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-202420516 **[Test build #54318 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54318/consoleFull)** for PR 11557 at commit [`4200ad6`](https://github.com/apache/spark/commit/4200ad670d6c6d137f2f43470d744b1d9087cbed).
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-201084701 LGTM except we parse `foo(*)` into `foo(1)`.
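The `foo(*)` remark concerns the star-argument rewrite: `count(*)` is conventionally rewritten to `count(1)`, but that rewrite should not leak to arbitrary functions. A minimal, self-contained Scala sketch of a rewrite restricted to COUNT (all names here are hypothetical, not Spark's actual `AstBuilder` code):

```scala
// Hypothetical miniature expression AST, for illustration only.
sealed trait Expr
case object Star extends Expr
case class Literal(v: Int) extends Expr
case class Func(name: String, args: Seq[Expr]) extends Expr

object StarRewrite {
  // Only COUNT(*) is rewritten to COUNT(1); any other function keeps the
  // star so later analysis can expand or reject it.
  def rewrite(f: Func): Func = f match {
    case Func(name, Seq(Star)) if name.equalsIgnoreCase("count") =>
      Func(name, Seq(Literal(1)))
    case other => other
  }
}
```

With the guard on the function name, `foo(*)` survives parsing unchanged instead of silently becoming `foo(1)`.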
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57354160 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala --- @@ -0,0 +1,494 @@
```
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.analysis.{UnresolvedAttribute, _}
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.plans.PlanTest
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+
+/**
+ * Test basic expression parsing. If a type of expression is supported it should be tested here.
+ *
+ * Please note that some of the expressions test don't have to be sound expressions, only their
+ * structure needs to be valid. Unsound expressions should be caught by the Analyzer or
+ * CheckAnalysis classes.
+ */
+class ExpressionParserSuite extends PlanTest {
+  import CatalystSqlParser._
+  import org.apache.spark.sql.catalyst.dsl.expressions._
+  import org.apache.spark.sql.catalyst.dsl.plans._
+
+  def assertEqual(sqlCommand: String, e: Expression): Unit = {
+    compareExpressions(parseExpression(sqlCommand), e)
+  }
+
+  def intercept(sqlCommand: String, messages: String*): Unit = {
+    val e = intercept[ParseException](parseExpression(sqlCommand))
+    messages.foreach { message =>
+      assert(e.message.contains(message))
+    }
+  }
+
+  test("star expressions") {
+    // Global Star
+    assertEqual("*", UnresolvedStar(None))
+
+    // Targeted Star
+    assertEqual("a.b.*", UnresolvedStar(Option(Seq("a", "b"))))
+  }
+
+  // NamedExpression (Alias/Multialias)
+  test("named expressions") {
+    // No Alias
+    val r0 = 'a
+    assertEqual("a", r0)
+
+    // Single Alias.
+    val r1 = 'a as "b"
+    assertEqual("a as b", r1)
+    assertEqual("a b", r1)
+
+    // Multi-Alias
+    assertEqual("a as (b, c)", MultiAlias('a, Seq("b", "c")))
+    assertEqual("a() (b, c)", MultiAlias('a.function(), Seq("b", "c")))
+
+    // Numeric literals without a space between the literal qualifier and the alias, should not be
+    // interpreted as such. An unresolved reference should be returned instead.
+    // TODO add the JIRA-ticket number.
+    assertEqual("1SL", Symbol("1SL"))
+
+    // Aliased star is allowed.
+    assertEqual("a.* b", UnresolvedStar(Option(Seq("a"))) as 'b)
+  }
+
+  test("binary logical expressions") {
+    // And
+    assertEqual("a and b", 'a && 'b)
+
+    // Or
+    assertEqual("a or b", 'a || 'b)
+
+    // Combination And/Or check precedence
+    assertEqual("a and b or c and d", ('a && 'b) || ('c && 'd))
+    assertEqual("a or b or c and d", 'a || 'b || ('c && 'd))
+
+    // Multiple AND/OR get converted into a balanced tree
+    assertEqual("a or b or c or d or e or f", (('a || 'b) || 'c) || (('d || 'e) || 'f))
+    assertEqual("a and b and c and d and e and f", (('a && 'b) && 'c) && (('d && 'e) && 'f))
+  }
+
+  test("long binary logical expressions") {
+    def testVeryBinaryExpression(op: String, clazz: Class[_]): Unit = {
+      val sql = (1 to 1000).map(x => s"$x == $x").mkString(op)
+      val e = parseExpression(sql)
+      assert(e.collect { case _: EqualTo => true }.size === 1000)
+      assert(e.collect { case x if clazz.isInstance(x) => true }.size === 999)
+    }
+    testVeryBinaryExpression(" AND ", classOf[And])
+    testVeryBinaryExpression(" OR ", classOf[Or])
+  }
+
+  test("not expressions") {
+    assertEqual("not a", !'a)
+    assertEqual("!a", !'a)
+    assertEqual("not true > true", Not(GreaterThan(true, true)))
```
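The "balanced tree" tests in the quoted suite guard against deeply left-nested And/Or chains, which can overflow the stack when traversed recursively. A self-contained sketch (hypothetical names, not Spark code) of folding a non-empty operand list into a balanced tree, keeping depth logarithmic in the number of operands:

```scala
// Hypothetical miniature boolean-expression AST.
sealed trait Expr
case class Leaf(name: String) extends Expr
case class And(left: Expr, right: Expr) extends Expr

object BalancedFold {
  // Fold operands into a balanced And-tree. Assumes a non-empty sequence;
  // each recursive call halves the operand list, so depth is O(log n).
  def balancedAnd(operands: Seq[Expr]): Expr = operands match {
    case Seq(single) => single
    case _ =>
      val (l, r) = operands.splitAt(operands.length / 2)
      And(balancedAnd(l), balancedAnd(r))
  }

  // Depth of the resulting tree, counting nodes along the longest path.
  def depth(e: Expr): Int = e match {
    case Leaf(_)   => 1
    case And(l, r) => 1 + math.max(depth(l), depth(r))
  }
}
```

A left fold over the same 1000 operands would produce a tree of depth 1000; the balanced fold stays around depth 11, which is why the suite asserts on 1000-term chains.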
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57281611 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala --- @@ -0,0 +1,408 @@
```
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import org.apache.spark.sql.Row
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class PlanParserSuite extends PlanTest {
+  import CatalystSqlParser._
+  import org.apache.spark.sql.catalyst.dsl.expressions._
+  import org.apache.spark.sql.catalyst.dsl.plans._
+
+  def assertEqual(sqlCommand: String, plan: LogicalPlan): Unit = {
+    comparePlans(parsePlan(sqlCommand), plan)
+  }
+
+  def intercept(sqlCommand: String, messages: String*): Unit = {
+    val e = intercept[ParseException](parsePlan(sqlCommand))
+    messages.foreach { message =>
+      assert(e.message.contains(message))
+    }
+  }
+
+  test("case insensitive") {
+    val plan = table("a").select(star())
+    assertEqual("sELEct * FroM a", plan)
+    assertEqual("select * fRoM a", plan)
+    assertEqual("SELECT * FROM a", plan)
+  }
+
+  test("show functions") {
+    assertEqual("show functions", ShowFunctions(None, None))
+    assertEqual("show functions foo", ShowFunctions(None, Some("foo")))
+    assertEqual("show functions foo.bar", ShowFunctions(Some("foo"), Some("bar")))
+    assertEqual("show functions 'foo.*'", ShowFunctions(None, Some("foo\\.*")))
+    intercept("show functions foo.bar.baz", "SHOW FUNCTIONS unsupported name")
+  }
+
+  test("describe function") {
+    assertEqual("describe function bar", DescribeFunction("bar", isExtended = false))
+    assertEqual("describe function extended bar", DescribeFunction("bar", isExtended = true))
+    assertEqual("describe function foo.bar", DescribeFunction("foo.bar", isExtended = false))
+    assertEqual("describe function extended f.bar", DescribeFunction("f.bar", isExtended = true))
+  }
+
+  test("set operations") {
+    val a = table("a").select(star())
+    val b = table("b").select(star())
+
+    assertEqual("select * from a union select * from b", Distinct(a.unionAll(b)))
+    assertEqual("select * from a union distinct select * from b", Distinct(a.unionAll(b)))
+    assertEqual("select * from a union all select * from b", a.unionAll(b))
+    assertEqual("select * from a except select * from b", a.except(b))
+    intercept("select * from a except all select * from b", "EXCEPT ALL is not supported.")
+    assertEqual("select * from a except distinct select * from b", a.except(b))
+    assertEqual("select * from a intersect select * from b", a.intersect(b))
+    intercept("select * from a intersect all select * from b", "INTERSECT ALL is not supported.")
+    assertEqual("select * from a intersect distinct select * from b", a.intersect(b))
+  }
+
+  test("common table expressions") {
+    def cte(plan: LogicalPlan, namedPlans: (String, LogicalPlan)*): With = {
+      val ctes = namedPlans.map {
+        case (name, cte) =>
+          name -> SubqueryAlias(name, cte)
+      }.toMap
+      With(plan, ctes)
+    }
+    assertEqual(
+      "with cte1 as (select * from a) select * from cte1",
+      cte(table("cte1").select(star()), "cte1" -> table("a").select(star())))
+    assertEqual(
+      "with cte1 (select 1) select * from cte1",
+      cte(table("cte1").select(star()), "cte1" -> OneRowRelation.select(1)))
+    assertEqual(
+      "with cte1 (select 1), cte2 as (select * from cte1) select * from cte2",
+      cte(table("cte2").select(star()),
+        "cte1" -> OneRowRelation.select(1),
+        "cte2" ->
```
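The `cte` helper in the quoted suite builds a `With(plan, ctes)` node mapping CTE names to `SubqueryAlias` plans. A self-contained sketch (hypothetical names, not Spark's actual analyzer) of how such a name-to-plan map can later substitute CTE references in the main query:

```scala
// Hypothetical miniature logical-plan AST.
sealed trait Plan
case class TableRef(name: String) extends Plan
case class Select(child: Plan) extends Plan
case class With(child: Plan, ctes: Map[String, Plan]) extends Plan

object CteSubstitution {
  // Replace each table reference that names a registered CTE with the
  // CTE's plan; other references are left for normal catalog resolution.
  def substitute(plan: Plan): Plan = plan match {
    case With(child, ctes) =>
      def subst(p: Plan): Plan = p match {
        case TableRef(n) if ctes.contains(n) => ctes(n)
        case Select(c)                       => Select(subst(c))
        case other                           => other
      }
      subst(child)
    case other => other
  }
}
```

After substitution the `With` wrapper disappears and the main query reads directly from the inlined CTE plans, which is why the tests above compare the parsed plan against a `With` node built from the same map.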
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57281018 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala --- @@ -0,0 +1,408 @@ (quoted diff identical to the PlanParserSuite hunk above)
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279846 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala --- @@ -0,0 +1,408 @@ (quoted diff identical to the PlanParserSuite hunk above)
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279658 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279657 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279440 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala --- @@ -0,0 +1,494 @@

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.spark.sql.catalyst.parser.ng

import java.sql.{Date, Timestamp}

import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.analysis.{UnresolvedAttribute, _}
import org.apache.spark.sql.catalyst.expressions._
import org.apache.spark.sql.catalyst.plans.PlanTest
import org.apache.spark.sql.types._
import org.apache.spark.unsafe.types.CalendarInterval

/**
 * Test basic expression parsing. If a type of expression is supported it should be tested here.
 *
 * Please note that some of the expressions tested don't have to be sound expressions; only their
 * structure needs to be valid. Unsound expressions should be caught by the Analyzer or
 * CheckAnalysis classes.
 */
class ExpressionParserSuite extends PlanTest {
  import CatalystSqlParser._
  import org.apache.spark.sql.catalyst.dsl.expressions._
  import org.apache.spark.sql.catalyst.dsl.plans._

  def assertEqual(sqlCommand: String, e: Expression): Unit = {
    compareExpressions(parseExpression(sqlCommand), e)
  }

  def intercept(sqlCommand: String, messages: String*): Unit = {
    val e = intercept[ParseException](parseExpression(sqlCommand))
    messages.foreach { message =>
      assert(e.message.contains(message))
    }
  }

  test("star expressions") {
    // Global Star
    assertEqual("*", UnresolvedStar(None))

    // Targeted Star
    assertEqual("a.b.*", UnresolvedStar(Option(Seq("a", "b"))))
  }

  // NamedExpression (Alias/Multialias)
  test("named expressions") {
    // No Alias
    val r0 = 'a
    assertEqual("a", r0)

    // Single Alias.
    val r1 = 'a as "b"
    assertEqual("a as b", r1)
    assertEqual("a b", r1)

    // Multi-Alias
    assertEqual("a as (b, c)", MultiAlias('a, Seq("b", "c")))
    assertEqual("a() (b, c)", MultiAlias('a.function(), Seq("b", "c")))

    // Numeric literals without a space between the literal qualifier and the alias should not be
    // interpreted as such. An unresolved reference should be returned instead.
    // TODO add the JIRA-ticket number.
    assertEqual("1SL", Symbol("1SL"))

    // Aliased star is allowed.
    assertEqual("a.* b", UnresolvedStar(Option(Seq("a"))) as 'b)
  }

  test("binary logical expressions") {
    // And
    assertEqual("a and b", 'a && 'b)

    // Or
    assertEqual("a or b", 'a || 'b)

    // Combination And/Or check precedence
    assertEqual("a and b or c and d", ('a && 'b) || ('c && 'd))
    assertEqual("a or b or c and d", 'a || 'b || ('c && 'd))

    // Multiple AND/OR get converted into a balanced tree
    assertEqual("a or b or c or d or e or f", (('a || 'b) || 'c) || (('d || 'e) || 'f))
    assertEqual("a and b and c and d and e and f", (('a && 'b) && 'c) && (('d && 'e) && 'f))
  }

  test("long binary logical expressions") {
    def testVeryBinaryExpression(op: String, clazz: Class[_]): Unit = {
      val sql = (1 to 1000).map(x => s"$x == $x").mkString(op)
      val e = parseExpression(sql)
      assert(e.collect { case _: EqualTo => true }.size === 1000)
      assert(e.collect { case x if clazz.isInstance(x) => true }.size === 999)
    }
    testVeryBinaryExpression(" AND ", classOf[And])
    testVeryBinaryExpression(" OR ", classOf[Or])
  }

  test("not expressions") {
    assertEqual("not a", !'a)
    assertEqual("!a", !'a)
    assertEqual("not true > true", Not(GreaterThan(true, true)))
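The "long binary logical expressions" test checks that a chain of 1000 equality predicates yields 999 And/Or nodes arranged as a balanced tree rather than a left-leaning one, keeping tree depth logarithmic so recursive traversal doesn't blow the stack. A sketch of such a half-splitting reduction in Python (illustrative only; Spark's parser does this in Scala, and the names below are made up):

```python
# Illustrative sketch (not Spark's actual implementation) of reducing a flat
# operand list into a balanced binary tree: n leaves produce n - 1 operator
# nodes with O(log n) depth. Operator nodes are 3-tuples (op, left, right).

def balanced_reduce(op, xs, lo=0, hi=None):
    """Recursively split xs[lo..hi] in half and combine the halves under op."""
    if hi is None:
        hi = len(xs) - 1
    if lo == hi:
        return xs[lo]
    mid = lo + (hi - lo) // 2
    return (op, balanced_reduce(op, xs, lo, mid),
                balanced_reduce(op, xs, mid + 1, hi))

def count_ops(tree, op):
    """Count operator nodes labeled `op` (leaves contribute nothing)."""
    if isinstance(tree, tuple) and tree[0] == op:
        return 1 + count_ops(tree[1], op) + count_ops(tree[2], op)
    return 0

def op_depth(tree, op):
    """Depth of the tree counted in `op` nodes."""
    if isinstance(tree, tuple) and tree[0] == op:
        return 1 + max(op_depth(tree[1], op), op_depth(tree[2], op))
    return 0
```

For six operands this produces the same shape the suite asserts: `balanced_reduce("OR", [a..f])` gives `((a OR b) OR c) OR ((d OR e) OR f)`, and 1000 operands yield 999 operator nodes at depth 10 instead of a 999-deep left spine.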
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279420 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279436 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279310 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
Github user liancheng commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57279151 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57279097

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
@@ -0,0 +1,494 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.analysis.{UnresolvedAttribute, _}
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.plans.PlanTest
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+
+/**
+ * Test basic expression parsing. If a type of expression is supported it should be tested here.
+ *
+ * Please note that some of the expressions tested here don't have to be sound expressions; only
+ * their structure needs to be valid. Unsound expressions should be caught by the Analyzer or
+ * CheckAnalysis classes.
+ */
+class ExpressionParserSuite extends PlanTest {
+  import CatalystSqlParser._
+  import org.apache.spark.sql.catalyst.dsl.expressions._
+  import org.apache.spark.sql.catalyst.dsl.plans._
+
+  def assertEqual(sqlCommand: String, e: Expression): Unit = {
+    compareExpressions(parseExpression(sqlCommand), e)
+  }
+
+  def intercept(sqlCommand: String, messages: String*): Unit = {
+    val e = intercept[ParseException](parseExpression(sqlCommand))
+    messages.foreach { message =>
+      assert(e.message.contains(message))
+    }
+  }
+
+  test("star expressions") {
+    // Global Star
+    assertEqual("*", UnresolvedStar(None))
+
+    // Targeted Star
+    assertEqual("a.b.*", UnresolvedStar(Option(Seq("a", "b"))))
+  }
+
+  // NamedExpression (Alias/Multialias)
+  test("named expressions") {
+    // No Alias
+    val r0 = 'a
+    assertEqual("a", r0)
+
+    // Single Alias.
+    val r1 = 'a as "b"
+    assertEqual("a as b", r1)
+    assertEqual("a b", r1)
+
+    // Multi-Alias
+    assertEqual("a as (b, c)", MultiAlias('a, Seq("b", "c")))
+    assertEqual("a() (b, c)", MultiAlias('a.function(), Seq("b", "c")))
+
+    // Numeric literals without a space between the literal qualifier and the alias should not
+    // be interpreted as such. An unresolved reference should be returned instead.
+    // TODO add the JIRA-ticket number.
+    assertEqual("1SL", Symbol("1SL"))
+
+    // Aliased star is allowed.
+    assertEqual("a.* b", UnresolvedStar(Option(Seq("a"))) as 'b)
+  }
+
+  test("binary logical expressions") {
+    // And
+    assertEqual("a and b", 'a && 'b)
+
+    // Or
+    assertEqual("a or b", 'a || 'b)
+
+    // Combination And/Or check precedence
+    assertEqual("a and b or c and d", ('a && 'b) || ('c && 'd))
+    assertEqual("a or b or c and d", 'a || 'b || ('c && 'd))
+
+    // Multiple AND/OR get converted into a balanced tree
+    assertEqual("a or b or c or d or e or f", (('a || 'b) || 'c) || (('d || 'e) || 'f))
+    assertEqual("a and b and c and d and e and f", (('a && 'b) && 'c) && (('d && 'e) && 'f))
+  }
+
+  test("long binary logical expressions") {
+    def testVeryBinaryExpression(op: String, clazz: Class[_]): Unit = {
+      val sql = (1 to 1000).map(x => s"$x == $x").mkString(op)
+      val e = parseExpression(sql)
+      assert(e.collect { case _: EqualTo => true }.size === 1000)
+      assert(e.collect { case x if clazz.isInstance(x) => true }.size === 999)
+    }
+    testVeryBinaryExpression(" AND ", classOf[And])
+    testVeryBinaryExpression(" OR ", classOf[Or])
+  }
+
+  test("not expressions") {
+    assertEqual("not a", !'a)
+    assertEqual("!a", !'a)
+    assertEqual("not true > true", Not(GreaterThan(true, true)))
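The "balanced tree" comment in the quoted test is the interesting property: a naive left fold over `a or b or c or ...` yields a tree whose depth is linear in the number of operands, which the 1000-operand test above would punish. A minimal standalone sketch of one way to get a balanced shape (pairwise reduction; this is an illustration only, not the actual AstBuilder algorithm, and `reduceToTree`/`depth` are made-up names):

```scala
// Illustration only: combine operands pairwise so the resulting tree has
// logarithmic depth, the property the "long binary logical expressions"
// test (1000 operands) depends on. Not Spark's actual reduction code.
sealed trait Expr
case class Leaf(name: String) extends Expr
case class Or(left: Expr, right: Expr) extends Expr

// Pairwise reduction: each pass halves the operand list and adds at most
// one level of depth, so n operands give depth O(log n).
def reduceToTree(exprs: Seq[Expr]): Expr =
  if (exprs.size == 1) exprs.head
  else reduceToTree(exprs.grouped(2).map {
    case Seq(l, r) => Or(l, r)
    case Seq(only) => only // odd element carries over unchanged
  }.toSeq)

def depth(e: Expr): Int = e match {
  case Leaf(_)  => 1
  case Or(l, r) => 1 + math.max(depth(l), depth(r))
}

val tree = reduceToTree((1 to 1000).map(i => Leaf(i.toString)))
println(depth(tree)) // logarithmic (11) rather than linear (1000)
```

A left fold (`exprs.reduceLeft(Or(_, _))`) would make `depth` return 1000 here, which is why the suite asserts the balanced shape explicitly.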
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57279041

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -0,0 +1,250 @@
+/*
+ * (standard ASF license header)
+ */
+package org.apache.spark.sql.execution
+
+import scala.collection.JavaConverters._
+
+import org.antlr.v4.runtime.misc.Interval
+
+import org.apache.spark.sql.SaveMode
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.parser.ng.{AbstractSqlParser, AstBuilder}
+import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OneRowRelation}
+import org.apache.spark.sql.execution.command._
+import org.apache.spark.sql.execution.datasources._
+
+/**
+ * Concrete parser for Spark SQL statements.
+ */
+object SparkSqlParser extends AbstractSqlParser {
+  val astBuilder = new SparkSqlAstBuilder
+}
+
+/**
+ * Builder that converts an ANTLR ParseTree into a LogicalPlan/Expression/TableIdentifier.
+ */
+class SparkSqlAstBuilder extends AstBuilder {
+  import AstBuilder._
+  import org.apache.spark.sql.catalyst.parser.ParseUtils._
+
+  /**
+   * Create a [[SetCommand]] logical plan.
+   *
+   * Note that everything after the SET keyword is assumed to be part of the key-value pair.
+   * The split between key and value is made by searching for the first `=` character in the
+   * raw string.
+   */
+  override def visitSetConfiguration(ctx: SetConfigurationContext): LogicalPlan = withOrigin(ctx) {
--- End diff --

If we don't support a piece of grammar in the parser, the corresponding parse rule is not implemented and returns `null`; this is then caught by the `ParseDriver`, which throws a `ParseException`.

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
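hvanhovell's point above — an unimplemented parse rule returns `null`, and the driver turns that into a `ParseException` — can be sketched in isolation. All names below (`ToyParseException`, `ToyAstBuilder`, `ToyParseDriver`) are illustrative stand-ins, not Spark's actual API:

```scala
// Toy model of the null-to-ParseException contract described above.
class ToyParseException(message: String) extends RuntimeException(message)

class ToyAstBuilder {
  // A visitor with no implementation for a statement kind returns null
  // instead of throwing from deep inside the tree walk.
  def visitStatement(sql: String): AnyRef =
    if (sql.trim.toLowerCase.startsWith("select")) s"LogicalPlan[$sql]"
    else null // unsupported statement: no visitor rule implemented
}

object ToyParseDriver {
  private val astBuilder = new ToyAstBuilder

  // The driver is the single place that checks for null and reports a
  // uniform parse error, so individual visitor rules stay simple.
  def parsePlan(sql: String): AnyRef =
    astBuilder.visitStatement(sql) match {
      case null => throw new ToyParseException(s"Unsupported SQL statement: $sql")
      case plan => plan
    }
}
```

Here `ToyParseDriver.parsePlan("drop table t")` throws while `parsePlan("select 1")` yields a plan, mirroring how the suites' `intercept` helper expects a `ParseException` for unsupported syntax.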
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57279004

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
@@ -0,0 +1,408 @@
+/*
+ * (standard ASF license header)
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import org.apache.spark.sql.Row
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class PlanParserSuite extends PlanTest {
+  import CatalystSqlParser._
+  import org.apache.spark.sql.catalyst.dsl.expressions._
+  import org.apache.spark.sql.catalyst.dsl.plans._
+
+  def assertEqual(sqlCommand: String, plan: LogicalPlan): Unit = {
+    comparePlans(parsePlan(sqlCommand), plan)
+  }
+
+  def intercept(sqlCommand: String, messages: String*): Unit = {
+    val e = intercept[ParseException](parsePlan(sqlCommand))
+    messages.foreach { message =>
+      assert(e.message.contains(message))
+    }
+  }
+
+  test("case insensitive") {
+    val plan = table("a").select(star())
+    assertEqual("sELEct * FroM a", plan)
+    assertEqual("select * fRoM a", plan)
+    assertEqual("SELECT * FROM a", plan)
+  }
+
+  test("show functions") {
+    assertEqual("show functions", ShowFunctions(None, None))
+    assertEqual("show functions foo", ShowFunctions(None, Some("foo")))
+    assertEqual("show functions foo.bar", ShowFunctions(Some("foo"), Some("bar")))
+    assertEqual("show functions 'foo.*'", ShowFunctions(None, Some("foo\\.*")))
+    intercept("show functions foo.bar.baz", "SHOW FUNCTIONS unsupported name")
+  }
+
+  test("describe function") {
+    assertEqual("describe function bar", DescribeFunction("bar", isExtended = false))
+    assertEqual("describe function extended bar", DescribeFunction("bar", isExtended = true))
+    assertEqual("describe function foo.bar", DescribeFunction("foo.bar", isExtended = false))
+    assertEqual("describe function extended f.bar", DescribeFunction("f.bar", isExtended = true))
+  }
+
+  test("set operations") {
+    val a = table("a").select(star())
+    val b = table("b").select(star())
+
+    assertEqual("select * from a union select * from b", Distinct(a.unionAll(b)))
+    assertEqual("select * from a union distinct select * from b", Distinct(a.unionAll(b)))
+    assertEqual("select * from a union all select * from b", a.unionAll(b))
+    assertEqual("select * from a except select * from b", a.except(b))
+    intercept("select * from a except all select * from b", "EXCEPT ALL is not supported.")
+    assertEqual("select * from a except distinct select * from b", a.except(b))
+    assertEqual("select * from a intersect select * from b", a.intersect(b))
+    intercept("select * from a intersect all select * from b", "INTERSECT ALL is not supported.")
+    assertEqual("select * from a intersect distinct select * from b", a.intersect(b))
+  }
+
+  test("common table expressions") {
+    def cte(plan: LogicalPlan, namedPlans: (String, LogicalPlan)*): With = {
+      val ctes = namedPlans.map {
+        case (name, cte) =>
+          name -> SubqueryAlias(name, cte)
+      }.toMap
+      With(plan, ctes)
+    }
+    assertEqual(
+      "with cte1 as (select * from a) select * from cte1",
+      cte(table("cte1").select(star()), "cte1" -> table("a").select(star())))
+    assertEqual(
+      "with cte1 (select 1) select * from cte1",
+      cte(table("cte1").select(star()), "cte1" -> OneRowRelation.select(1)))
+    assertEqual(
+      "with cte1 (select 1), cte2 as (select * from cte1) select * from cte2",
+      cte(table("cte2").select(star()),
+        "cte1" -> OneRowRelation.select(1),
+        "cte2" ->
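The last assertion in the CTE test (cte2 defined in terms of cte1) works because the named plans are registered in order, so each definition can see the ones declared before it. A toy, string-level sketch of that sequential visibility (`resolveCtes` is a made-up helper; Catalyst resolves plans, not strings):

```scala
// Toy sequential CTE resolution: fold left over the definitions so each
// body may reference any earlier name, mirroring
// "with cte1 (select 1), cte2 as (select * from cte1) ...".
// Purely illustrative string substitution, not Catalyst's logic.
def resolveCtes(defs: Seq[(String, String)]): Map[String, String] =
  defs.foldLeft(Map.empty[String, String]) { case (resolved, (name, body)) =>
    // Inline every previously resolved CTE that this body mentions.
    val expanded = resolved.foldLeft(body) { case (b, (n, d)) =>
      b.replace(n, s"($d)")
    }
    resolved + (name -> expanded)
  }

val ctes = resolveCtes(Seq(
  "cte1" -> "select 1",
  "cte2" -> "select * from cte1"))
println(ctes("cte2")) // select * from (select 1)
```

Reversing the two definitions would leave `cte1` unresolved in `cte2`'s body, which is the behavior a left-to-right WITH clause implies.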
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57278951

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57278901

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57278832

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57272561

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11557#discussion_r57272283

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57272248 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57271935

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
@@ -0,0 +1,494 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.analysis.{UnresolvedAttribute, _}
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.plans.PlanTest
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+
+/**
+ * Test basic expression parsing. If a type of expression is supported it should be tested here.
+ *
+ * Please note that some of the expressions tested here don't have to be sound expressions; only
+ * their structure needs to be valid. Unsound expressions should be caught by the Analyzer or
+ * CheckAnalysis classes.
+ */
+class ExpressionParserSuite extends PlanTest {
+  import CatalystSqlParser._
+  import org.apache.spark.sql.catalyst.dsl.expressions._
+  import org.apache.spark.sql.catalyst.dsl.plans._
+
+  def assertEqual(sqlCommand: String, e: Expression): Unit = {
+    compareExpressions(parseExpression(sqlCommand), e)
+  }
+
+  def intercept(sqlCommand: String, messages: String*): Unit = {
+    val e = intercept[ParseException](parseExpression(sqlCommand))
+    messages.foreach { message =>
+      assert(e.message.contains(message))
+    }
+  }
+
+  test("star expressions") {
+    // Global Star
+    assertEqual("*", UnresolvedStar(None))
+
+    // Targeted Star
+    assertEqual("a.b.*", UnresolvedStar(Option(Seq("a", "b"))))
+  }
+
+  // NamedExpression (Alias/Multialias)
+  test("named expressions") {
+    // No Alias
+    val r0 = 'a
+    assertEqual("a", r0)
+
+    // Single Alias.
+    val r1 = 'a as "b"
+    assertEqual("a as b", r1)
+    assertEqual("a b", r1)
+
+    // Multi-Alias
+    assertEqual("a as (b, c)", MultiAlias('a, Seq("b", "c")))
+    assertEqual("a() (b, c)", MultiAlias('a.function(), Seq("b", "c")))
+
+    // Numeric literals without a space between the literal qualifier and the alias should not be
+    // interpreted as such. An unresolved reference should be returned instead.
+    // TODO add the JIRA-ticket number.
+    assertEqual("1SL", Symbol("1SL"))
+
+    // Aliased star is allowed.
+    assertEqual("a.* b", UnresolvedStar(Option(Seq("a"))) as 'b)
+  }
+
+  test("binary logical expressions") {
+    // And
+    assertEqual("a and b", 'a && 'b)
+
+    // Or
+    assertEqual("a or b", 'a || 'b)
+
+    // Combination And/Or check precedence
+    assertEqual("a and b or c and d", ('a && 'b) || ('c && 'd))
+    assertEqual("a or b or c and d", 'a || 'b || ('c && 'd))
+
+    // Multiple AND/OR get converted into a balanced tree
+    assertEqual("a or b or c or d or e or f", (('a || 'b) || 'c) || (('d || 'e) || 'f))
+    assertEqual("a and b and c and d and e and f", (('a && 'b) && 'c) && (('d && 'e) && 'f))
+  }
+
+  test("long binary logical expressions") {
+    def testVeryBinaryExpression(op: String, clazz: Class[_]): Unit = {
+      val sql = (1 to 1000).map(x => s"$x == $x").mkString(op)
+      val e = parseExpression(sql)
+      assert(e.collect { case _: EqualTo => true }.size === 1000)
+      assert(e.collect { case x if clazz.isInstance(x) => true }.size === 999)
+    }
+    testVeryBinaryExpression(" AND ", classOf[And])
+    testVeryBinaryExpression(" OR ", classOf[Or])
+  }
+
+  test("not expressions") {
+    assertEqual("not a", !'a)
+    assertEqual("!a", !'a)
+    assertEqual("not true > true", Not(GreaterThan(true, true)))
+  }
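The "long binary logical expressions" test above asserts that a chain of 1000 comparisons joined by AND/OR parses into exactly 999 operator nodes arranged as a balanced tree, which keeps the tree shallow enough that later recursive traversals don't overflow the stack. A minimal sketch of such a balanced reduction, using hypothetical `Leaf`/`Node` stand-ins rather than Spark's actual AstBuilder code:

```scala
// Stand-in expression ADT: Node plays the role of And/Or, Leaf the role of a comparison.
sealed trait Expr
case class Leaf(n: Int) extends Expr
case class Node(left: Expr, right: Expr) extends Expr

object BalancedReduce {
  // Split the operand list in half and recurse, so the resulting tree has O(log n)
  // depth instead of the O(n) depth of a naive left-fold over the operands.
  def reduce(xs: Seq[Expr])(combine: (Expr, Expr) => Expr): Expr = {
    require(xs.nonEmpty, "cannot reduce an empty operand list")
    if (xs.size == 1) xs.head
    else {
      val (l, r) = xs.splitAt(xs.size / 2)
      combine(reduce(l)(combine), reduce(r)(combine))
    }
  }

  def depth(e: Expr): Int = e match {
    case Leaf(_)    => 1
    case Node(l, r) => 1 + math.max(depth(l), depth(r))
  }

  def nodeCount(e: Expr): Int = e match {
    case Leaf(_)    => 0
    case Node(l, r) => 1 + nodeCount(l) + nodeCount(r)
  }
}
```

Reducing 1000 leaves this way always produces 999 internal nodes (one per combine), but at depth about log2(1000) ≈ 10 rather than the depth-1000 chain a left-associative parse would build — the same 999-node shape the test checks for.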
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57271566 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57271266 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57271242 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/ExpressionParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57269652

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -0,0 +1,250 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.execution
+
+import scala.collection.JavaConverters._
+
+import org.antlr.v4.runtime.misc.Interval
+
+import org.apache.spark.sql.SaveMode
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.parser.ng.{AbstractSqlParser, AstBuilder}
+import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OneRowRelation}
+import org.apache.spark.sql.execution.command._
+import org.apache.spark.sql.execution.datasources._
+
+/**
+ * Concrete parser for Spark SQL statements.
+ */
+object SparkSqlParser extends AbstractSqlParser {
+  val astBuilder = new SparkSqlAstBuilder
+}
+
+/**
+ * Builder that converts an ANTLR ParseTree into a LogicalPlan/Expression/TableIdentifier.
+ */
+class SparkSqlAstBuilder extends AstBuilder {
+  import AstBuilder._
+  import org.apache.spark.sql.catalyst.parser.ParseUtils._
+
+  /**
+   * Create a [[SetCommand]] logical plan.
+   *
+   * Note that everything after the SET keyword is assumed to be part of the key-value pair.
+   * The split between key and value is made by searching for the first `=` character in the
+   * raw string.
+   */
+  override def visitSetConfiguration(ctx: SetConfigurationContext): LogicalPlan = withOrigin(ctx) {
--- End diff --

So we have some grammars that are only allowed in the sql module, not the catalyst module. What if we use those grammars in the catalyst module? Will we get a parse exception? --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
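The scaladoc on `visitSetConfiguration` states the splitting rule: everything after the SET keyword is one key-value pair, split at the first `=` in the raw string. A hedged sketch of that rule as a standalone helper (`splitSet` is a hypothetical name for illustration, not the PR's actual implementation):

```scala
// Hypothetical helper illustrating the documented rule: split a raw "key=value"
// string at the FIRST '=' only, so the value may itself contain '=' characters.
// "SET key" with no '=' at all is treated as a query for the key's current value.
object SetParseDemo {
  def splitSet(raw: String): (String, Option[String]) = {
    raw.indexOf('=') match {
      case -1 => (raw.trim, None)
      case i  => (raw.take(i).trim, Some(raw.drop(i + 1).trim))
    }
  }
}
```

Splitting at the first `=` rather than the last means `SetParseDemo.splitSet("a=b=c")` yields key `"a"` and value `"b=c"`, which is the behavior you want for configuration values that legitimately contain `=`.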
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57269065 --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/ng/PlanParserSuite.scala ---
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200562104 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53964/ Test PASSed.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200562102 Merged build finished. Test PASSed.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200561508 **[Test build #53964 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53964/consoleFull)** for PR 11557 at commit [`be6f5c8`](https://github.com/apache/spark/commit/be6f5c82956152ac33c78d484ee5fe8952e18632).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user hvanhovell commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200527025 @cloud-fan I have created a gist of the generated files: https://gist.github.com/hvanhovell/2ee2149efd6cab79e880
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200518822 **[Test build #53964 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53964/consoleFull)** for PR 11557 at commit [`be6f5c8`](https://github.com/apache/spark/commit/be6f5c82956152ac33c78d484ee5fe8952e18632).
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57226227

--- Diff: LICENSE ---
@@ -238,6 +238,7 @@ The text of each license is also included at licenses/LICENSE-[project].txt.
     (BSD 3 Clause) netlib core (com.github.fommil.netlib:core:1.1.2 - https://github.com/fommil/netlib-java/core)
     (BSD 3 Clause) JPMML-Model (org.jpmml:pmml-model:1.2.7 - https://github.com/jpmml/jpmml-model)
     (BSD License) AntLR Parser Generator (antlr:antlr:2.7.7 - http://www.antlr.org/)
+    (BSD License) ANTLR 4.5.2-1 (org.antlr:antlr4:4.5.2-1 - http://www.antlr.org/)
--- End diff --

@srowen is this sufficient?
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57226156

--- Diff: project/plugins.sbt ---
@@ -23,3 +23,9 @@ libraryDependencies += "org.ow2.asm" % "asm" % "5.0.3"
 
 libraryDependencies += "org.ow2.asm" % "asm-commons" % "5.0.3"
 
 libraryDependencies += "org.antlr" % "antlr" % "3.5.2"
+
+
+// TODO I am not sure we want such a dep.
+resolvers += "simplytyped" at "http://simplytyped.github.io/repo/releases"
--- End diff --

We can do without the plugin (like we did for ANTLR3), but it'll take a little bit of effort. I'll probably end up doing this because I want ANTLR warnings to be treated as errors.
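For context, generating the parser without the sbt-antlr4 plugin roughly means invoking the ANTLR4 Tool from a plain `sourceGenerators` task. The fragment below is a hypothetical sketch, not Spark's actual build code: the task wiring and paths are assumptions, and `-Werror` is ANTLR's option for promoting warnings to errors, which is the behavior hvanhovell mentions wanting.

```scala
// Hypothetical build.sbt fragment (sketch only): call the ANTLR4 Tool
// directly instead of depending on the sbt-antlr4 plugin and its resolver.
libraryDependencies += "org.antlr" % "antlr4" % "4.5.2-1"

sourceGenerators in Compile += Def.task {
  val grammar = (sourceDirectory in Compile).value / "antlr4" / "SqlBase.g4"
  val out = (sourceManaged in Compile).value / "antlr4"
  out.mkdirs()
  // -visitor generates the visitor base class AstBuilder extends;
  // -Werror makes ANTLR treat grammar warnings as errors.
  val tool = new org.antlr.v4.Tool(Array(
    "-o", out.getAbsolutePath, "-visitor", "-Werror", grammar.getAbsolutePath))
  tool.processGrammarsOnCommandLine()
  (out ** "*.java").get
}.taskValue
```

This avoids the external simplytyped resolver entirely, at the cost of maintaining the generator task by hand.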
Github user liancheng commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57185205

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala ---
@@ -0,0 +1,1450 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import scala.collection.JavaConverters._
+import scala.collection.mutable.ArrayBuffer
+
+import org.antlr.v4.runtime.{ParserRuleContext, Token}
+import org.antlr.v4.runtime.tree.{ParseTree, TerminalNode}
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.catalyst.{InternalRow, TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis._
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ParseUtils
+import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.catalyst.trees.CurrentOrigin
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+import org.apache.spark.util.random.RandomSampler
+
+/**
+ * The AstBuilder converts an ANTLR4 ParseTree into a catalyst Expression, LogicalPlan or
+ * TableIdentifier.
+ */
+class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with Logging {
+  import AstBuilder._
+  import ParseUtils._
+
+  protected def typedVisit[T](ctx: ParseTree): T = {
+    ctx.accept(this).asInstanceOf[T]
+  }
+
+  override def visitSingleStatement(ctx: SingleStatementContext): LogicalPlan = withOrigin(ctx) {
+    visit(ctx.statement).asInstanceOf[LogicalPlan]
+  }
+
+  override def visitSingleExpression(ctx: SingleExpressionContext): Expression = withOrigin(ctx) {
+    visitNamedExpression(ctx.namedExpression)
+  }
+
+  override def visitSingleTableIdentifier(
+      ctx: SingleTableIdentifierContext): TableIdentifier = withOrigin(ctx) {
+    visitTableIdentifier(ctx.tableIdentifier)
+  }
+
+  override def visitSingleDataType(ctx: SingleDataTypeContext): DataType = withOrigin(ctx) {
+    visit(ctx.dataType).asInstanceOf[DataType]
+  }
+
+  /*
+   * Plan parsing
+   */
+  protected def plan(tree: ParserRuleContext): LogicalPlan = typedVisit(tree)
+
+  /**
+   * Create a plan for a SHOW FUNCTIONS command.
+   */
+  override def visitShowFunctions(ctx: ShowFunctionsContext): LogicalPlan = withOrigin(ctx) {
+    import ctx._
+    if (qualifiedName != null) {
+      val names = qualifiedName().identifier().asScala.map(_.getText).toList
+      names match {
+        case db :: name :: Nil =>
+          ShowFunctions(Some(db), Some(name))
+        case name :: Nil =>
+          ShowFunctions(None, Some(name))
+        case _ =>
+          throw new ParseException("SHOW FUNCTIONS unsupported name", ctx)
+      }
+    } else if (pattern != null) {
+      ShowFunctions(None, Some(unescapeSQLString(pattern.getText)))
+    } else {
+      ShowFunctions(None, None)
+    }
+  }
+
+  /**
+   * Create a plan for a DESCRIBE FUNCTION command.
+   */
+  override def visitDescribeFunction(ctx: DescribeFunctionContext): LogicalPlan = withOrigin(ctx) {
+    val functionName = ctx.qualifiedName().identifier().asScala.map(_.getText).mkString(".")
+    DescribeFunction(functionName, ctx.EXTENDED != null)
+  }
+
+  /**
+   * Create a top-level plan with Common Table Expressions.
+   */
+  override def visitQuery(ctx: QueryContext): LogicalPlan = withOrigin(ctx) {
+    val query = plan(ctx.queryNoWith)
+
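The `typedVisit` helper in the quoted AstBuilder diff is the core trick of the design: ANTLR's generated `SqlBaseBaseVisitor[AnyRef]` is untyped, so every result is funneled through a single cast. A dependency-free sketch of the same pattern, with illustrative stand-in names rather than Spark's actual classes:

```scala
// Minimal stand-ins for ANTLR parse-tree contexts and catalyst results.
sealed trait Ctx
final case class ExprCtx(text: String) extends Ctx
final case class PlanCtx(children: List[Ctx]) extends Ctx

final case class Expr(sql: String)
final case class Plan(exprs: List[Expr])

class MiniAstBuilder {
  // Untyped dispatch, like SqlBaseBaseVisitor[AnyRef]'s visit().
  def visit(ctx: Ctx): AnyRef = ctx match {
    case ExprCtx(t)    => Expr(t)
    case PlanCtx(kids) => Plan(kids.map(expression))
  }

  // The AstBuilder trick: centralize the unavoidable cast in one place.
  protected def typedVisit[T](ctx: Ctx): T = visit(ctx).asInstanceOf[T]

  // Typed entry points, analogous to visitSingleExpression / plan(tree).
  def expression(ctx: Ctx): Expr = typedVisit[Expr](ctx)
  def plan(ctx: Ctx): Plan = typedVisit[Plan](ctx)
}
```

The cast is safe as long as each grammar rule's visitor returns the node type its typed entry point expects, which is exactly the invariant AstBuilder maintains per rule.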
Github user cloud-fan commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200385241 Overall LGTM (I haven't looked into every parser rule though); it's much more readable than the antlr 3 version! Could you upload the generated java files somewhere and link them in this PR? I think they will be helpful for detailed code review, thanks!
Github user hvanhovell commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200368863 @srowen thanks for the heads-up. I'll add an entry to `LICENSE`.
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57168260

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/internal/SessionState.scala ---
@@ -80,7 +80,7 @@ private[sql] class SessionState(ctx: SQLContext) {
   /**
    * Parser that extracts expressions, plans, table identifiers etc. from SQL texts.
    */
-  lazy val sqlParser: ParserInterface = new SparkQl(conf)
+  lazy val sqlParser: ParserInterface = SparkSqlParser
--- End diff --

It is still in use in Hive. I'll open up a follow-up PR in the following week or so which will also parse the Hive commands.
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57166640

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/internal/SessionState.scala ---
@@ -80,7 +80,7 @@ private[sql] class SessionState(ctx: SQLContext) {
   /**
    * Parser that extracts expressions, plans, table identifiers etc. from SQL texts.
    */
-  lazy val sqlParser: ParserInterface = new SparkQl(conf)
+  lazy val sqlParser: ParserInterface = SparkSqlParser
--- End diff --

To confirm: is the antlr 3 code path still in use?
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57165747

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl/package.scala ---
@@ -161,6 +161,10 @@ package object dsl {
     def lower(e: Expression): Expression = Lower(e)
     def sqrt(e: Expression): Expression = Sqrt(e)
     def abs(e: Expression): Expression = Abs(e)
+    def all(names: String*): Expression = names match {
--- End diff --

+1 for `star`
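The `star` variant under discussion could look like the following self-contained sketch; the `Star` case class here is a placeholder standing in for catalyst's real `UnresolvedStar` expression, so treat the names as assumptions:

```scala
// Placeholder for catalyst's UnresolvedStar: an optional qualifier target.
final case class Star(target: Option[Seq[String]])

// star() yields a bare `*`; star("t") yields a qualified `t.*`.
def star(names: String*): Star =
  if (names.isEmpty) Star(None) else Star(Some(names))
```

The varargs signature keeps both the unqualified and qualified forms behind one test-DSL entry point, which is what the `all`/`star` helper in the diff is for.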
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57165625

--- Diff: project/plugins.sbt ---
@@ -23,3 +23,9 @@ libraryDependencies += "org.ow2.asm" % "asm" % "5.0.3"
 
 libraryDependencies += "org.ow2.asm" % "asm-commons" % "5.0.3"
 
 libraryDependencies += "org.antlr" % "antlr" % "3.5.2"
+
+
+// TODO I am not sure we want such a dep.
+resolvers += "simplytyped" at "http://simplytyped.github.io/repo/releases"
--- End diff --

Is this the standard way to run antlr 4 in SBT? Just curious how hard it would be if we didn't use this plugin.
Github user srowen commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-200331100 @hvanhovell tiny footnote: I double-checked that ANTLR4 is still BSD-licensed and it is. You might briefly update the entries in `LICENSE` to have one representative entry for ANTLR4. The artifact doesn't really matter (was just auto-generated once upon a time). It's not essential but good form.
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57149953

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/CatalystQlSuite.scala ---
@@ -21,15 +21,18 @@ import org.apache.spark.sql.AnalysisException
 import org.apache.spark.sql.catalyst.TableIdentifier
 import org.apache.spark.sql.catalyst.analysis._
 import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ng.CatalystSqlParser
 import org.apache.spark.sql.catalyst.plans.PlanTest
 import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.unsafe.types.CalendarInterval
 
 class CatalystQlSuite extends PlanTest {
-  val parser = new CatalystQl()
+  val parser = CatalystSqlParser
--- End diff --

We can revert this line. The new ExpressionParserSuite/PlanParserSuite/TableIdentifierSuite contain all CatalystQlSuite tests or similar ones, so this is not really needed anymore.
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57149320

--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/ng/SqlBase.g4 ---
@@ -0,0 +1,742 @@
+/*
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ * This file is an adaptation of Presto's presto-parser/src/main/antlr4/com/facebook/presto/sql/parser/SqlBase.g4 grammar.
+ */
+
+grammar SqlBase;
+
+tokens {
+    DELIMITER
+}
+
+singleStatement
+    : statement EOF
+    ;
+
+singleExpression
+    : namedExpression EOF
+    ;
+
+singleTableIdentifier
+    : tableIdentifier EOF
+    ;
+
+singleDataType
+    : dataType EOF
+    ;
+
+statement
+    : query                                                            #statementDefault
+    | USE db=identifier                                                #use
+    | createTable ('(' colTypeList ')')? tableProvider tableProperties #createTableUsing
+    | createTable tableProvider tableProperties? AS? query             #createTableUsingAsSelect
+    | DROP TABLE (IF EXISTS)? qualifiedName                            #dropTable
+    | DELETE FROM qualifiedName (WHERE booleanExpression)?             #delete
+    | ALTER TABLE from=qualifiedName RENAME TO to=qualifiedName        #renameTable
+    | ALTER TABLE tableName=qualifiedName
+        RENAME COLUMN from=identifier TO to=identifier                 #renameColumn
+    | ALTER TABLE tableName=qualifiedName
+        ADD COLUMN column=colType                                      #addColumn
+    | CREATE (OR REPLACE)? VIEW qualifiedName AS query                 #createView
+    | DROP VIEW (IF EXISTS)? qualifiedName                             #dropView
+    | CALL qualifiedName '(' (callArgument (',' callArgument)*)? ')'   #call
+    | EXPLAIN explainOption* statement                                 #explain
+    | SHOW TABLES ((FROM | IN) db=identifier)?
+        (LIKE (qualifiedName | pattern=STRING))?                       #showTables
+    | SHOW SCHEMAS ((FROM | IN) identifier)?                           #showSchemas
+    | SHOW CATALOGS                                                    #showCatalogs
+    | SHOW COLUMNS (FROM | IN) qualifiedName                           #showColumns
+    | SHOW FUNCTIONS (LIKE? (qualifiedName | pattern=STRING))?         #showFunctions
+    | (DESC | DESCRIBE) FUNCTION EXTENDED? qualifiedName               #describeFunction
+    | (DESC | DESCRIBE) option=(EXTENDED | FORMATTED)?
+        tableIdentifier partitionSpec? describeColName?                #describeTable
+    | SHOW SESSION                                                     #showSession
+    | SET SESSION qualifiedName EQ expression                          #setSession
+    | RESET SESSION qualifiedName                                      #resetSession
+    | START TRANSACTION (transactionMode (',' transactionMode)*)?      #startTransaction
+    | COMMIT WORK?                                                     #commit
+    | ROLLBACK WORK?                                                   #rollback
+    | SHOW PARTITIONS (FROM | IN) qualifiedName
+        (WHERE booleanExpression)?
+        (ORDER BY sortItem (',' sortItem)*)?
+        (LIMIT limit=(INTEGER_VALUE | ALL))?                           #showPartitions
+    | REFRESH TABLE tableIdentifier                                    #refreshTable
+    | CACHE LAZY? TABLE identifier (AS? query)?                        #cacheTable
+    | UNCACHE TABLE identifier                                         #uncacheTable
+    | CLEAR CACHE                                                      #clearCache
+    | SET .*?                                                          #setConfiguration
+    ;
+
+createTable
+    : CREATE TEMPORARY? TABLE (IF NOT EXISTS)? tableIdentifier
+    ;
+
+query
+    : ctes? queryNoWith
+    ;
+
+insertInto
+    : INSERT OVERWRITE TABLE tableIdentifier partitionSpec? (IF NOT EXISTS)?
+    | INSERT INTO TABLE? tableIdentifier partitionSpec?
+    ;
+
+partitionSpec
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57148383

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/CatalystQlSuite.scala ---
@@ -21,15 +21,18 @@ import org.apache.spark.sql.AnalysisException
 import org.apache.spark.sql.catalyst.TableIdentifier
 import org.apache.spark.sql.catalyst.analysis._
 import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ng.CatalystSqlParser
 import org.apache.spark.sql.catalyst.plans.PlanTest
 import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.unsafe.types.CalendarInterval
 
 class CatalystQlSuite extends PlanTest {
-  val parser = new CatalystQl()
+  val parser = CatalystSqlParser
--- End diff --

And these dsl simplifications LGTM; we can still keep them here.
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57148298

--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/parser/CatalystQlSuite.scala ---
@@ -21,15 +21,18 @@ import org.apache.spark.sql.AnalysisException
 import org.apache.spark.sql.catalyst.TableIdentifier
 import org.apache.spark.sql.catalyst.analysis._
 import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ng.CatalystSqlParser
 import org.apache.spark.sql.catalyst.plans.PlanTest
 import org.apache.spark.sql.catalyst.plans.logical._
 import org.apache.spark.unsafe.types.CalendarInterval
 
 class CatalystQlSuite extends PlanTest {
-  val parser = new CatalystQl()
+  val parser = CatalystSqlParser
--- End diff --

Since we still keep the antlr 3 code, I think we should create a new `CatalystSqlParserSuite` and copy the code here to it. We can remove this file after we complete the migration to antlr 4.
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57147410

--- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/ng/SqlBase.g4 ---
@@ -0,0 +1,742 @@
+/*
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ *
+ * This file is an adaptation of Presto's presto-parser/src/main/antlr4/com/facebook/presto/sql/parser/SqlBase.g4 grammar.
+ */
+
+grammar SqlBase;
+
+tokens {
+    DELIMITER
+}
+
+singleStatement
+    : statement EOF
+    ;
+
+singleExpression
+    : namedExpression EOF
+    ;
+
+singleTableIdentifier
+    : tableIdentifier EOF
+    ;
+
+singleDataType
+    : dataType EOF
+    ;
+
+statement
+    : query                                                            #statementDefault
+    | USE db=identifier                                                #use
+    | createTable ('(' colTypeList ')')? tableProvider tableProperties #createTableUsing
+    | createTable tableProvider tableProperties? AS? query             #createTableUsingAsSelect
+    | DROP TABLE (IF EXISTS)? qualifiedName                            #dropTable
+    | DELETE FROM qualifiedName (WHERE booleanExpression)?             #delete
+    | ALTER TABLE from=qualifiedName RENAME TO to=qualifiedName        #renameTable
+    | ALTER TABLE tableName=qualifiedName
+        RENAME COLUMN from=identifier TO to=identifier                 #renameColumn
+    | ALTER TABLE tableName=qualifiedName
+        ADD COLUMN column=colType                                      #addColumn
+    | CREATE (OR REPLACE)? VIEW qualifiedName AS query                 #createView
+    | DROP VIEW (IF EXISTS)? qualifiedName                             #dropView
+    | CALL qualifiedName '(' (callArgument (',' callArgument)*)? ')'   #call
+    | EXPLAIN explainOption* statement                                 #explain
+    | SHOW TABLES ((FROM | IN) db=identifier)?
+        (LIKE (qualifiedName | pattern=STRING))?                       #showTables
+    | SHOW SCHEMAS ((FROM | IN) identifier)?                           #showSchemas
+    | SHOW CATALOGS                                                    #showCatalogs
+    | SHOW COLUMNS (FROM | IN) qualifiedName                           #showColumns
+    | SHOW FUNCTIONS (LIKE? (qualifiedName | pattern=STRING))?         #showFunctions
+    | (DESC | DESCRIBE) FUNCTION EXTENDED? qualifiedName               #describeFunction
+    | (DESC | DESCRIBE) option=(EXTENDED | FORMATTED)?
+        tableIdentifier partitionSpec? describeColName?                #describeTable
+    | SHOW SESSION                                                     #showSession
+    | SET SESSION qualifiedName EQ expression                          #setSession
+    | RESET SESSION qualifiedName                                      #resetSession
+    | START TRANSACTION (transactionMode (',' transactionMode)*)?      #startTransaction
+    | COMMIT WORK?                                                     #commit
+    | ROLLBACK WORK?                                                   #rollback
+    | SHOW PARTITIONS (FROM | IN) qualifiedName
+        (WHERE booleanExpression)?
+        (ORDER BY sortItem (',' sortItem)*)?
+        (LIMIT limit=(INTEGER_VALUE | ALL))?                           #showPartitions
+    | REFRESH TABLE tableIdentifier                                    #refreshTable
+    | CACHE LAZY? TABLE identifier (AS? query)?                        #cacheTable
+    | UNCACHE TABLE identifier                                         #uncacheTable
+    | CLEAR CACHE                                                      #clearCache
+    | SET .*?                                                          #setConfiguration
+    ;
+
+createTable
+    : CREATE TEMPORARY? TABLE (IF NOT EXISTS)? tableIdentifier
+    ;
+
+query
+    : ctes? queryNoWith
+    ;
+
+insertInto
+    : INSERT OVERWRITE TABLE tableIdentifier partitionSpec? (IF NOT EXISTS)?
+    | INSERT INTO TABLE? tableIdentifier partitionSpec?
+    ;
+
+partitionSpec
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57147327

--- Diff: project/plugins.sbt ---
@@ -23,3 +23,9 @@ libraryDependencies += "org.ow2.asm" % "asm" % "5.0.3"
 
 libraryDependencies += "org.ow2.asm" % "asm-commons" % "5.0.3"
 
 libraryDependencies += "org.antlr" % "antlr" % "3.5.2"
+
+
+// TODO I am not sure we want such a dep.
+resolvers += "simplytyped" at "http://simplytyped.github.io/repo/releases"
--- End diff --

This is for the ANTLR4 SBT plugin. I am just curious whether we want to depend on a github repo in our build. This will remain in the build even after we drop ANTLR3.
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57146948

--- Diff: project/plugins.sbt ---
@@ -23,3 +23,9 @@ libraryDependencies += "org.ow2.asm" % "asm" % "5.0.3"
 
 libraryDependencies += "org.ow2.asm" % "asm-commons" % "5.0.3"
 
 libraryDependencies += "org.antlr" % "antlr" % "3.5.2"
+
+
+// TODO I am not sure we want such a dep.
+resolvers += "simplytyped" at "http://simplytyped.github.io/repo/releases"
--- End diff --

Why do we need it? Is it because we have to support both antlr 3 and 4 for now?
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57077787

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl/package.scala ---
@@ -161,6 +161,10 @@ package object dsl {
     def lower(e: Expression): Expression = Lower(e)
     def sqrt(e: Expression): Expression = Sqrt(e)
     def abs(e: Expression): Expression = Abs(e)
+    def all(names: String*): Expression = names match {
--- End diff --

star is also fine.
Github user hvanhovell commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57077678

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala ---
@@ -0,0 +1,1450 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import scala.collection.JavaConverters._
+import scala.collection.mutable.ArrayBuffer
+
+import org.antlr.v4.runtime.{ParserRuleContext, Token}
+import org.antlr.v4.runtime.tree.{ParseTree, TerminalNode}
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.catalyst.{InternalRow, TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis._
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ParseUtils
+import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.catalyst.trees.CurrentOrigin
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+import org.apache.spark.util.random.RandomSampler
+
+/**
+ * The AstBuilder converts an ANTLR4 ParseTree into a catalyst Expression, LogicalPlan or
+ * TableIdentifier.
+ */
+class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with Logging {
+  import AstBuilder._
+  import ParseUtils._
+
+  protected def typedVisit[T](ctx: ParseTree): T = {
+    ctx.accept(this).asInstanceOf[T]
+  }
+
+  override def visitSingleStatement(ctx: SingleStatementContext): LogicalPlan = withOrigin(ctx) {
+    visit(ctx.statement).asInstanceOf[LogicalPlan]
+  }
+
+  override def visitSingleExpression(ctx: SingleExpressionContext): Expression = withOrigin(ctx) {
+    visitNamedExpression(ctx.namedExpression)
+  }
+
+  override def visitSingleTableIdentifier(
+      ctx: SingleTableIdentifierContext): TableIdentifier = withOrigin(ctx) {
+    visitTableIdentifier(ctx.tableIdentifier)
+  }
+
+  override def visitSingleDataType(ctx: SingleDataTypeContext): DataType = withOrigin(ctx) {
+    visit(ctx.dataType).asInstanceOf[DataType]
+  }
+
+  /*
+   * Plan parsing
+   */
+  protected def plan(tree: ParserRuleContext): LogicalPlan = typedVisit(tree)
+
+  /**
+   * Create a plan for a SHOW FUNCTIONS command.
+   */
+  override def visitShowFunctions(ctx: ShowFunctionsContext): LogicalPlan = withOrigin(ctx) {
+    import ctx._
+    if (qualifiedName != null) {
+      val names = qualifiedName().identifier().asScala.map(_.getText).toList
+      names match {
+        case db :: name :: Nil =>
+          ShowFunctions(Some(db), Some(name))
+        case name :: Nil =>
+          ShowFunctions(None, Some(name))
+        case _ =>
+          throw new ParseException("SHOW FUNCTIONS unsupported name", ctx)
+      }
+    } else if (pattern != null) {
+      ShowFunctions(None, Some(unescapeSQLString(pattern.getText)))
+    } else {
+      ShowFunctions(None, None)
+    }
+  }
+
+  /**
+   * Create a plan for a DESCRIBE FUNCTION command.
+   */
+  override def visitDescribeFunction(ctx: DescribeFunctionContext): LogicalPlan = withOrigin(ctx) {
+    val functionName = ctx.qualifiedName().identifier().asScala.map(_.getText).mkString(".")
+    DescribeFunction(functionName, ctx.EXTENDED != null)
+  }
+
+  /**
+   * Create a top-level plan with Common Table Expressions.
+   */
+  override def visitQuery(ctx: QueryContext): LogicalPlan = withOrigin(ctx) {
+    val query = plan(ctx.queryNoWith)
+
[GitHub] spark pull request: [SPARK-13713][SQL] Migrate parser from ANTLR3 ...
Github user rxin commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57071107
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ng/AstBuilder.scala ---
@@ -0,0 +1,1450 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql.catalyst.parser.ng
+
+import java.sql.{Date, Timestamp}
+
+import scala.collection.JavaConverters._
+import scala.collection.mutable.ArrayBuffer
+
+import org.antlr.v4.runtime.{ParserRuleContext, Token}
+import org.antlr.v4.runtime.tree.{ParseTree, TerminalNode}
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.catalyst.{InternalRow, TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis._
+import org.apache.spark.sql.catalyst.expressions._
+import org.apache.spark.sql.catalyst.parser.ParseUtils
+import org.apache.spark.sql.catalyst.parser.ng.SqlBaseParser._
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.catalyst.trees.CurrentOrigin
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.CalendarInterval
+import org.apache.spark.util.random.RandomSampler
+
+/**
+ * The AstBuilder converts an ANTLR4 ParseTree into a catalyst Expression, LogicalPlan or
+ * TableIdentifier.
+ */
+class AstBuilder extends SqlBaseBaseVisitor[AnyRef] with Logging {
+  import AstBuilder._
+  import ParseUtils._
+
+  protected def typedVisit[T](ctx: ParseTree): T = {
+    ctx.accept(this).asInstanceOf[T]
+  }
+
+  override def visitSingleStatement(ctx: SingleStatementContext): LogicalPlan = withOrigin(ctx) {
+    visit(ctx.statement).asInstanceOf[LogicalPlan]
+  }
+
+  override def visitSingleExpression(ctx: SingleExpressionContext): Expression = withOrigin(ctx) {
+    visitNamedExpression(ctx.namedExpression)
+  }
+
+  override def visitSingleTableIdentifier(
+      ctx: SingleTableIdentifierContext): TableIdentifier = withOrigin(ctx) {
+    visitTableIdentifier(ctx.tableIdentifier)
+  }
+
+  override def visitSingleDataType(ctx: SingleDataTypeContext): DataType = withOrigin(ctx) {
+    visit(ctx.dataType).asInstanceOf[DataType]
+  }
+
+  /* ********************************************
+   * Plan parsing
+   * ******************************************** */
+  protected def plan(tree: ParserRuleContext): LogicalPlan = typedVisit(tree)
+
+  /**
+   * Create a plan for a SHOW FUNCTIONS command.
+   */
+  override def visitShowFunctions(ctx: ShowFunctionsContext): LogicalPlan = withOrigin(ctx) {
+    import ctx._
+    if (qualifiedName != null) {
+      val names = qualifiedName().identifier().asScala.map(_.getText).toList
+      names match {
+        case db :: name :: Nil =>
+          ShowFunctions(Some(db), Some(name))
+        case name :: Nil =>
+          ShowFunctions(None, Some(name))
+        case _ =>
+          throw new ParseException("SHOW FUNCTIONS unsupported name", ctx)
+      }
+    } else if (pattern != null) {
+      ShowFunctions(None, Some(unescapeSQLString(pattern.getText)))
+    } else {
+      ShowFunctions(None, None)
+    }
+  }
+
+  /**
+   * Create a plan for a DESCRIBE FUNCTION command.
+   */
+  override def visitDescribeFunction(ctx: DescribeFunctionContext): LogicalPlan = withOrigin(ctx) {
+    val functionName = ctx.qualifiedName().identifier().asScala.map(_.getText).mkString(".")
+    DescribeFunction(functionName, ctx.EXTENDED != null)
+  }
+
+  /**
+   * Create a top-level plan with Common Table Expressions.
+   */
+  override def visitQuery(ctx: QueryContext): LogicalPlan = withOrigin(ctx) {
+    val query = plan(ctx.queryNoWith)
+
+    //
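The quoted AstBuilder excerpt hinges on one trick: the generated SqlBaseBaseVisitor is parameterized as AnyRef, and `typedVisit` casts each child's result to whatever the caller expects (Expression, LogicalPlan, DataType). A minimal standalone sketch of the same pattern, using a toy tree rather than ANTLR/Spark classes (`Node`, `Leaf`, `Branch`, and `Evaluator` are illustrative names, not Spark's):

```scala
// Toy analogue of AstBuilder's typedVisit: one visitor returns AnyRef,
// and call sites cast the result to the type they need.
sealed trait Node
case class Leaf(value: Int) extends Node
case class Branch(left: Node, right: Node) extends Node

class Evaluator {
  // Analogue of SqlBaseBaseVisitor[AnyRef]: every node yields AnyRef.
  def visit(n: Node): AnyRef = n match {
    case Leaf(v)      => Int.box(v)
    case Branch(l, r) => Int.box(typedVisit[Int](l) + typedVisit[Int](r))
  }

  // Analogue of AstBuilder.typedVisit[T]: dispatch, then cast.
  def typedVisit[T](n: Node): T = visit(n).asInstanceOf[T]
}
```

A bad cast only fails at runtime with a ClassCastException, which is why the entry points (`visitSingleStatement` and friends) do the cast immediately at a known grammar rule.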
Github user rxin commented on a diff in the pull request: https://github.com/apache/spark/pull/11557#discussion_r57070615
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl/package.scala ---
@@ -161,6 +161,10 @@ package object dsl {
     def lower(e: Expression): Expression = Lower(e)
     def sqrt(e: Expression): Expression = Sqrt(e)
     def abs(e: Expression): Expression = Abs(e)
+    def all(names: String*): Expression = names match {
--- End diff --

star maybe?

--- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199785654 Merged build finished. Test PASSed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199785656 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53764/ Test PASSed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199785459 **[Test build #53764 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53764/consoleFull)** for PR 11557 at commit [`a5d12ba`](https://github.com/apache/spark/commit/a5d12ba93a3c29194800d5e7de11704a96741833). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199734370 **[Test build #53764 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53764/consoleFull)** for PR 11557 at commit [`a5d12ba`](https://github.com/apache/spark/commit/a5d12ba93a3c29194800d5e7de11704a96741833).
Github user rxin commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199512015 btw here's another idea -- we don't need to do it as part of this pr. it'd still be great to have those parser tests compare plans. The problem in the past was it's annoying to update those plans when the parser changes. However, if we can make these compare against some reference files that are just the plans in their json representations, it becomes much easier to update these plans. Same thing goes for optimization rules.
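The reference-file idea above can be sketched in a few lines: write each plan's JSON form to a checked-in file once, and on later runs compare against it, with an environment flag to regenerate the files when the parser changes. (This is an illustrative sketch only; `GoldenFileCheck`, `REGEN_GOLDEN`, and `checkAgainstGoldenFile` are hypothetical names, not Spark APIs.)

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Path}

object GoldenFileCheck {
  // Hypothetical switch: set REGEN_GOLDEN=1 to rewrite the reference files
  // instead of comparing against them.
  val regenerate: Boolean = sys.env.contains("REGEN_GOLDEN")

  // Compare a plan's JSON rendering against dir/<name>.json.
  def checkAgainstGoldenFile(name: String, actualJson: String, dir: Path): Unit = {
    val ref = dir.resolve(s"$name.json")
    if (regenerate) {
      Files.write(ref, actualJson.getBytes(StandardCharsets.UTF_8))
    } else {
      val expected = new String(Files.readAllBytes(ref), StandardCharsets.UTF_8)
      assert(expected == actualJson,
        s"Plan for '$name' changed; rerun with REGEN_GOLDEN=1 to update the reference file")
    }
  }
}
```

The point of the design is that a parser change no longer means hand-editing expected plans in test code: rerun with the flag set, then review the reference-file diff in the PR.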
Github user rxin commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-199448401 Let me know if we should start reviewing this.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198919390 Merged build finished. Test PASSed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198919394 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53631/ Test PASSed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198919053 **[Test build #53631 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53631/consoleFull)** for PR 11557 at commit [`4f1da75`](https://github.com/apache/spark/commit/4f1da75388424c6d4169af900655c769524bee02). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198889006 **[Test build #53631 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53631/consoleFull)** for PR 11557 at commit [`4f1da75`](https://github.com/apache/spark/commit/4f1da75388424c6d4169af900655c769524bee02).
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198146148 **[Test build #53488 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53488/consoleFull)** for PR 11557 at commit [`c5d0bcf`](https://github.com/apache/spark/commit/c5d0bcff0d41c1f302d842d6e111fa85a8baf74b).
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198167846 **[Test build #53488 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53488/consoleFull)** for PR 11557 at commit [`c5d0bcf`](https://github.com/apache/spark/commit/c5d0bcff0d41c1f302d842d6e111fa85a8baf74b). * This patch **fails Spark unit tests**. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198468855 **[Test build #2651 has finished](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2651/consoleFull)** for PR 11557 at commit [`c5d0bcf`](https://github.com/apache/spark/commit/c5d0bcff0d41c1f302d842d6e111fa85a8baf74b). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198134876 **[Test build #53480 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53480/consoleFull)** for PR 11557 at commit [`b87f2b8`](https://github.com/apache/spark/commit/b87f2b80e770f311326eeefb23e9d9b6ced62aa3).
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198136161 Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198167945 Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198136162 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53480/ Test FAILed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198136155 **[Test build #53480 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53480/consoleFull)** for PR 11557 at commit [`b87f2b8`](https://github.com/apache/spark/commit/b87f2b80e770f311326eeefb23e9d9b6ced62aa3). * This patch **fails to build**. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198416483 **[Test build #2651 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2651/consoleFull)** for PR 11557 at commit [`c5d0bcf`](https://github.com/apache/spark/commit/c5d0bcff0d41c1f302d842d6e111fa85a8baf74b).
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-198167947 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53488/ Test FAILed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195911930 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53022/ Test PASSed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195911928 Merged build finished. Test PASSed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195911883 **[Test build #53022 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53022/consoleFull)** for PR 11557 at commit [`460ef0d`](https://github.com/apache/spark/commit/460ef0d77a366602c89d87e9518fc94a8b721dd3). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195885833 **[Test build #53022 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53022/consoleFull)** for PR 11557 at commit [`460ef0d`](https://github.com/apache/spark/commit/460ef0d77a366602c89d87e9518fc94a8b721dd3).
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195882234 Merged build finished. Test FAILed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195882238 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53020/ Test FAILed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195881884 **[Test build #53020 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53020/consoleFull)** for PR 11557 at commit [`303394f`](https://github.com/apache/spark/commit/303394f6c114eda0c085c871dd34b7ae70e9c81f). * This patch **fails Spark unit tests**. * This patch merges cleanly. * This patch adds no public classes.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195862507 **[Test build #53020 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53020/consoleFull)** for PR 11557 at commit [`303394f`](https://github.com/apache/spark/commit/303394f6c114eda0c085c871dd34b7ae70e9c81f).
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195808208 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53011/ Test FAILed.
Github user AmplabJenkins commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195808206 Merged build finished. Test FAILed.
Github user SparkQA commented on the pull request: https://github.com/apache/spark/pull/11557#issuecomment-195808198 **[Test build #53011 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53011/consoleFull)** for PR 11557 at commit [`5d4e13f`](https://github.com/apache/spark/commit/5d4e13f0a5161b6431936641649599e3f8f37134). * This patch **fails to build**. * This patch merges cleanly. * This patch adds no public classes.