[GitHub] [spark] imback82 commented on a change in pull request #30229: [SPARK-33321][SQL] Migrate ANALYZE TABLE commands to use UnresolvedTableOrView to resolve the identifier
imback82 commented on a change in pull request #30229:
URL: https://github.com/apache/spark/pull/30229#discussion_r517105993

File path: sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala

```diff
@@ -2606,6 +2606,13 @@ class DataSourceV2SQLSuite
     }
   }
 
+  private def testNotSupportedV2Command(sqlCommand: String, sqlParams: String): Unit = {
+    val e = intercept[AnalysisException] {
+      sql(s"$sqlCommand $sqlParams")
+    }
+    assert(e.message.contains(s"$sqlCommand is not supported for v2 tables"))
+  }
```

Review comment:
Yes, the plan is to move all the commands using `parseV1Table` in `ResolveSessionCatalog` to use the new framework and unify the message in the process.

---
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
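As a hedged illustration of how such a helper keeps the error-message format unified, here is a self-contained analog. `sql`, `interceptAnalysis`, and `AnalysisException` below are stand-ins for the Spark and ScalaTest originals, so the sketch runs outside a real test suite:

```scala
// Self-contained sketch; all three names below are hypothetical stand-ins,
// not Spark's or ScalaTest's real APIs.
object V2CommandCheckDemo {
  class AnalysisException(val message: String) extends Exception(message)

  // Stand-in analyzer: rejects every statement the way a v2 table would,
  // echoing the first two tokens (e.g. "ANALYZE TABLE") in the message.
  def sql(statement: String): Unit =
    throw new AnalysisException(
      statement.split(" ").take(2).mkString(" ") + " is not supported for v2 tables")

  // Stand-in for ScalaTest's intercept[AnalysisException].
  def interceptAnalysis(body: => Unit): AnalysisException =
    try { body; sys.error("expected AnalysisException") }
    catch { case e: AnalysisException => e }

  // Mirrors the helper added in the diff: one shared assertion for all
  // "not supported" commands, so every test checks the same message shape.
  def testNotSupportedV2Command(sqlCommand: String, sqlParams: String): Unit = {
    val e = interceptAnalysis { sql(s"$sqlCommand $sqlParams") }
    assert(e.message.contains(s"$sqlCommand is not supported for v2 tables"))
  }

  def main(args: Array[String]): Unit =
    testNotSupportedV2Command("ANALYZE TABLE", "t COMPUTE STATISTICS")
}
```

The value of the helper is that each call site shrinks to a one-liner per command, and changing the unified message later means touching one assertion.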
imback82 commented on a change in pull request #30229:
URL: https://github.com/apache/spark/pull/30229#discussion_r516853032

File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala

```diff
@@ -280,6 +280,9 @@ class DataSourceV2Strategy(session: SparkSession) extends Strategy with Predicat
     case r @ ShowTableProperties(rt: ResolvedTable, propertyKey) =>
       ShowTablePropertiesExec(r.output, rt.table, propertyKey) :: Nil
 
+    case AnalyzeTable(_: ResolvedTable, _, _) | AnalyzeColumn(_: ResolvedTable, _, _) =>
+      throw new AnalysisException("ANALYZE TABLE is not supported for v2 tables.")
```

Review comment:
Do you want a separate rule to handle these? There will be more commands not supported in v2. Btw, we have similar checks in this file:
```scala
case DescribeColumn(_: ResolvedTable, _, _) =>
  throw new AnalysisException("Describing columns is not supported for v2 tables.")
```
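To make the "separate rule" suggestion concrete, here is a hedged, self-contained sketch. The plan nodes and exception below are hypothetical stand-ins (not Spark's real classes); the point is only that one check can centralize every "not supported for v2 tables" error instead of scattering cases through the strategy:

```scala
// Stand-in types for illustration; Spark's real LogicalPlan hierarchy differs.
object UnsupportedV2CommandsDemo {
  sealed trait LogicalPlan
  case object AnalyzeTableNode extends LogicalPlan
  case object AnalyzeColumnNode extends LogicalPlan
  case object DescribeColumnNode extends LogicalPlan
  case object SupportedNode extends LogicalPlan

  class AnalysisException(val message: String) extends Exception(message)

  // One place to reject v2-unsupported commands; supporting a new command
  // (or rejecting one) means touching only this match.
  def checkV2Unsupported(plan: LogicalPlan): Unit = plan match {
    case AnalyzeTableNode | AnalyzeColumnNode =>
      throw new AnalysisException("ANALYZE TABLE is not supported for v2 tables.")
    case DescribeColumnNode =>
      throw new AnalysisException("Describing columns is not supported for v2 tables.")
    case _ => // supported: fall through to the real planning rules
  }
}
```

A dedicated rule like this also keeps the strategy itself free of error cases, which matters as the list of commands not yet supported in v2 grows.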
imback82 commented on a change in pull request #30229:
URL: https://github.com/apache/spark/pull/30229#discussion_r516853691

File path: sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala

```diff
@@ -419,17 +410,16 @@ class ResolveSessionCatalog(
       }
       ShowTablesCommand(db, Some(pattern), true, partitionsSpec)
 
-    case AnalyzeTableStatement(tbl, partitionSpec, noScan) =>
-      val v1TableName = parseV1Table(tbl, "ANALYZE TABLE")
+    // ANALYZE TABLE works on views if the views are cached.
```

Review comment:
Updated the comment to reflect that this works on permanent views.
imback82 commented on a change in pull request #30229:
URL: https://github.com/apache/spark/pull/30229#discussion_r516749544

File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/AnalyzePartitionCommand.scala

```diff
@@ -75,6 +75,9 @@ case class AnalyzePartitionCommand(
   override def run(sparkSession: SparkSession): Seq[Row] = {
     val sessionState = sparkSession.sessionState
+    if (sessionState.catalog.getTempView(tableIdent.identifier).isDefined) {
+      throw new AnalysisException("ANALYZE TABLE is not supported on a temporary view.")
+    }
```

Review comment:
OK. What if we want to support the scenario where a temp view is allowed but not a permanent view (not this PR)? Do we want something like the following?
```scala
case class UnresolvedTableOrView(
    multipartIdentifier: Seq[String],
    allowTempView: Boolean = true,
    allowPermanentView: Boolean = true) extends LeafNode {
  require(allowTempView || allowPermanentView)
}
```
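A hedged, runnable sketch of that proposal's semantics: `LeafNode` below is a stand-in trait, not Spark's real class. The `require` makes a temp-only or permanent-only node expressible while turning "both disallowed" into a construction-time error:

```scala
// Self-contained sketch under stand-in types; Spark's real
// UnresolvedTableOrView differs in its surrounding machinery.
object UnresolvedTableOrViewDemo {
  trait LeafNode // stand-in for Spark's LeafNode

  case class UnresolvedTableOrView(
      multipartIdentifier: Seq[String],
      allowTempView: Boolean = true,
      allowPermanentView: Boolean = true) extends LeafNode {
    // At least one kind of view must remain allowed.
    require(allowTempView || allowPermanentView)
  }

  // `require` throws IllegalArgumentException when both flags are false.
  def rejectsBothDisallowed(): Boolean =
    try {
      UnresolvedTableOrView(Seq("db", "v"), allowTempView = false, allowPermanentView = false)
      false
    } catch { case _: IllegalArgumentException => true }
}
```

Encoding the allowed view kinds as constructor flags keeps the policy on the unresolved node itself, so each resolution rule can read the flags instead of every command re-implementing its own temp-view check.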