dtenedor commented on code in PR #36745:
URL: https://github.com/apache/spark/pull/36745#discussion_r889272134


##########
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala:
##########
@@ -122,7 +122,7 @@ abstract class SessionCatalogSuite extends AnalysisTest with Eventually {
   }
 
   test("create table with default columns") {
-    withBasicCatalog { catalog =>
+    if (!isHiveExternalCatalog) withBasicCatalog { catalog =>

Review Comment:
   Good question. This test is part of the base class `SessionCatalogSuite`. It
fails when running with the `HiveExternalSessionCatalogSuite` subclass, since
Hive tables are not included in the allowlist of providers that allow DEFAULT
columns. To simplify this, I edited the config to add Hive tables to the list
of allowed providers, and this test now passes in all cases.
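
   For illustration, a minimal sketch of how a suite could temporarily widen that
allowlist for the Hive case; only `SQLConf.DEFAULT_COLUMN_ALLOWED_PROVIDERS` itself
comes from this PR, while the helper name and the assumption that the conf holds a
comma-separated provider string are hypothetical:

```scala
import org.apache.spark.sql.internal.SQLConf

// Hypothetical helper: append "hive" to the allowlist of providers that support
// DEFAULT column values, run the shared test body, then restore the original value.
// Assumes the conf value is a comma-separated string of provider names.
def withHiveDefaultsAllowed(testBody: => Unit): Unit = {
  val conf = SQLConf.get
  val entry = SQLConf.DEFAULT_COLUMN_ALLOWED_PROVIDERS
  val original = conf.getConf(entry)
  conf.setConf(entry, (original.split(",") :+ "hive").distinct.mkString(","))
  try testBody finally conf.setConf(entry, original)
}
```

   With "hive" in the allowlist, the "create table with default columns" test body can
run unchanged under `HiveExternalSessionCatalogSuite`, so the
`if (!isHiveExternalCatalog)` guard becomes unnecessary.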



##########
sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:
##########
@@ -41,21 +41,28 @@ import org.apache.spark.sql.types.{MetadataBuilder, StructField, StructType}
  *
  * We can remove this rule once we implement all the catalog functionality in 
`V2SessionCatalog`.
  */
-class ResolveSessionCatalog(val catalogManager: CatalogManager)
+class ResolveSessionCatalog(val analyzer: Analyzer)
   extends Rule[LogicalPlan] with LookupCatalog {
+  val catalogManager = analyzer.catalogManager

Review Comment:
   SG, done.



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##########
@@ -83,16 +83,26 @@ object ResolveDefaultColumns {
    *
    * @param analyzer      used for analyzing the result of parsing the expression stored as text.
    * @param tableSchema   represents the names and types of the columns of the statement to process.
+   * @param tableProvider provider of the target table to store default values for, if any.
    * @param statementType name of the statement being processed, such as INSERT; useful for errors.
    * @return a copy of `tableSchema` with field metadata updated with the constant-folded values.
    */
   def constantFoldCurrentDefaultsToExistDefaults(
       analyzer: Analyzer,
       tableSchema: StructType,
+      tableProvider: Option[String],
       statementType: String): StructType = {
     if (SQLConf.get.enableDefaultColumns) {
       val newFields: Seq[StructField] = tableSchema.fields.map { field =>
         if (field.metadata.contains(CURRENT_DEFAULT_COLUMN_METADATA_KEY)) {
+          // Make sure that the target table has a provider that supports default column values.
+          val allowedProviders: Array[String] =
+            SQLConf.get.getConf(SQLConf.DEFAULT_COLUMN_ALLOWED_PROVIDERS)

Review Comment:
   Good idea, done!
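
   Since the hunk above is cut off right after reading the conf, here is a minimal
sketch of how the provider gate could continue; the error message, the `Locale`-based
lowercasing, and the treatment of the conf as a comma-separated string are assumptions
for illustration, not the PR's actual code:

```scala
import java.util.Locale

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.internal.SQLConf

// Hypothetical sketch: fail analysis when the target table's provider is not in the
// allowlist of providers that support DEFAULT column values.
def checkProviderSupportsDefaults(
    tableProvider: Option[String],
    statementType: String): Unit = {
  val allowedProviders: Array[String] =
    SQLConf.get.getConf(SQLConf.DEFAULT_COLUMN_ALLOWED_PROVIDERS)
      .toLowerCase(Locale.ROOT).split(",").map(_.trim)
  tableProvider.map(_.toLowerCase(Locale.ROOT)).foreach { provider =>
    if (!allowedProviders.contains(provider)) {
      throw new AnalysisException(
        s"Failed to execute $statementType command because DEFAULT values are not " +
          s"supported for target data source with table provider: '$provider'")
    }
  }
}
```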



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

