srielau commented on code in PR #52173:
URL: https://github.com/apache/spark/pull/52173#discussion_r2341689053


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSetVariable.scala:
##########
@@ -53,24 +68,17 @@ class ResolveSetVariable(val catalogManager: CatalogManager) extends Rule[Logica
           "Unexpected target variable expression in SetVariable: " + other)
       }
 
-      // Protect against duplicate variable names
-      // Names are normalized when the variables are created.
-      // No need for case insensitive comparison here.
-      // TODO: we need to group by the qualified variable name once other catalogs support it.
-      val dups = resolvedVars.groupBy(_.identifier).filter(kv => kv._2.length > 1)
-      if (dups.nonEmpty) {
-        throw new AnalysisException(
-          errorClass = "DUPLICATE_ASSIGNMENTS",
-          messageParameters = Map("nameList" ->
-            dups.keys.map(key => toSQLId(key.name())).mkString(", ")))
-      }
-
       setVariable.copy(targetVariables = resolvedVars)
 
     case setVariable: SetVariable
         if setVariable.targetVariables.forall(_.isInstanceOf[VariableReference]) &&
           setVariable.sourceQuery.resolved =>
       val targetVariables = setVariable.targetVariables.map(_.asInstanceOf[VariableReference])
+
+      // Check for duplicate variable names - this handles both regular SET VAR (after resolution)
+      // and EXECUTE IMMEDIATE ... INTO (which comes pre-resolved)
+      checkForDuplicateVariables(targetVariables)

Review Comment:
   Correct. Delaying the check until this later rule ensures duplicate targets are caught in both paths.
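   For context, the deferred check can mirror the removed logic, now invoked from the later rule so it also covers the pre-resolved EXECUTE IMMEDIATE ... INTO path (a sketch based on the code removed above; the PR's actual `checkForDuplicateVariables` may differ):

   ```scala
   // Sketch of the deferred duplicate check (hypothetical shape, reusing the
   // logic removed above). Variable names are normalized at creation time,
   // so an exact comparison on the identifier is sufficient here.
   private def checkForDuplicateVariables(resolvedVars: Seq[VariableReference]): Unit = {
     val dups = resolvedVars.groupBy(_.identifier).filter(_._2.length > 1)
     if (dups.nonEmpty) {
       throw new AnalysisException(
         errorClass = "DUPLICATE_ASSIGNMENTS",
         messageParameters = Map("nameList" ->
           dups.keys.map(key => toSQLId(key.name())).mkString(", ")))
     }
   }
   ```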



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

