davidm-db commented on code in PR #52173:
URL: https://github.com/apache/spark/pull/52173#discussion_r2313560501


##########
sql/api/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -523,6 +523,24 @@ abstract class SparkSession extends Serializable with Closeable {
     sql(sqlText, args.asScala.toMap)
   }
 
+  /**
+   * Executes a SQL query substituting parameters by the given arguments with optional names,
+   * returning the result as a `DataFrame`. This API eagerly runs DDL/DML commands, but not for
+   * SELECT queries. This method allows the inner query to determine whether to use positional
+   * or named parameters based on its parameter markers.
+   *
+   * @param sqlText
+   *   A SQL statement with named or positional parameters to execute.
+   * @param args
+   *   An array of Java/Scala objects that can be converted to SQL literal expressions.
+   * @param paramNames
+   *   An optional array of parameter names corresponding to args. If provided, enables named
+   *   parameter binding where parameter names are available. If None or shorter than args,
+   *   remaining parameters are treated as positional.

Review Comment:
   Judging by the fact that we throw `INVALID_QUERY_MIXED_QUERY_PARAMETERS` when both kinds are present, I'd say you're correct, i.e. this is a new API that supports both types of parameters.
   
   However, `parameters.scala` also implements logic for handling both types of parameters at the same time, even though that combination apparently isn't allowed.
   
   I'm struggling to understand this a bit: the idea doesn't look obvious and the implementation doesn't look consistent, but I'm probably missing something.
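   To make the discussion concrete, here is a minimal, hypothetical sketch (not the PR's actual implementation) of the binding rule the Javadoc describes: each element of `args` paired with an entry in `paramNames` becomes a named parameter, and any args beyond the end of `paramNames` (or all of them, when `paramNames` is `None`) fall back to positional binding.

```scala
// Hypothetical sketch of the mixed named/positional binding rule described
// in the Javadoc above; names, signatures, and structure are assumptions,
// not the PR's actual code.
object ParamBindingSketch {
  // Splits args into (named, positional) per the documented rule:
  // the first paramNames.length args are named, the rest are positional.
  def split(
      args: Array[Any],
      paramNames: Option[Array[String]]): (Map[String, Any], Seq[Any]) = {
    val names = paramNames.getOrElse(Array.empty[String])
    val (namedPart, positionalPart) = args.splitAt(names.length)
    (names.zip(namedPart).toMap, positionalPart.toSeq)
  }
}
```

   Under this reading, a query would still be rejected with `INVALID_QUERY_MIXED_QUERY_PARAMETERS` if its text mixes named and positional markers; the split above only decides how each argument is offered to the binder.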



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

