xkrogen commented on PR #38864: URL: https://github.com/apache/spark/pull/38864#issuecomment-1338091122
I managed to find a better SQL standard reference in the form of _SQL: The Complete Reference_ (2003), which has an entire chapter devoted to Dynamic SQL (beginning on page 547). You are correct regarding colons; those are indeed _host variables_. The standard for `PREPARE`/`EXECUTE` refers to positional arguments using `?`, which makes sense given how widely supported that syntax is ([as mentioned in my last comment](https://github.com/apache/spark/pull/38712#issuecomment-1334227405)).

However, I think it is fair to argue that what you are trying to add here is not, in fact, dynamic SQL, but rather host variables. We have no notion of precompiled statements with `PREPARE`/`EXECUTE`, which is the hallmark of dynamic SQL. Rather, we have a query with a few "blanks" to be filled in, and we want to provide those blanks from the host execution environment. That is exactly what a host variable is for, right? You happen to be proposing an API (in Scala) that provides the values of the variables as a map, instead of pulling them directly from the JVM execution environment, but from a SQL syntax perspective they are identical. I.e., you propose this example (syntax updated to `:`):

```scala
spark.sql(
  sqlText = "SELECT * FROM tbl WHERE date > :startDate LIMIT :maxRows",
  args = Map(
    "startDate" -> "DATE'2022-12-01'",
    "maxRows" -> "100"))
```

This SQL syntax is fully compatible with the standard for host variables; it's just that the Scala interface is slightly different. This would _not_ preclude us from supporting the following in the future using the exact same SQL syntax:

```scala
val startDate = "DATE'2022-12-01'"
val maxRows = "100"
spark.sql(sqlText = "SELECT * FROM tbl WHERE date > :startDate LIMIT :maxRows")
```

Both are fully compatible; one is explicit and one is implicit.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
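As an aside, the binding semantics the explicit-map API implies can be sketched in a few lines. This is a hypothetical illustration only, not Spark's actual implementation: the `NamedParams` object and `bind` method are invented names, and a real engine would bind parameters during parsing rather than by textual substitution as done here.

```scala
import scala.util.matching.Regex

// Hypothetical sketch: resolve `:name` markers against a map of host values.
object NamedParams {
  // Matches a colon-prefixed identifier, e.g. `:startDate`.
  private val Marker = raw":([A-Za-z_][A-Za-z0-9_]*)".r

  def bind(sqlText: String, args: Map[String, String]): String =
    Marker.replaceAllIn(sqlText, m =>
      args.get(m.group(1)) match {
        case Some(value) => Regex.quoteReplacement(value)
        case None =>
          throw new IllegalArgumentException(s"No value bound for :${m.group(1)}")
      })
}

// Usage with the example from above:
// NamedParams.bind(
//   "SELECT * FROM tbl WHERE date > :startDate LIMIT :maxRows",
//   Map("startDate" -> "DATE'2022-12-01'", "maxRows" -> "100"))
```

The "implicit" variant discussed above would differ only in where the map comes from (the host environment instead of an explicit argument); the SQL-side syntax is untouched either way.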
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org