moomindani commented on a change in pull request #28953:
URL: https://github.com/apache/spark/pull/28953#discussion_r448709215



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
##########
@@ -122,6 +122,37 @@ object JdbcUtils extends Logging {
     }
   }
 
+  /**
+   * Runs a custom query against a table from the JDBC database.
+   */
+  def runQuery(conn: Connection, actions: String, options: JDBCOptions): Unit = {
+    val autoCommit = conn.getAutoCommit
+    conn.setAutoCommit(false)
+    val queries = actions.split(";")
+    try {
+      queries.foreach { query =>
+        val queryString = query.trim()
+        val statement = conn.prepareStatement(queryString)
+        try {
+          statement.setQueryTimeout(options.queryTimeout)
+          val hasResultSet = statement.execute()

Review comment:
       Currently, result sets are always ignored if `preActions` or `postActions` return them. As you said, the cursor cannot be used to read the results into a DataFrame.
   
   By design, `preActions` and `postActions` are primarily intended for DDL queries.
   I used `execute` instead of `executeUpdate` only to avoid exceptions when a stored procedure returns result sets, not to use those result sets as input for a DataFrame.
   
   Do you think it is enough to document that result sets are always ignored in `preActions` and `postActions` when a query returns them?
   Or should we narrow the supported use cases to DDL queries only (which do not return result sets) by using `executeUpdate` instead of `execute`?
   
   My preference is the first option, but I want to follow the community's opinion.
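   
   To make the trade-off concrete, here is a minimal Scala sketch of the two options. The object and method names are hypothetical; this illustrates the standard JDBC behavior being discussed, not the patch itself:
   
   ```scala
   import java.sql.PreparedStatement
   
   object ActionExecutionSketch {
     // Option 1 (current patch behavior): keep `execute()` and ignore any
     // result set, so a stored procedure that returns rows does not fail.
     def runIgnoringResults(stmt: PreparedStatement): Unit = {
       val hasResultSet = stmt.execute()
       if (hasResultSet) {
         // Rows are deliberately discarded: preActions/postActions are
         // side-effecting statements, never a DataFrame input.
         stmt.getResultSet.close()
       }
     }
   
     // Option 2 (DDL-only alternative): `executeUpdate()` throws a
     // SQLException when the statement produces a ResultSet, which
     // restricts the hooks to statements that return no rows.
     def runDdlOnly(stmt: PreparedStatement): Unit = {
       stmt.executeUpdate() // the returned update count is unused here
     }
   }
   ```
   
   If we keep option 1, explicitly closing the discarded `ResultSet` as sketched above would also release the server-side cursor promptly.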



