cloud-fan commented on a change in pull request #30025:
URL: https://github.com/apache/spark/pull/30025#discussion_r509878979



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/jdbc/JDBCTableCatalog.scala
##########
@@ -139,7 +139,10 @@ class JDBCTableCatalog extends TableCatalog with Logging {
     checkNamespace(ident.namespace())
     withConnection { conn =>
       classifyException(s"Failed table altering: $ident") {
-        JdbcUtils.alterTable(conn, getTableName(ident), changes, options)
+        val optionsWithTableName = new JDBCOptions(
+          options.parameters + (JDBCOptions.JDBC_TABLE_NAME -> getTableName(ident)))
+        val tableSchema: StructType = JdbcUtils.getSchemaOption(conn, optionsWithTableName).get

Review comment:
       ah, now I get what you mean. The `AlterTable` logical plan does have the table schema, but the catalog API doesn't pass it in. It's not possible to change the catalog API at this point, and it's also not worth adding an extra table lookup here just to support updating nullability in MySQL.
   
   I think the first version is fine. Sorry for the back and forth!
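   As an aside on the snippet being reverted: calling `.get` on the `Option` returned by `getSchemaOption` throws `NoSuchElementException` if the table lookup comes back empty (e.g. the table was dropped concurrently). A minimal, Spark-free sketch of the safer pattern — every name below is an illustrative stand-in, not the real Spark API:
   
   ```scala
   object SchemaLookupSketch {
     // Hypothetical stand-in for JdbcUtils.getSchemaOption: returns None
     // when the table cannot be found, mirroring the real method's Option result.
     def getSchemaOption(table: String): Option[String] =
       if (table == "known_table") Some("id INT, name STRING") else None
   
     // Pattern-match instead of .get, so the "missing table" case is handled
     // explicitly with a descriptive error rather than a bare NoSuchElementException.
     def describe(table: String): String =
       getSchemaOption(table) match {
         case Some(schema) => schema
         case None => throw new NoSuchElementException(s"Table not found: $table")
       }
   
     def main(args: Array[String]): Unit = {
       println(describe("known_table"))
     }
   }
   ```
   
   In the real code path an analogous `getOrElse(throw ...)` would at least surface which table failed, though as noted above the lookup itself isn't worth the cost here.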




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


