cloud-fan commented on a change in pull request #30154:
URL: https://github.com/apache/spark/pull/30154#discussion_r515763944



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/jdbc/JDBCTableCatalog.scala
##########
@@ -117,14 +117,33 @@ class JDBCTableCatalog extends TableCatalog with Logging {
     if (partitions.nonEmpty) {
       throw new UnsupportedOperationException("Cannot create JDBC table with partition")
     }
-    // TODO (SPARK-32405): Apply table options while creating tables in JDBC Table Catalog
+
+    var tableOptions = options.parameters + (JDBCOptions.JDBC_TABLE_NAME -> getTableName(ident))
+    var tableComment: String = ""
+    var tableProperties: String = ""
     if (!properties.isEmpty) {
-      logWarning("Cannot create JDBC table with properties, these properties will be " +
-        "ignored: " + properties.asScala.map { case (k, v) => s"$k=$v" }.mkString("[", ", ", "]"))
+      properties.asScala.map {
+        case (k, v) => k match {
+          case "comment" => tableComment = v
+          case "provider" | "owner" | "location" => // ignore provider, owner, and location
+          case _ => tableProperties = tableProperties + " " + s"$k=$v"
+        }
+      }
     }
 
-    val writeOptions = new JdbcOptionsInWrite(
-      options.parameters + (JDBCOptions.JDBC_TABLE_NAME -> getTableName(ident)))
+    if (tableComment != "") {
+      tableOptions = tableOptions + (JDBCOptions.JDBC_TABLE_COMMENT -> tableComment)
+    }
+    if (tableProperties != "") {
+      // table property is set in JDBC_CREATE_TABLE_OPTIONS, which will be appended
+      // to CREATE TABLE statement.
+      // E.g., "CREATE TABLE t (name string) ENGINE=InnoDB DEFAULT CHARSET=utf8"
+      // Spark doesn't check if these table properties are supported by databases. If
+      // table property is invalid, database will fail the table creation.

Review comment:
       This is a good idea. We can make the rule a bit clearer:
   1. "provider", "owner" and "location" can't be set. We can ignore "provider" for now, and wait for the CREATE TABLE unification PR to be merged.
   2. For other options, just add them to the `writeOptions` and let the dialect decide (fail if a dialect doesn't support one of the properties).
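The rule suggested above can be sketched in plain Scala, independent of Spark's catalog types. This is only an illustration of the split the reviewer describes; `PropertyRules`, `ReservedKeys`, and `splitProperties` are hypothetical names, not Spark APIs:

```scala
// Hedged sketch of the review's rule: reserved keys are rejected/ignored,
// "comment" is pulled out, and everything else is passed through to the
// write options so the dialect can decide whether it is supported.
object PropertyRules {
  // Keys users can't set on a JDBC table; "provider" is ignored for now,
  // pending the CREATE TABLE unification PR mentioned in the review.
  val ReservedKeys: Set[String] = Set("provider", "owner", "location")

  /** Split user properties into (table comment, pass-through write options). */
  def splitProperties(props: Map[String, String]): (Option[String], Map[String, String]) = {
    val comment = props.get("comment")
    val writeOptions = props -- ReservedKeys - "comment"
    (comment, writeOptions)
  }
}
```

Anything left in the pass-through map (e.g. `ENGINE=InnoDB`) would then reach the dialect, which either renders it into the CREATE TABLE statement or fails on keys it doesn't support.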

##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
##########
@@ -872,6 +873,15 @@ object JdbcUtils extends Logging {
     // E.g., "CREATE TABLE t (name string) ENGINE=InnoDB DEFAULT CHARSET=utf8"
     val sql = s"CREATE TABLE $tableName ($strSchema) $createTableOptions"
     executeStatement(conn, options, sql)
+    if (options.tableComment.nonEmpty) {
+      try {
+        executeStatement(
+          conn, options, dialect.getTableCommentQuery(tableName, options.tableComment))

Review comment:
       instead of having a dialect API to set table comment, can we have a dialect API to create table?
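The shape of the API the reviewer is asking for could look like the sketch below: the dialect owns the whole CREATE TABLE statement, so per-database quirks (such as how a comment is attached) live in one place instead of a separate comment hook. `Dialect` and `createTable` here are illustrative names under that assumption, not Spark's actual API:

```scala
// Hedged sketch: a dialect-level create-table hook, replacing a separate
// getTableCommentQuery API. Each dialect renders the full statement itself.
trait Dialect {
  def createTable(table: String, schemaSql: String, comment: Option[String]): String
}

// A default that inlines the comment the way MySQL-style engines accept it.
object MySqlLikeDialect extends Dialect {
  override def createTable(table: String, schemaSql: String, comment: Option[String]): String = {
    val commentClause = comment.map(c => s" COMMENT '$c'").getOrElse("")
    s"CREATE TABLE $table ($schemaSql)$commentClause"
  }
}
```

A dialect that has no comment syntax could simply ignore the `comment` argument (or throw), which is exactly the "let the dialect decide" behavior discussed in the first comment.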



