Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16664#discussion_r100127337
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala ---
    @@ -218,7 +247,14 @@ final class DataFrameWriter[T] private[sql](ds: 
Dataset[T]) {
           bucketSpec = getBucketSpec,
           options = extraOptions.toMap)
     
    -    dataSource.write(mode, df)
    +    val destination = source match {
    +      case "jdbc" => extraOptions.get(JDBCOptions.JDBC_TABLE_NAME)
    +      case _ => extraOptions.get("path")
    --- End diff --
    
    > Could we make it more general? For example, using a Map[String, String]
    
    Being the person who requested this class instead of an opaque map, I think 
an opaque map makes for a really bad user API. The listener then needs to know 
about "magic keys" that have special meanings, which can vary depending on the 
destination. So you end up inventing a contract that certain keys have special 
meanings and that all sources must use them that way; basically, you end up 
encoding this class in a map anyway.
    
    That being said, I'm not super happy with the way JDBC works, because 
there's still some information embedded in the map. I thought about it a little 
but didn't come up with a good solution; embedding the table name in the JDBC 
URI sounds hacky and brittle. The best idea I have is a separate field in this 
class (e.g. `serverUri`) that identifies the server hosting the `destination` 
value (not needed for FS-based destinations, since the server is already part 
of the URI, but it could be useful for other table-based systems like Kudu or 
HBase).


