Github user HyukjinKwon commented on the pull request:

    https://github.com/apache/spark/pull/12601#issuecomment-213382065
  
    BTW, it looks like `Properties` is, in most cases, used just like a `HashMap[String, String]`.
    
    Firstly, I checked the [java.sql.Driver API](https://docs.oracle.com/javase/7/docs/api/java/sql/Driver.html), and it describes the `Properties` argument as below:
    
    > info - a list of arbitrary string tag/value pairs as connection arguments. Normally at least a "user" and "password" property should be included.
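
    For example, here is a minimal sketch of how that `info` argument is typically filled in and passed to `DriverManager.getConnection` (the URL and credentials are just placeholders):

    ```java
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    // Both keys and values are plain strings, as the Javadoc above suggests.
    Properties info = new Properties();
    info.setProperty("user", "sa");
    info.setProperty("password", "secret");
    Connection conn = DriverManager.getConnection("jdbc:h2:mem:test", info);
    ```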
    
    Secondly, Spark apparently uses the methods below on `Properties`:
    
    ```java
    public Set<String> stringPropertyNames()
    public String getProperty(String key, String defaultValue)
    public void store(OutputStream out, String comments)  // This converts keys and values to escaped strings internally.
    public synchronized Object setProperty(String key, String value)
    ```
    
    It looks like they all use `String` for keys and values.
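
    A rough sketch of how those methods treat everything as string pairs (the option names and values here are made up for illustration):

    ```java
    import java.util.Properties;

    Properties props = new Properties();
    props.setProperty("user", "spark");        // setProperty(String, String)
    props.setProperty("fetchsize", "1000");

    // getProperty(String, String) falls back to the default when the key is absent.
    String isolation = props.getProperty("isolationLevel", "READ_UNCOMMITTED");

    // stringPropertyNames() exposes the keys as a Set<String>.
    for (String key : props.stringPropertyNames()) {
        System.out.println(key + " -> " + props.getProperty(key, ""));
    }
    ```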
    
    
    So, I think it might be OK to support `write.format("jdbc")`. I believe `read.format("jdbc")` is already supported, and I could not find any JIRA issues reporting problems with passing options to `read.format("jdbc")`.
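
    To illustrate, a rough sketch of what the read side already looks like and what the write side would look like if it were supported, using the newer `SparkSession` entry point (the URL, table names, and credentials are placeholders):

    ```java
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SaveMode;
    import org.apache.spark.sql.SparkSession;

    SparkSession spark = SparkSession.builder().appName("jdbc-example").getOrCreate();

    // Reading through the data source API with plain string options.
    Dataset<Row> df = spark.read()
        .format("jdbc")
        .option("url", "jdbc:postgresql://localhost/test")
        .option("dbtable", "public.src_table")
        .option("user", "username")
        .option("password", "password")
        .load();

    // The symmetric write side this discussion is about.
    df.write()
        .format("jdbc")
        .option("url", "jdbc:postgresql://localhost/test")
        .option("dbtable", "public.dst_table")
        .option("user", "username")
        .option("password", "password")
        .mode(SaveMode.Append)
        .save();
    ```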

