[ https://issues.apache.org/jira/browse/SPARK-25013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16569811#comment-16569811 ]

Takeshi Yamamuro commented on SPARK-25013:
------------------------------------------

Spark currently doesn't have a dialect for MariaDB. Does the workaround work 
fine in all cases?
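
If it doesn't, one possible stopgap until a dedicated dialect lands is 
registering a custom dialect from user code. A minimal, untested sketch (the 
object name here is made up, not taken from any existing patch):

{code:scala}
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Hypothetical user-side dialect: claim jdbc:mariadb URLs and quote
// identifiers with backticks, the same quoting the MySQL dialect uses.
object MariaDbDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:mariadb")

  // Backtick-quote so SQL generated by Spark, e.g. SELECT `i`,`ip` FROM tmp,
  // stays valid on MariaDB instead of becoming SELECT "i","ip" FROM tmp.
  override def quoteIdentifier(colName: String): String =
    s"`${colName.replace("`", "``")}`"
}

// Register before creating any JDBC DataFrame.
JdbcDialects.registerDialect(MariaDbDialect)
{code}

That said, keeping {{jdbc:mysql:...}} in the URL is the simpler workaround as 
long as the MariaDB driver continues to accept it.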

> JDBC urls with jdbc:mariadb don't work as expected
> --------------------------------------------------
>
>                 Key: SPARK-25013
>                 URL: https://issues.apache.org/jira/browse/SPARK-25013
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Dieter Vekeman
>            Priority: Minor
>
> When using the MariaDB JDBC driver, the JDBC connection URL should be:
> {code:java}
> jdbc:mariadb://localhost:3306/DB?user=someuser&password=somepassword
> {code}
> https://mariadb.com/kb/en/library/about-mariadb-connector-j/
> However, this does not work as expected in Spark (see below).
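> For illustration, a read that hits the problem might look like this (a 
> minimal sketch; the URL is the one above, but the table name is a 
> placeholder):
> {code:scala}
> // Hypothetical example: reading a MariaDB table through Spark's JDBC source.
> // With a jdbc:mariadb URL no dialect claims the connection, so Spark falls
> // back to quoting identifiers with " instead of `.
> val df = spark.read
>   .format("jdbc")
>   .option("url", "jdbc:mariadb://localhost:3306/DB?user=someuser&password=somepassword")
>   .option("dbtable", "tmp")
>   .load()
> {code}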
> *Workaround*
> The MariaDB driver also accepts {{jdbc:mysql}} URLs, which do work.
> The problem seems to have been described and identified in:
> https://jira.mariadb.org/browse/CONJ-421
> Everything works with Spark when the connection string uses 
> {{"jdbc:mysql:..."}}, but not with {{"jdbc:mariadb:..."}}, because the MySQL 
> dialect is then not selected. When no dialect is selected, the default 
> identifier quote is {{"}}, not {{`}}.
> So an internal query generated by Spark, such as {{SELECT `i`,`ip` FROM tmp}}, 
> is instead executed as {{SELECT "i","ip" FROM tmp}} (with the data types 
> retrieved earlier), causing the exception.
> The author of the comment says
> {quote}I'll make a pull request to spark so "jdbc:mariadb:" connection string 
> can be handled{quote}
> Did the pull request get lost or should a new one be made?


