Hi Kevin,
I was unable to reproduce it; this test works for me:
test("load API") {
  val dfUsingOption =
    spark.read
      .option("url", url)
      .option("dbtable", "(SELECT * FROM TEST.PEOPLE)")
      .option("user", "testUser")
      .option("password", "testPass")
      .format("jdbc")
      .load()
  assert(dfUsingOption.collect().nonEmpty)
}
I fixed it by specifying the fully qualified class name of the source:

val jdbcDF = spark.read
  .format("org.apache.spark.sql.execution.datasources.jdbc.DefaultSource")
  .options(Map(
    "url" -> s"jdbc:mysql://${mysqlhost}:3306/test",
    "driver" -> "com.mysql.jdbc.Driver",
    "dbtable" -> "i_user",
    "user" -> "root",
    "password" -> "123456"))
  .load()
You should specify the classpath for your JDBC connection.
For example, if you want to connect to Impala, you can try this snippet:
import java.util.Properties
import org.apache.spark._
import org.apache.spark.sql.SQLContext
import java.sql.Connection
import java.sql.DriverManager
Class.forName("com.cloud
Maybe there is another version of Spark on the classpath?
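One way to check for a duplicate Spark build (a hedged sketch, not from the thread) is to list where the DataSourceRegister service files on the classpath come from; two different jars providing them would explain why the short name "jdbc" resolves to two sources:

```scala
// Diagnostic sketch: in spark-shell, list every jar that registers data sources.
// If more than one Spark jar shows up here, the short name "jdbc" is ambiguous.
import scala.collection.JavaConverters._

val registrations = getClass.getClassLoader
  .getResources("META-INF/services/org.apache.spark.sql.sources.DataSourceRegister")
  .asScala
  .toList

// One URL per jar that contributes data source registrations.
registrations.foreach(println)
```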
2016-08-01 14:30 GMT+08:00 kevin :
> hi,all:
>I try to load data from jdbc datasource,but I got error with :
> java.lang.RuntimeException: Multiple sources found for jdbc
> (org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider,
> org.apache.spark.sql.execution.datasources.jdbc.DefaultSource), please
> specify the fully qualified class name.
Hi all,
I tried to load data from a JDBC data source, but I got this error:
java.lang.RuntimeException: Multiple sources found for jdbc
(org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider,
org.apache.spark.sql.execution.datasources.jdbc.DefaultSource), please
specify the fully qualified class name.
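As the error message itself suggests, one workaround (sketched here with placeholder connection settings) is to pass the fully qualified provider class to format() instead of the ambiguous short name "jdbc":

```scala
// Sketch with placeholder url/user/password values; pick the provider class
// that belongs to the Spark build you actually intend to use.
val df = spark.read
  .format("org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider")
  .option("url", "jdbc:mysql://localhost:3306/test")
  .option("dbtable", "i_user")
  .option("user", "root")
  .option("password", "123456")
  .load()
```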