I know that I can use the JDBC connector to create a DataFrame with
this command:

    val jdbcDF = sqlContext.load("jdbc", Map(
      "url" -> "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456",
      "dbtable" -> "video"))

But I get this error:

java.sql.SQLException: No suitable driver found for ...

I have also tried to add the JDBC jar to the Spark classpath with both
of the following commands, but neither worked:

- spark-shell --jars mysql-connector-java-5.0.8-bin.jar
- SPARK_CLASSPATH=mysql-connector-java-5.0.8-bin.jar spark-shell
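A variant I have seen suggested, but have not verified, is to pass the jar via `--driver-class-path` in addition to `--jars`, since `java.sql.DriverManager` resolves drivers on the driver JVM's own classpath:

```shell
# Sketch (unverified): put the connector jar on the driver's classpath
# as well as shipping it to executors. The jar path assumes the file is
# in the current directory, as in the attempts above.
spark-shell --driver-class-path mysql-connector-java-5.0.8-bin.jar \
            --jars mysql-connector-java-5.0.8-bin.jar
```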

My Spark version is 1.3.0, and
`Class.forName("com.mysql.jdbc.Driver").newInstance` works, so the
driver class itself is visible from the shell.
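Since the driver class loads, another workaround I have seen mentioned (untested on my side, and assuming the JDBC data source in 1.3 accepts a "driver" option) is to name the driver class explicitly in the options map:

```scala
// Sketch (unverified): name the driver class explicitly via the
// "driver" option, so the JDBC source does not rely on
// DriverManager's automatic lookup. Assumes the jar is already
// shipped with --jars as above.
val jdbcDF = sqlContext.load("jdbc", Map(
  "url" -> "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456",
  "driver" -> "com.mysql.jdbc.Driver",
  "dbtable" -> "video"))
```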



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-DataFrame-with-MySQL-tp22178.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
