Hi Everyone,

I'm building a prototype that essentially pulls data from a MySQL
instance, crunches some numbers, and then passes it on down the pipeline.
I've been using SBT with the sbt-assembly plugin to build a single jar for
deployment.

I've gone through the paces of stomping out many dependency problems and
have come down to one last (hopefully) zinger.

java.lang.ClassNotFoundException: Failed to load class for data source: jdbc
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
    at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
    at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
    at her.recommender.getDataframe(her.recommender.scala:45)
    at her.recommender.getRecommendations(her.recommender.scala:60)

I'm assuming this has to do with mysql-connector, because it's the same
problem I run into in spark-shell when I forget to include the
mysql-connector jar on my classpath.
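For context, the failing call looks roughly like this (the URL, table name,
and driver class here are placeholders, not my real connection details):

```scala
// Rough sketch of my getDataframe call -- values are placeholders
val myOptions = Map(
  "url"     -> "jdbc:mysql://localhost:3306/mydb",
  "dbtable" -> "ratings",
  "driver"  -> "com.mysql.jdbc.Driver"  // naming the driver explicitly
)
// This is the line that throws the ClassNotFoundException
val df = sqlContext.load("jdbc", myOptions)
```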

I've tried:

   - Using different versions of mysql-connector-java in my build.sbt file
   - Copying the connector jar to my_project/src/main/lib
   - Copying the connector jar to my_project/lib <-- (this is where I keep
   my build.sbt)
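In case it helps, here's roughly the relevant part of my build.sbt (the
version numbers are just what I happen to be on, not a recommendation):

```scala
// build.sbt (excerpt) -- versions are placeholders
libraryDependencies ++= Seq(
  // Spark marked "provided" so it's not bundled into the fat jar
  "org.apache.spark" %% "spark-sql" % "1.5.2" % "provided",
  // The connector is NOT "provided", so sbt-assembly should bundle it
  "mysql" % "mysql-connector-java" % "5.1.38"
)
```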

Everything loads fine and works, except the call to
sqlContext.load("jdbc", myOptions). I know this is a total newbie
question, but in my defense I'm fairly new to Scala, and this is my first
go at deploying a fat jar with sbt-assembly.

Thanks for any advice!

-- 
David Yerrington
yerrington.net
