imho, if you succeeded in fetching something from your MySQL with the same jar on the classpath, then the Manifest is OK and you should indeed look at your Spark SQL JDBC configs.
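e.g. something along these lines is what I'd double-check (just a sketch; the url/dbtable values are placeholders for yours):

    val df = sqlContext.load("jdbc", Map(
      "url"     -> "jdbc:mysql://host:3306/mydb",  // placeholder
      "driver"  -> "com.mysql.jdbc.Driver",        // worth making the driver explicit
      "dbtable" -> "my_table"                      // placeholder
    ))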

On 22 December 2015 at 12:21, David Yerrington <da...@yerrington.net> wrote:

> Igor, I think it's available.  After extracting the jar file, I see a directory with very relevant-looking class files under "/com/mysql/jdbc".
>
> After reading this, I started to wonder whether the MySQL connector was really the problem.  Perhaps it's something to do with the SQLContext?  I just wired a test endpoint to run a very basic MySQL query, outside of Spark, and it worked just fine (yay!).  To verify that my MySQL connector was available, I copied and pasted this example, and it also ran without issue:
> https://mkaz.github.io/2011/05/27/using-scala-with-jdbc-to-connect-to-mysql/
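>
> For anyone curious, the core of that sanity check is roughly the following (a sketch; the connection details are placeholders):
>
>     import java.sql.DriverManager
>
>     Class.forName("com.mysql.jdbc.Driver")  // throws if the connector jar is missing
>     val conn = DriverManager.getConnection(
>       "jdbc:mysql://localhost:3306/mydb", "user", "secret")  // placeholders
>     val rs = conn.createStatement().executeQuery("SELECT 1")
>     while (rs.next()) println(rs.getInt(1))
>     conn.close()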
>
> As far as the Maven manifest goes, I'm really not sure; I will research it, though.  Now I'm wondering if my mergeStrategy is to blame, so I'm going to try there next.
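>
> In case it helps anyone searching later, the kind of change I mean is something like this in build.sbt (just a sketch, following sbt-assembly's documented pattern; if I understand it right, Spark can resolve short data source names like "jdbc" through META-INF/services files, so a merge strategy that discards those could cause exactly this error):
>
>     assemblyMergeStrategy in assembly := {
>       // concatenate service registrations instead of dropping them
>       case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
>       case x =>
>         val oldStrategy = (assemblyMergeStrategy in assembly).value
>         oldStrategy(x)
>     }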
>
> Thank you for the help!
>
> On Tue, Dec 22, 2015 at 1:18 AM, Igor Berman <igor.ber...@gmail.com> wrote:
>
>> David, can you verify that the mysql connector classes are indeed in your
>> single jar?  Open it with whatever zip tool is available on your platform
>> (e.g. jar tf your-assembly.jar | grep -i mysql).
>>
>> Another thing that might be a problem: if some dependency is declared in
>> the MANIFEST (though I'm not sure this is the case for the mysql connector),
>> it might get broken while preparing the single jar, so you need to verify
>> that it's OK.  (In Maven it's usually possible to define a merging policy
>> for resources when creating a single jar.)
>>
>> On 22 December 2015 at 10:04, Vijay Kiran <m...@vijaykiran.com> wrote:
>>
>>> Can you paste your libraryDependencies from build.sbt ?
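>>>
>>> For comparison, something along these lines is what I'd expect to see there (a sketch; versions are only examples):
>>>
>>>     libraryDependencies ++= Seq(
>>>       "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
>>>       "org.apache.spark" %% "spark-sql"  % "1.5.2" % "provided",
>>>       "mysql"            %  "mysql-connector-java" % "5.1.37"
>>>     )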
>>>
>>> ./Vijay
>>>
>>> > On 22 Dec 2015, at 06:12, David Yerrington <da...@yerrington.net> wrote:
>>> >
>>> > Hi Everyone,
>>> >
>>> > I'm building a prototype that fundamentally grabs data from a MySQL instance, crunches some numbers, and then moves it on down the pipeline.  I've been using SBT with the sbt-assembly plugin to build a single jar for deployment.
>>> >
>>> > I've gone through the paces of stomping out many dependency problems and have come down to one last (hopefully) zinger.
>>> >
>>> > java.lang.ClassNotFoundException: Failed to load class for data source: jdbc.
>>> >
>>> >     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
>>> >     at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
>>> >     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>>> >     at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
>>> >     at her.recommender.getDataframe(her.recommender.scala:45)
>>> >     at her.recommender.getRecommendations(her.recommender.scala:60)
>>> >
>>> >
>>> > I'm assuming this has to do with the mysql-connector, because this is the problem I run into when I'm working in spark-shell and forget to include my mysql-connector jar on the classpath.
>>> >
>>> > I've tried:
>>> >   • Using different versions of mysql-connector-java in my build.sbt file
>>> >   • Copying the connector jar to my_project/src/main/lib
>>> >   • Copying the connector jar to my_project/lib  <-- (this is where I keep my build.sbt)
>>> > Everything loads fine and works, except my call to sqlContext.load("jdbc", myOptions).  I know this is a total newbie question, but in my defense, I'm fairly new to Scala and this is my first go at deploying a fat jar with sbt-assembly.
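>>> >
>>> > Concretely, the failing call is essentially this (option values are placeholders):
>>> >
>>> >     val myOptions = Map(
>>> >       "url"     -> "jdbc:mysql://localhost:3306/mydb",  // placeholder
>>> >       "driver"  -> "com.mysql.jdbc.Driver",
>>> >       "dbtable" -> "my_table"                           // placeholder
>>> >     )
>>> >     val df = sqlContext.load("jdbc", myOptions)  // throws the ClassNotFoundException above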
>>> >
>>> > Thanks for any advice!
>>> >
>>> > --
>>> > David Yerrington
>>> > yerrington.net
>>>
>>>
>>
>
>
> --
> David Yerrington
> yerrington.net
>
