Re: Fat jar can't find jdbc

2015-12-25 Thread Chris Fregly
JDBC drivers need to be on the system classpath.

Try passing --jars /path/to/local/mysql-connector.jar when you submit the
job.

This will also copy the jar to each of the worker nodes and should set you
straight.
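
For example (the main class, paths, and jar names here are hypothetical;
adjust them to your setup):

  spark-submit \
    --class her.recommender \
    --jars /path/to/local/mysql-connector-java-5.1.35.jar \
    --driver-class-path /path/to/local/mysql-connector-java-5.1.35.jar \
    target/scala-2.11/recommender-assembly-0.1.jar

Adding --driver-class-path as well is sometimes necessary, since JDBC's
DriverManager can be picky about which classloader loaded the driver class.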

On Tue, Dec 22, 2015 at 11:42 AM, Igor Berman  wrote:

> IMHO, if you managed to fetch something from your MySQL with the same jar
> on the classpath, then the manifest is OK, and you should indeed look at
> your Spark SQL JDBC configs.
>
> On 22 December 2015 at 12:21, David Yerrington 
> wrote:
>
>> Igor, I think it's available.  After I extract the jar file, I see a
>> directory with class files that look very relevant in "/com/mysql/jdbc".
>>
>> After reading this, I started to wonder whether the MySQL connector was
>> really the problem.  Perhaps it's something to do with SQLContext?  I just
>> wired up a test endpoint to run a very basic MySQL query outside of Spark,
>> and it worked just fine (yay!).  I copied and pasted this example to verify
>> that my MySQL connector is available:
>> https://mkaz.github.io/2011/05/27/using-scala-with-jdbc-to-connect-to-mysql/
>>
>> As far as the Maven manifest goes, I'm really not sure, but I will
>> research it.  Now I'm wondering whether my mergeStrategy is to blame; I'm
>> going to try there next.
>>
>> Thank you for the help!
>>
>> On Tue, Dec 22, 2015 at 1:18 AM, Igor Berman 
>> wrote:
>>
>>> David, can you verify that the mysql connector classes are indeed in
>>> your single jar?  Open it with whatever zip tool is available on your
>>> platform.
>>>
>>> Another option that might be a problem: if there is some dependency in
>>> the MANIFEST (though I'm not sure this is the case for the mysql
>>> connector), it might be broken after preparing the single jar, so you
>>> need to verify that it's OK (in Maven it's usually possible to define a
>>> merging policy for resources while creating the single jar).
>>>
>>> On 22 December 2015 at 10:04, Vijay Kiran  wrote:
>>>
 Can you paste your libraryDependencies from build.sbt ?

 ./Vijay

 > On 22 Dec 2015, at 06:12, David Yerrington 
 wrote:
 >
 > Hi Everyone,
 >
 > I'm building a prototype that fundamentally grabs data from a MySQL
 instance, crunches some numbers, and then moves it on down the pipeline.
 I've been using SBT with assembly tool to build a single jar for 
 deployment.
 >
 > I've gone through the paces of stomping out many dependency problems
 and have come down to one last (hopefully) zinger.
 >
 > java.lang.ClassNotFoundException: Failed to load class for data
 source: jdbc.
 >
 > at
 org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
 >
 > at
 org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
 >
 > at
 org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
 >
 > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
 >
 > at her.recommender.getDataframe(her.recommender.scala:45)
 >
 > at her.recommender.getRecommendations(her.recommender.scala:60)
 >
 >
 > I'm assuming this has to do with mysql-connector because this is the
 problem I run into when I'm working with spark-shell and I forget to
 include my classpath with my mysql-connector jar file.
 >
 > I've tried:
 >   • Using different versions of mysql-connector-java in my
 build.sbt file
 >   • Copying the connector jar to my_project/src/main/lib
 >   • Copying the connector jar to my_project/lib <-- (this is
 where I keep my build.sbt)
 > Everything loads fine and works, except my call that does
 "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
 question but in my defense, I'm fairly new to Scala, and this is my first
 go at deploying a fat jar with sbt-assembly.
 >
 > Thanks for any advice!
 >
 > --
 > David Yerrington
 > yerrington.net


 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org


>>>
>>
>>
>> --
>> David Yerrington
>> yerrington.net
>>
>
>


-- 

*Chris Fregly*
Principal Data Solutions Engineer
IBM Spark Technology Center, San Francisco, CA
http://spark.tc | http://advancedspark.com


Re: Fat jar can't find jdbc

2015-12-22 Thread David Yerrington
Igor, I think it's available.  After I extract the jar file, I see a
directory with class files that look very relevant in "/com/mysql/jdbc".

After reading this, I started to wonder whether the MySQL connector was
really the problem.  Perhaps it's something to do with SQLContext?  I just
wired up a test endpoint to run a very basic MySQL query outside of Spark,
and it worked just fine (yay!).  I copied and pasted this example to verify
that my MySQL connector is available:
https://mkaz.github.io/2011/05/27/using-scala-with-jdbc-to-connect-to-mysql/
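
The check was essentially along these lines (host, database, and
credentials below are placeholders):

  import java.sql.DriverManager

  // Plain JDBC, no Spark involved: this just proves the driver class
  // resolves and a connection can be opened from the assembled jar.
  Class.forName("com.mysql.jdbc.Driver")
  val conn = DriverManager.getConnection(
    "jdbc:mysql://localhost:3306/mydb", "user", "password")
  val rs = conn.createStatement().executeQuery("SELECT 1")
  while (rs.next()) println(rs.getInt(1))
  conn.close()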

As far as the Maven manifest goes, I'm really not sure, but I will research
it.  Now I'm wondering whether my mergeStrategy is to blame; I'm going to
try there next.

Thank you for the help!

On Tue, Dec 22, 2015 at 1:18 AM, Igor Berman  wrote:

> David, can you verify that the mysql connector classes are indeed in your
> single jar?  Open it with whatever zip tool is available on your platform.
>
> Another option that might be a problem: if there is some dependency in the
> MANIFEST (though I'm not sure this is the case for the mysql connector), it
> might be broken after preparing the single jar, so you need to verify that
> it's OK (in Maven it's usually possible to define a merging policy for
> resources while creating the single jar).
>
> On 22 December 2015 at 10:04, Vijay Kiran  wrote:
>
>> Can you paste your libraryDependencies from build.sbt ?
>>
>> ./Vijay
>>
>> > On 22 Dec 2015, at 06:12, David Yerrington 
>> wrote:
>> >
>> > Hi Everyone,
>> >
>> > I'm building a prototype that fundamentally grabs data from a MySQL
>> instance, crunches some numbers, and then moves it on down the pipeline.
>> I've been using SBT with assembly tool to build a single jar for deployment.
>> >
>> > I've gone through the paces of stomping out many dependency problems
>> and have come down to one last (hopefully) zinger.
>> >
>> > java.lang.ClassNotFoundException: Failed to load class for data source:
>> jdbc.
>> >
>> > at
>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
>> >
>> > at
>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
>> >
>> > at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>> >
>> > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
>> >
>> > at her.recommender.getDataframe(her.recommender.scala:45)
>> >
>> > at her.recommender.getRecommendations(her.recommender.scala:60)
>> >
>> >
>> > I'm assuming this has to do with mysql-connector because this is the
>> problem I run into when I'm working with spark-shell and I forget to
>> include my classpath with my mysql-connector jar file.
>> >
>> > I've tried:
>> >   • Using different versions of mysql-connector-java in my
>> build.sbt file
>> >   • Copying the connector jar to my_project/src/main/lib
>> >   • Copying the connector jar to my_project/lib <-- (this is where
>> I keep my build.sbt)
>> > Everything loads fine and works, except my call that does
>> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
>> question but in my defense, I'm fairly new to Scala, and this is my first
>> go at deploying a fat jar with sbt-assembly.
>> >
>> > Thanks for any advice!
>> >
>> > --
>> > David Yerrington
>> > yerrington.net
>>
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>


-- 
David Yerrington
yerrington.net


Re: Fat jar can't find jdbc

2015-12-22 Thread Vijay Kiran
Can you paste your libraryDependencies from build.sbt?

./Vijay

> On 22 Dec 2015, at 06:12, David Yerrington  wrote:
> 
> Hi Everyone,
> 
> I'm building a prototype that fundamentally grabs data from a MySQL instance, 
> crunches some numbers, and then moves it on down the pipeline.  I've been 
> using SBT with assembly tool to build a single jar for deployment.
> 
> I've gone through the paces of stomping out many dependency problems and have 
> come down to one last (hopefully) zinger.  
> 
> java.lang.ClassNotFoundException: Failed to load class for data source: jdbc.
> 
> at 
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
> 
> at 
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
> 
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> 
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
> 
> at her.recommender.getDataframe(her.recommender.scala:45)
> 
> at her.recommender.getRecommendations(her.recommender.scala:60)
> 
> 
> I'm assuming this has to do with mysql-connector because this is the problem 
> I run into when I'm working with spark-shell and I forget to include my 
> classpath with my mysql-connector jar file.
> 
> I've tried:
>   • Using different versions of mysql-connector-java in my build.sbt file
>   • Copying the connector jar to my_project/src/main/lib
>   • Copying the connector jar to my_project/lib <-- (this is where I keep 
> my build.sbt)
> Everything loads fine and works, except my call that does 
> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie question 
> but in my defense, I'm fairly new to Scala, and this is my first go at 
> deploying a fat jar with sbt-assembly.
> 
> Thanks for any advice!
> 
> -- 
> David Yerrington
> yerrington.net


-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Fat jar can't find jdbc

2015-12-22 Thread David Yerrington
Sure, here it is:

import AssemblyKeys._

assemblySettings

// assemblyJarName in assembly := "recommender.jar"

test in assembly := {}

organization  := "com.example"

version   := "0.1"

scalaVersion  := "2.11.6"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")

libraryDependencies ++= {
  val akkaV = "2.3.9"
  val sprayV = "1.3.3"
  Seq(
"org.apache.spark" %% "spark-core" % "1.5.2",
"org.apache.spark" %% "spark-sql" % "1.5.2",
"org.apache.spark" %% "spark-hive" % "1.5.2",
"org.apache.spark" %% "spark-streaming" % "1.5.2",
"org.apache.spark" %% "spark-streaming-kafka" % "1.5.2",
"org.apache.spark" %% "spark-streaming-flume" % "1.5.2",
"org.apache.spark" %% "spark-mllib" % "1.5.2",
"org.apache.commons" % "commons-lang3" % "3.0",
"io.spray"%%  "spray-can" % sprayV,
"io.spray"%%  "spray-routing" % sprayV,
"io.spray"%%  "spray-testkit" % sprayV  % "test",
"io.spray"%%  "spray-json"% "1.3.2",
"com.typesafe.akka"   %%  "akka-actor"% akkaV,
"com.typesafe.akka"   %%  "akka-testkit"  % akkaV   % "test",
"com.zaxxer"  %   "HikariCP-java6"% "2.3.3",
"com.typesafe.slick"  %%  "slick" % "3.1.0",
"org.specs2"  %%  "specs2-core"   % "2.3.11" % "test",
"mysql"   %   "mysql-connector-java" % "5.1.35",
"org.slf4j"   %   "slf4j-nop" % "1.6.4",
"net.liftweb" %%  "lift-json" % "2.6+",
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.0"
  )
}

// For the jackson resolves business
resolvers += "Sonatype OSS Snapshots" at "
https://oss.sonatype.org/content/repositories/snapshots;

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
case m if m.startsWith("META-INF") => MergeStrategy.discard
case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
case PathList("org", "apache", xs @ _*) => MergeStrategy.first
case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
case "about.html"  => MergeStrategy.rename
case "reference.conf" => MergeStrategy.concat
case _ => MergeStrategy.first
  }
}

Revolver.settings
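
(A side note on the mergeStrategy above: the blanket
case m if m.startsWith("META-INF") => MergeStrategy.discard rule also throws
away META-INF/services, and Spark 1.5 resolves short data source names such
as "jdbc" through ServiceLoader entries under
META-INF/services/org.apache.spark.sql.sources.DataSourceRegister.  A sketch
of a strategy that keeps those registrations, assuming the same sbt-assembly
syntax as in the build above:

  mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
    {
      // Concatenate ServiceLoader registration files instead of discarding
      // them, so data source lookups by short name keep working.
      case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
      case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
      case m if m.startsWith("META-INF") => MergeStrategy.discard
      case "reference.conf" => MergeStrategy.concat
      case _ => MergeStrategy.first
    }
  }

Whether this is the actual culprit here is unverified, but it is a common
cause of "Failed to load class for data source" errors with fat jars.)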




Also, when I run "assembly", I see this, with no warnings or errors:

...

[info] Including: datanucleus-rdbms-3.2.9.jar
[info] Including: reactive-streams-1.0.0.jar
[info] Including: mysql-connector-java-5.1.35.jar
[info] Including: commons-pool-1.5.4.jar
[info] Including: commons-dbcp-1.4.jar

...


On Tue, Dec 22, 2015 at 12:04 AM, Vijay Kiran  wrote:

> Can you paste your libraryDependencies from build.sbt ?
>
> ./Vijay
>
> > On 22 Dec 2015, at 06:12, David Yerrington  wrote:
> >
> > Hi Everyone,
> >
> > I'm building a prototype that fundamentally grabs data from a MySQL
> instance, crunches some numbers, and then moves it on down the pipeline.
> I've been using SBT with assembly tool to build a single jar for deployment.
> >
> > I've gone through the paces of stomping out many dependency problems and
> have come down to one last (hopefully) zinger.
> >
> > java.lang.ClassNotFoundException: Failed to load class for data source:
> jdbc.
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
> >
> > at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> >
> > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
> >
> > at her.recommender.getDataframe(her.recommender.scala:45)
> >
> > at her.recommender.getRecommendations(her.recommender.scala:60)
> >
> >
> > I'm assuming this has to do with mysql-connector because this is the
> problem I run into when I'm working with spark-shell and I forget to
> include my classpath with my mysql-connector jar file.
> >
> > I've tried:
> >   • Using different versions of mysql-connector-java in my build.sbt
> file
> >   • Copying the connector jar to my_project/src/main/lib
> >   • Copying the connector jar to my_project/lib <-- (this is where I
> keep my build.sbt)
> > Everything loads fine and works, except my call that does
> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
> question but in my defense, I'm fairly new to Scala, and this is my first
> go at deploying a fat jar with sbt-assembly.
> >
> > Thanks for any advice!
> >
> > --
> > David Yerrington
> > yerrington.net
>
>


-- 
David Yerrington
yerrington.net


Re: Fat jar can't find jdbc

2015-12-22 Thread Igor Berman
David, can you verify that the mysql connector classes are indeed in your
single jar?  Open it with whatever zip tool is available on your platform.

Another option that might be a problem: if there is some dependency in the
MANIFEST (though I'm not sure this is the case for the mysql connector), it
might be broken after preparing the single jar, so you need to verify that
it's OK (in Maven it's usually possible to define a merging policy for
resources while creating the single jar).
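
For example (the assembly jar name here is hypothetical):

  jar tf recommender-assembly-0.1.jar | grep -i 'com/mysql/jdbc'
  unzip -l recommender-assembly-0.1.jar | grep -i 'META-INF/services'

The first command confirms the driver classes made it into the fat jar; the
second shows whether ServiceLoader registration files survived the merge.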

On 22 December 2015 at 10:04, Vijay Kiran  wrote:

> Can you paste your libraryDependencies from build.sbt ?
>
> ./Vijay
>
> > On 22 Dec 2015, at 06:12, David Yerrington  wrote:
> >
> > Hi Everyone,
> >
> > I'm building a prototype that fundamentally grabs data from a MySQL
> instance, crunches some numbers, and then moves it on down the pipeline.
> I've been using SBT with assembly tool to build a single jar for deployment.
> >
> > I've gone through the paces of stomping out many dependency problems and
> have come down to one last (hopefully) zinger.
> >
> > java.lang.ClassNotFoundException: Failed to load class for data source:
> jdbc.
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
> >
> > at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> >
> > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
> >
> > at her.recommender.getDataframe(her.recommender.scala:45)
> >
> > at her.recommender.getRecommendations(her.recommender.scala:60)
> >
> >
> > I'm assuming this has to do with mysql-connector because this is the
> problem I run into when I'm working with spark-shell and I forget to
> include my classpath with my mysql-connector jar file.
> >
> > I've tried:
> >   • Using different versions of mysql-connector-java in my build.sbt
> file
> >   • Copying the connector jar to my_project/src/main/lib
> >   • Copying the connector jar to my_project/lib <-- (this is where I
> keep my build.sbt)
> > Everything loads fine and works, except my call that does
> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
> question but in my defense, I'm fairly new to Scala, and this is my first
> go at deploying a fat jar with sbt-assembly.
> >
> > Thanks for any advice!
> >
> > --
> > David Yerrington
> > yerrington.net
>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


Re: Fat jar can't find jdbc

2015-12-22 Thread Igor Berman
IMHO, if you managed to fetch something from your MySQL with the same jar on
the classpath, then the manifest is OK, and you should indeed look at your
Spark SQL JDBC configs.
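
A minimal sketch of what those configs look like in Spark 1.5 (URL, table,
and credentials are placeholders; the "driver" option makes the driver class
explicit rather than relying on auto-registration):

  val df = sqlContext.read
    .format("jdbc")
    .option("url", "jdbc:mysql://localhost:3306/mydb")
    .option("dbtable", "my_table")
    .option("user", "user")
    .option("password", "password")
    .option("driver", "com.mysql.jdbc.Driver")
    .load()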

On 22 December 2015 at 12:21, David Yerrington  wrote:

> Igor, I think it's available.  After I extract the jar file, I see a
> directory with class files that look very relevant in "/com/mysql/jdbc".
>
> After reading this, I started to wonder whether the MySQL connector was
> really the problem.  Perhaps it's something to do with SQLContext?  I just
> wired up a test endpoint to run a very basic MySQL query outside of Spark,
> and it worked just fine (yay!).  I copied and pasted this example to verify
> that my MySQL connector is available:
> https://mkaz.github.io/2011/05/27/using-scala-with-jdbc-to-connect-to-mysql/
>
> As far as the Maven manifest goes, I'm really not sure, but I will
> research it.  Now I'm wondering whether my mergeStrategy is to blame; I'm
> going to try there next.
>
> Thank you for the help!
>
> On Tue, Dec 22, 2015 at 1:18 AM, Igor Berman 
> wrote:
>
>> David, can you verify that the mysql connector classes are indeed in your
>> single jar?  Open it with whatever zip tool is available on your platform.
>>
>> Another option that might be a problem: if there is some dependency in the
>> MANIFEST (though I'm not sure this is the case for the mysql connector), it
>> might be broken after preparing the single jar, so you need to verify that
>> it's OK (in Maven it's usually possible to define a merging policy for
>> resources while creating the single jar).
>>
>> On 22 December 2015 at 10:04, Vijay Kiran  wrote:
>>
>>> Can you paste your libraryDependencies from build.sbt ?
>>>
>>> ./Vijay
>>>
>>> > On 22 Dec 2015, at 06:12, David Yerrington 
>>> wrote:
>>> >
>>> > Hi Everyone,
>>> >
>>> > I'm building a prototype that fundamentally grabs data from a MySQL
>>> instance, crunches some numbers, and then moves it on down the pipeline.
>>> I've been using SBT with assembly tool to build a single jar for deployment.
>>> >
>>> > I've gone through the paces of stomping out many dependency problems
>>> and have come down to one last (hopefully) zinger.
>>> >
>>> > java.lang.ClassNotFoundException: Failed to load class for data
>>> source: jdbc.
>>> >
>>> > at
>>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
>>> >
>>> > at
>>> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
>>> >
>>> > at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>>> >
>>> > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
>>> >
>>> > at her.recommender.getDataframe(her.recommender.scala:45)
>>> >
>>> > at her.recommender.getRecommendations(her.recommender.scala:60)
>>> >
>>> >
>>> > I'm assuming this has to do with mysql-connector because this is the
>>> problem I run into when I'm working with spark-shell and I forget to
>>> include my classpath with my mysql-connector jar file.
>>> >
>>> > I've tried:
>>> >   • Using different versions of mysql-connector-java in my
>>> build.sbt file
>>> >   • Copying the connector jar to my_project/src/main/lib
>>> >   • Copying the connector jar to my_project/lib <-- (this is where
>>> I keep my build.sbt)
>>> > Everything loads fine and works, except my call that does
>>> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
>>> question but in my defense, I'm fairly new to Scala, and this is my first
>>> go at deploying a fat jar with sbt-assembly.
>>> >
>>> > Thanks for any advice!
>>> >
>>> > --
>>> > David Yerrington
>>> > yerrington.net
>>>
>>>
>>> -
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
>
> --
> David Yerrington
> yerrington.net
>


Fat jar can't find jdbc

2015-12-21 Thread David Yerrington
Hi Everyone,

I'm building a prototype that fundamentally grabs data from a MySQL
instance, crunches some numbers, and then moves it on down the pipeline.
I've been using SBT with the sbt-assembly plugin to build a single jar for
deployment.

I've gone through the paces of stomping out many dependency problems and
have come down to one last (hopefully) zinger.

> java.lang.ClassNotFoundException: Failed to load class for data source:
> jdbc.
>
> at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
>
> at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
>
> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
>
> at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
>
> at her.recommender.getDataframe(her.recommender.scala:45)
>
> at her.recommender.getRecommendations(her.recommender.scala:60)
>

I'm assuming this has to do with mysql-connector, because this is the
problem I run into when I'm working in spark-shell and forget to include
my mysql-connector jar file on the classpath.

I've tried:

   - Using different versions of mysql-connector-java in my build.sbt file
   - Copying the connector jar to my_project/src/main/lib
   - Copying the connector jar to my_project/lib <-- (this is where I keep
   my build.sbt)

Everything loads fine and works, except my call that does
"sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
question but in my defense, I'm fairly new to Scala, and this is my first
go at deploying a fat jar with sbt-assembly.
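
For context, the failing call has roughly this shape (option values are
placeholders; in Spark 1.5 this load overload is deprecated in favor of
sqlContext.read.format("jdbc").options(...).load()):

  val myOptions = Map(
    "url"     -> "jdbc:mysql://localhost:3306/mydb",
    "dbtable" -> "my_table",
    "driver"  -> "com.mysql.jdbc.Driver"
  )
  val df = sqlContext.load("jdbc", myOptions)  // deprecated since Spark 1.4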

Thanks for any advice!

-- 
David Yerrington
yerrington.net