Sure, here it is:

import AssemblyKeys._

assemblySettings

// assemblyJarName in assembly := "recommender.jar"

test in assembly := {}

organization  := "com.example"

version       := "0.1"

scalaVersion  := "2.11.6"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")

libraryDependencies ++= {
  val akkaV = "2.3.9"
  val sprayV = "1.3.3"
  Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2",
    "org.apache.spark" %% "spark-sql" % "1.5.2",
    "org.apache.spark" %% "spark-hive" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2",
    "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2",
    "org.apache.spark" %% "spark-streaming-flume" % "1.5.2",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.commons" % "commons-lang3" % "3.0",
    "io.spray"            %%  "spray-can"     % sprayV,
    "io.spray"            %%  "spray-routing" % sprayV,
    "io.spray"            %%  "spray-testkit" % sprayV  % "test",
    "io.spray"            %%  "spray-json"    % "1.3.2",
    "com.typesafe.akka"   %%  "akka-actor"    % akkaV,
    "com.typesafe.akka"   %%  "akka-testkit"  % akkaV   % "test",
    "com.zaxxer"          %   "HikariCP-java6"% "2.3.3",
    "com.typesafe.slick"  %%  "slick"         % "3.1.0",
    "org.specs2"          %%  "specs2-core"   % "2.3.11" % "test",
    "mysql"               %   "mysql-connector-java" % "5.1.35",
    "org.slf4j"           %   "slf4j-nop"     % "1.6.4",
    "net.liftweb"         %%  "lift-json"     % "2.6+",
    "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.0"
  )
}

// Resolver needed for the Jackson dependencies
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.startsWith("META-INF") => MergeStrategy.discard
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.first
    case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
    case "about.html"  => MergeStrategy.rename
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
  }
}

Revolver.settings
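
One thing I'm not sure about in the merge strategy above: the blanket META-INF discard also throws away the META-INF/services/* registration files, which ServiceLoader-based lookups (JDBC drivers, Spark's data source registry) read at runtime. Just a sketch of a variant I might try, in the same old-style syntax, untested:

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    // keep ServiceLoader registration files instead of discarding them with the rest of META-INF
    case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
    case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
    case m if m.startsWith("META-INF") => MergeStrategy.discard
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.first
    case PathList("org", "jboss", xs @ _*) => MergeStrategy.first
    case "about.html"  => MergeStrategy.rename
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
  }
}

(The services case has to come before the blanket META-INF discard, or it never matches.)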




Also, when I run "assembly", I see this, with no warnings or errors:

...

[info] Including: datanucleus-rdbms-3.2.9.jar
[info] Including: reactive-streams-1.0.0.jar
*[info] Including: mysql-connector-java-5.1.35.jar*
[info] Including: commons-pool-1.5.4.jar
[info] Including: commons-dbcp-1.4.jar

...
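
As a quick sanity check (just a sketch, nothing I've actually wired in yet; the object name is made up), loading the driver class explicitly from the assembled jar should tell me whether it's really on the runtime classpath:

// hypothetical check, run with e.g. spark-submit --class DriverCheck <assembled-jar>
object DriverCheck {
  def main(args: Array[String]): Unit = {
    // throws ClassNotFoundException if mysql-connector-java didn't make it into the jar
    Class.forName("com.mysql.jdbc.Driver")
    println("com.mysql.jdbc.Driver is on the classpath")
  }
}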


On Tue, Dec 22, 2015 at 12:04 AM, Vijay Kiran <m...@vijaykiran.com> wrote:

> Can you paste your libraryDependencies from build.sbt ?
>
> ./Vijay
>
> > On 22 Dec 2015, at 06:12, David Yerrington <da...@yerrington.net> wrote:
> >
> > Hi Everyone,
> >
> > I'm building a prototype that fundamentally grabs data from a MySQL
> instance, crunches some numbers, and then moves it on down the pipeline.
> I've been using SBT with assembly tool to build a single jar for deployment.
> >
> > I've gone through the paces of stomping out many dependency problems and
> have come down to one last (hopefully) zinger.
> >
> > java.lang.ClassNotFoundException: Failed to load class for data source:
> jdbc.
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:67)
> >
> > at
> org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:87)
> >
> > at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
> >
> > at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
> >
> > at her.recommender.getDataframe(her.recommender.scala:45)
> >
> > at her.recommender.getRecommendations(her.recommender.scala:60)
> >
> >
> > I'm assuming this has to do with mysql-connector because this is the
> problem I run into when I'm working with spark-shell and I forget to
> include my classpath with my mysql-connect jar file.
> >
> > I've tried:
> >       • Using different versions of mysql-connector-java in my build.sbt
> file
> >       • Copying the connector jar to my_project/src/main/lib
> >       • Copying the connector jar to my_project/lib <-- (this is where I
> keep my build.sbt)
> > Everything loads fine and works, except my call that does
> "sqlContext.load("jdbc", myOptions)".  I know this is a total newbie
> question but in my defense, I'm fairly new to Scala, and this is my first
> go at deploying a fat jar with sbt-assembly.
> >
> > Thanks for any advice!
> >
> > --
> > David Yerrington
> > yerrington.net
>
>


-- 
David Yerrington
yerrington.net
