You could build a fat jar for your application containing both your
code and the json4s library, and then run Spark with these two
options:

  spark.driver.userClassPathFirst=true
  spark.executor.userClassPathFirst=true

Both options only work in Spark 1.3. (Spark 1.2 has
spark.files.userClassPathFirst, but that only applies to executors.)
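
For example, assuming the assembly jar keeps sbt-assembly's default
name of sparkapp-assembly-1.0.jar (your build below doesn't pin the
jar name explicitly), the submit command would look roughly like:

  sbt assembly
  spark-submit \
    --class App1 \
    --conf spark.driver.userClassPathFirst=true \
    --conf spark.executor.userClassPathFirst=true \
    sparkapp-assembly-1.0.jar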


On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
<alexey.zinov...@gmail.com> wrote:
> Spark has a dependency on json4s 3.2.10, but that version has several bugs
> and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
> build.sbt and everything compiled fine. But when I spark-submit my JAR, it
> still picks up 3.2.10.
>
>
> build.sbt
>
> import sbt.Keys._
>
> name := "sparkapp"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
>
> libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
>
>
> plugins.sbt
>
> logLevel := Level.Warn
>
> resolvers += Resolver.url("artifactory",
>   url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
>
> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
>
>
> App1.scala
>
> import org.apache.spark.rdd.RDD
> import org.apache.spark.{Logging, SparkConf, SparkContext}
> import org.apache.spark.SparkContext._
>
> object App1 extends Logging {
>   def main(args: Array[String]) = {
>     val conf = new SparkConf().setAppName("App1")
>     val sc = new SparkContext(conf)
>     println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
>   }
> }
>
>
>
> sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4
>
> Is it possible to force 3.2.11 version usage?
>
> Thanks,
> Alexey



-- 
Marcelo
