Thanks Ted, I'll try that. I hope there are no transitive dependencies on 3.2.10.
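
(A quick way to double-check that, as a sketch not verified against this exact
project: sbt 0.13.6+ ships a built-in "evicted" task, so

    sbt evicted
    sbt "show update"

should print, respectively, any version conflicts resolved by eviction and the
full resolution report; a transitive pull of json4s 3.2.10 would show up there.)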

On Tue, Mar 24, 2015 at 4:21 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> Looking at core/pom.xml :
>     <dependency>
>       <groupId>org.json4s</groupId>
>       <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
>       <version>3.2.10</version>
>     </dependency>
>
> The version is hard-coded.
>
> You can rebuild Spark 1.3.0 with json4s 3.2.11.
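>
> A sketch of that (assuming a checkout of the Spark 1.3.0 sources, and that no
> other module pins json4s separately): change the <version> in the block above
> from 3.2.10 to 3.2.11, then rebuild and repackage, e.g.
>
>     mvn -DskipTests clean package
>
> plus whatever Hadoop/deployment profiles you normally build with, and use the
> resulting assembly in place of the stock 1.3.0 one.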
>
> Cheers
>
> On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev <
> alexey.zinov...@gmail.com> wrote:
>
>> Spark has a dependency on json4s 3.2.10, but this version has several
>> bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency
>> to build.sbt and everything compiled fine. But when I spark-submit my JAR,
>> it still provides me with 3.2.10.
>>
>>
>> build.sbt
>>
>> import sbt.Keys._
>>
>> name := "sparkapp"
>>
>> version := "1.0"
>>
>> scalaVersion := "2.10.4"
>>
>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
>>
>> libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
>>
>>
>> plugins.sbt
>>
>> logLevel := Level.Warn
>>
>> resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
>>
>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
>>
>>
>> App1.scala
>>
>> import org.apache.spark.{Logging, SparkConf, SparkContext}
>>
>> object App1 extends Logging {
>>   def main(args: Array[String]) = {
>>     val conf = new SparkConf().setAppName("App1")
>>     val sc = new SparkContext(conf)
>>     println(s"json4s version: ${org.json4s.BuildInfo.version}")
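>>     // Sketch, not part of the original program: print the jar the json4s
>>     // classes are actually loaded from, to confirm which copy wins on the
>>     // runtime classpath (Spark's bundled 3.2.10 vs. the application's 3.2.11).
>>     println(classOf[org.json4s.JValue].getProtectionDomain.getCodeSource.getLocation)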
>>   }
>> }
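>>
>> For completeness, this is roughly how the jar is built and submitted; the
>> assembly jar path below is an assumption based on the name/version/scalaVersion
>> settings above and the sbt-assembly defaults:
>>
>> sbt assembly
>> spark-submit --class App1 target/scala-2.10/sparkapp-assembly-1.0.jar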
>>
>>
>>
>> sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4
>>
>> Is it possible to force the use of json4s 3.2.11?
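>>
>> (One possible route, not verified here: Spark 1.3 documents experimental
>> spark.driver.userClassPathFirst and spark.executor.userClassPathFirst settings
>> that are meant to give classes in the application jar precedence over Spark's
>> own copies, e.g. the submit command above with
>>
>> --conf spark.driver.userClassPathFirst=true
>> --conf spark.executor.userClassPathFirst=true
>>
>> added. Another route is shading the org.json4s packages inside the assembly
>> jar, but as far as I know that needs a newer sbt-assembly than the 0.13.0
>> used above.)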
>>
>> Thanks,
>> Alexey
>>
>
>
