somehow. Can you double check that and remove the Scala
classes from your app if they're there?
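One common way to keep Spark's Scala classes out of the application JAR, as suggested above, is to mark the Spark dependency as "provided" in build.sbt so it is compiled against but not bundled (a sketch; the artifact coordinates and version here are illustrative, not taken from the thread):

```scala
// build.sbt fragment: "provided" scope keeps spark-core out of the
// assembled JAR, so spark-submit supplies Spark's own classes at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
```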
On Mon, Mar 23, 2015 at 10:07 PM, Alexey Zinoviev
alexey.zinov...@gmail.com wrote:
Thanks Marcelo, this option solved the problem (I'm using 1.3.0), but it
works only if I remove extends Logging from
<version>3.2.10</version>
</dependency>
The version is hard coded.
You can rebuild Spark 1.3.0 with json4s 3.2.11
Cheers
On Mon, Mar 23, 2015 at 2:12 PM, Alexey Zinoviev
alexey.zinov...@gmail.com wrote:
Spark has a dependency on json4s 3.2.10, but this version has several bugs
and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
build.sbt and everything compiled fine. But when I spark-submit my JAR, it
still provides me with 3.2.10.
build.sbt

import sbt.Keys._

name := "sparkapp"
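The rest of the build file is cut off in the archive. The json4s override described in the message would typically be declared along these lines (a sketch only; the Scala version and module choice are assumptions, not quoted from the original build.sbt):

```scala
// build.sbt fragment (illustrative): pin json4s-native to 3.2.11
scalaVersion := "2.10.4"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"
```

Note that declaring this in build.sbt only affects compile time; at runtime spark-submit places Spark's own classpath (with json4s 3.2.10) first, which is the conflict this thread is about.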