I figured out that Logging is a DeveloperApi and should not be used outside
Spark code, so everything is fine now. Thanks again, Marcelo.
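Since Spark's Logging trait is a DeveloperApi, one alternative is to define a small logging trait of your own instead of extending Spark's. This is only a sketch, not from the thread; it uses java.util.logging to stay dependency-free, though a real application would more likely wrap SLF4J. The names AppLogging and MyJob are placeholders:

```scala
import java.util.logging.Logger

// A minimal stand-in for Spark's Logging trait, owned by the application
// so it cannot conflict with Spark's internal classes.
trait AppLogging {
  // Lazy so the logger is created on first use, named after the concrete class.
  protected lazy val log: Logger = Logger.getLogger(getClass.getName)
}

object MyJob extends AppLogging {
  def run(): String = {
    log.info("job started")
    "done"
  }
}
```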
On 24 Mar 2015, at 20:06, Marcelo Vanzin van...@cloudera.com wrote:
From the exception it seems like your app is also repackaging Scala
classes somehow. Can you double check that and remove the Scala
classes from your app if they're there?
On Mon, Mar 23, 2015 at 10:07 PM, Alexey Zinoviev
alexey.zinov...@gmail.com wrote:
Thanks Ted, I'll try, hope there's no transitive dependencies on 3.2.10.
On Tue, Mar 24, 2015 at 4:21 AM, Ted Yu yuzhih...@gmail.com wrote:
Looking at core/pom.xml :
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
  <version>3.2.10</version>
</dependency>
Thanks Marcelo, these options solved the problem (I'm using 1.3.0), but it
works only if I remove "extends Logging" from the object; with "extends
Logging" it returns:
Exception in thread main java.lang.LinkageError: loader constraint
violation in interface itable initialization: when resolving method
Spark has a dependency on json4s 3.2.10, but this version has several bugs
and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to
build.sbt and everything compiled fine. But when I spark-submit my JAR, it
still provides me with 3.2.10.
build.sbt:

import sbt.Keys._

name := "sparkapp"
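As a rough illustration (not from the thread), a build.sbt for this setup might look like the following. The project name, Scala version, and the choice of json4s-native are assumptions; the Spark and json4s coordinates are the real artifact names:

```scala
name := "sparkapp"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark itself is supplied by the cluster at runtime, so mark it "provided"
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  // The newer json4s the application needs; Spark ships 3.2.10
  "org.json4s" %% "json4s-native" % "3.2.11"
)
```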
Looking at core/pom.xml :
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-jackson_${scala.binary.version}</artifactId>
  <version>3.2.10</version>
</dependency>
The version is hard-coded.
You can rebuild Spark 1.3.0 with json4s 3.2.11
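A minimal sketch of that rebuild, assuming you are in a Spark 1.3.0 source tree and that 3.2.10 appears only in the json4s dependency block of core/pom.xml (worth verifying before running the sed):

```shell
# Bump the hard-coded json4s version in core/pom.xml
sed -i 's|<version>3.2.10</version>|<version>3.2.11</version>|' core/pom.xml

# Rebuild Spark; -DskipTests keeps the build time down
mvn -DskipTests clean package
```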
Cheers
On Mon, Mar 23, 2015 at 2:12
You could build a fat jar for your application containing both your
code and the json4s library, and then run Spark with these two
options:
spark.driver.userClassPathFirst=true
spark.executor.userClassPathFirst=true
Both only work in 1.3. (1.2 has spark.files.userClassPathFirst, but
that
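For illustration, the two options above can be passed straight to spark-submit on the command line; the main class and jar name here are placeholders:

```shell
spark-submit \
  --class com.example.Main \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  sparkapp.jar
```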