Tobias,
From http://spark.apache.org/docs/latest/configuration.html it seems that there is an experimental property:

spark.files.userClassPathFirst

Whether to give user-added jars precedence over Spark's own jars when loading classes in Executors. This feature can be used to mitigate conflicts between Spark's dependencies and user dependencies. It is currently an experimental feature.
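
For example, a minimal sketch of setting it programmatically (the app name is just a placeholder):

  import org.apache.spark.{SparkConf, SparkContext}

  // Ask executors to prefer classes from the user's jar over Spark's own
  val conf = new SparkConf()
    .setAppName("MyApp")
    .set("spark.files.userClassPathFirst", "true")
  val sc = new SparkContext(conf)

The same property should also work at submit time via spark-submit's --conf flag. Note that the description above talks about class loading in Executors, so the driver's classpath may still need separate handling.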

HTH,
Markus

On 11/04/2014 01:50 AM, Tobias Pfeiffer wrote:
Hi,

I tried hard to get a version of netty into my jar file created with sbt assembly that works with all my libraries. I finally managed that and was really happy, but it seems that spark-submit puts an older version of netty on the classpath when submitting to a cluster, such that my code ends up with a NoSuchMethodError:

Code:
  import java.io.File
  import org.jboss.netty.handler.codec.http.{DefaultHttpRequest, HttpMethod, HttpVersion}

  // Print which jar the netty class was actually loaded from
  val a = new DefaultHttpRequest(HttpVersion.HTTP_1_1, HttpMethod.POST,
    "http://localhost")
  val f = new File(a.getClass.getProtectionDomain().
    getCodeSource().getLocation().getPath())
  println(f.getAbsolutePath)
  println("headers: " + a.headers())

When executed with "sbt run":
~/.ivy2/cache/io.netty/netty/bundles/netty-3.9.4.Final.jar
headers: org.jboss.netty.handler.codec.http.DefaultHttpHeaders@64934069

When executed with "spark-submit":
~/spark-1.1.0-bin-hadoop2.4/lib/spark-assembly-1.1.0-hadoop2.4.0.jar
Exception in thread "main" java.lang.NoSuchMethodError: org.jboss.netty.handler.codec.http.DefaultHttpRequest.headers()Lorg/jboss/netty/handler/codec/http/HttpHeaders;
    ...

How can I get the old netty version off my classpath?

Thanks
Tobias

