Hi,

On Thu, Jan 15, 2015 at 12:23 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
> On Wed, Jan 14, 2015 at 6:58 AM, Jianguo Li <flyingfromch...@gmail.com>
> wrote:
>
>> I am using Spark-1.1.1. When I used "sbt test", I ran into the following
>> exceptions. Any idea how to solve it? Thanks! I think somebody posted this
>> question before, but no one seemed to have answered it. Could it be the
>> version of "io.netty" I put in my build.sbt? I included the dependency
>> `libraryDependencies += "io.netty" % "netty" % "3.6.6.Final"` in my
>> build.sbt file.
>>
From my personal experience, Netty dependencies are very painful to get
right with Spark. I recommend looking at the dependency tree using <
https://github.com/jrudolph/sbt-dependency-graph> and then fine-tuning your
sbt exclusions until it works. There are too many issues, depending on what
other packages you use, to give general advice, I'm afraid.
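To illustrate the kind of fine-tuning I mean (a sketch only; the library
name here is hypothetical, and the right exclusions depend on what the
dependency tree actually shows for your project), a build.sbt fragment
might look like:

```scala
// project/plugins.sbt (sbt-dependency-graph, to inspect the tree):
// addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

libraryDependencies ++= Seq(
  // mark Spark as provided so your assembly does not bundle it
  "org.apache.spark" %% "spark-core" % "1.1.1" % "provided",
  // hypothetical third-party library that drags in its own Netty;
  // exclude it so only one Netty version ends up on the classpath
  ("com.example" %% "some-library" % "1.0.0")
    .exclude("io.netty", "netty")
)
```

Then run the plugin's tree task (`dependency-tree` in the 0.7.x plugin
versions, if I recall correctly) and repeat until only the Netty version
you want remains.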

And once you have them right, build your application jar with `sbt
assembly`, and want to run it on a cluster with spark-submit, you'll find
that the Netty version bundled with Spark is put on the classpath before
the version you want to use. There seem to be various Spark configuration
options to change this,

http://apache-spark-user-list.1001560.n3.nabble.com/netty-on-classpath-when-using-spark-submit-td18030.html
and, I think, a unification process is underway:
  https://issues.apache.org/jira/browse/SPARK-2996
  https://github.com/apache/spark/pull/3233
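
For what it's worth, the knob discussed in that thread is along these
lines (from memory, so double-check the exact option name against the
docs for your Spark version; it was still marked experimental and, I
believe, only affected the executors in 1.1):

```shell
spark-submit \
  --conf spark.files.userClassPathFirst=true \
  --class com.example.Main \
  my-app-assembly.jar
```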

I'm also looking forward to this one, as I am stuck with an ancient version
of Finagle due to these Netty issues.

Good luck,
Tobias
