Using 
        <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.10</artifactId>
                <version>1.0.0</version>
        </dependency>
          
I can create a simple test and run it under Eclipse.
But when I try to deploy it on the test server, I run into dependency problems.

1. Spark requires 
    <artifactId>akka-remote_2.10</artifactId>
    <version>2.2.3-shaded-protobuf</version>

      And this in turn requires

        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
            <version>3.6.6.Final</version>
        </dependency>

2. At the same time, Spark itself requires 
    <artifactId>netty-parent</artifactId>
    <version>4.0.17.Final</version>

So now I have two different Netty versions on the classpath, and I get one of two errors: 

Exception in thread "main" java.lang.SecurityException: class
"javax.servlet.FilterRegistration"'s signer information does not match
signer information of other classes in the same package

When using 3.6.6.Final


Or

14/06/09 16:08:10 ERROR ActorSystemImpl: Uncaught fatal error from thread
[spark-akka.actor.default-dispatcher-4] shutting down ActorSystem [spark]
java.lang.NoClassDefFoundError: org/jboss/netty/util/Timer

When using 4.0.17.Final
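
I also considered excluding the 3.x Netty that comes in transitively (again just a sketch, not verified to work):

```xml
<!-- Sketch: drop the transitive 3.x Netty pulled in through spark-core. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

But since akka-remote apparently needs the org.jboss.netty classes at runtime, I suspect this just leads back to the NoClassDefFoundError above.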



What am I doing wrong, and how can I solve this?

Thanks 
toivo




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-0-0-Maven-dependencies-problems-tp7247.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
